When designing sounds with audio synthesisers, musicians need a degree of scientific or experiential knowledge to realise their intentions directly and quickly. The relationship between the configuration of buttons and sliders on the synthesiser interface and the sound that is produced is often unpredictable and unintuitive.
If a musician has a sound that they would like to play on their synthesiser, how can we go about searching for the configuration of buttons and sliders that most accurately reproduces this target sound? As a computer scientist, I find this a fascinating problem, because the topology of the search space varies dramatically from sound to sound and from one synthesiser to the next.
Using specialised algorithms inspired by the processes of Darwinian evolution and, in particular, the conditions that lead to speciation, I tried to automate this process with some really encouraging results.
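To make the idea concrete, here is a minimal sketch of the general approach: a genetic algorithm that evolves a vector of "slider" settings so that a toy synthesiser's output matches a target sound. Everything here is an illustrative assumption, not the actual system described above: the toy synth (three sine partials), the waveform-error fitness, and the plain truncation-selection GA (without the speciation mechanisms the real work relies on) are all simplifications for demonstration.

```python
import math
import random

def render(params, n_samples=128, sample_rate=8000):
    """Toy 'synthesiser': three sine partials whose frequencies and
    amplitudes are the six slider settings (all values in [0, 1])."""
    f1, a1, f2, a2, f3, a3 = params
    out = []
    for i in range(n_samples):
        t = i / sample_rate
        s = (a1 * math.sin(2 * math.pi * (100 + 900 * f1) * t)
             + a2 * math.sin(2 * math.pi * (100 + 900 * f2) * t)
             + a3 * math.sin(2 * math.pi * (100 + 900 * f3) * t))
        out.append(s)
    return out

def fitness(params, target):
    """Negative mean squared error between candidate and target waveforms
    (higher is better; a perfect match scores 0)."""
    cand = render(params)
    return -sum((c - t) ** 2 for c, t in zip(cand, target)) / len(target)

def mutate(params, rate=0.2, scale=0.1):
    """Gaussian-perturb each slider with some probability, clamped to [0, 1]."""
    return [min(1.0, max(0.0, p + random.gauss(0, scale)))
            if random.random() < rate else p
            for p in params]

def crossover(a, b):
    """Uniform crossover: each slider inherited from one parent at random."""
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def evolve(target, pop_size=40, generations=60):
    """Evolve a population of slider configurations toward the target sound."""
    pop = [[random.random() for _ in range(6)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, target), reverse=True)
        elite = pop[: pop_size // 4]  # keep the fittest quarter (elitism)
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=lambda p: fitness(p, target))

random.seed(0)
target_params = [0.3, 0.8, 0.6, 0.5, 0.9, 0.2]  # the "unknown" settings
target = render(target_params)
best = evolve(target)
```

A real system would compare sounds with a perceptually motivated spectral distance rather than raw waveform error, and speciation (niching the population into subgroups) helps exactly because search spaces like this are deceptive and multimodal: separate species can explore different regions without one early winner taking over.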
It won’t be long before computers are able to run this search in a matter of seconds, which might help musicians to easily explore the diverse sound spaces lurking within their synthesisers…