London is home to some of the world's greatest musical acts, and researchers at Kingston University and Queen Mary University of London want to add something different to the mix.
To spice music with a little bit of nonhuman composition, they created 'folk-rnn', a project exploring the intersection of Artificial Intelligence (AI) and the creative arts. The AI is capable of composing music on its own, and its work has now been performed at its first live show.
To create music, Oded Ben-Tal and Bob Sturm used machine learning algorithms. Instead of relying on predefined rules, the two fed folk-rnn a crowd-sourced repertoire of 23,000 Irish music transcriptions.
From this large dataset, the AI was able to build mathematical representations of musical patterns and find correlations among them.
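The core idea is that the system learns which notes tend to follow which from the transcriptions themselves, rather than from hand-written rules. folk-rnn uses a recurrent neural network (an LSTM) for this; as a much simpler illustration of the same principle, the toy sketch below learns first-order "what follows what" statistics from a few made-up ABC-notation fragments and samples a new sequence. The corpus here is invented for illustration, not drawn from the real dataset.

```python
import random
from collections import defaultdict

# Toy ABC-notation fragments standing in for the crowd-sourced corpus.
# (folk-rnn trained on ~23,000 real transcriptions; these are invented.)
corpus = [
    "ABcdefga",
    "ABcdeabc",
    "defgabcd",
]

# Count which character follows which -- a first-order Markov model.
# An LSTM captures far longer-range structure, but the principle is
# the same: learn a statistical representation of sequential patterns.
follows = defaultdict(list)
for tune in corpus:
    for cur, nxt in zip(tune, tune[1:]):
        follows[cur].append(nxt)

def generate(start, length, seed=0):
    """Sample a new 'tune' by repeatedly picking a plausible next note."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = follows.get(out[-1])
        if not choices:  # dead end: no observed successor
            break
        out.append(rng.choice(choices))
    return "".join(out)

print(generate("A", 8))
```

Because the model only ever emits transitions it has seen, its output stays stylistically close to the training material, which is also roughly why folk-rnn's tunes sound recognizably Irish.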
Since its inception in 2015, folk-rnn has undergone three iterations and has produced more than 100,000 songs, many of which have been compiled in a 14-volume online compendium.
Flow Machines, a project funded by the European Research Council and coordinated by Sony's Computer Science Labs, has also applied AI algorithms to music. Its most notable achievement is "Daddy's Car," a song generated by an algorithm trained on 40 of The Beatles' hit songs.
To be clear, the music these systems produce is often weird.
That is because the algorithms make basic mistakes that human musicians wouldn't.
"Art is not a well-defined problem, because you never know exactly what you want," said Francois Pachet, the lead researcher at Flow Machines and director of Spotify's Creator Technology Research Lab. "It's good actually that art is not well defined. Otherwise, it would not be art."
The song "Daddy's Car" was edited by a human musician, with some tracks added by hand. "There was pretty much a lot of AI in there, but not everything," Pachet said, "including voice lyrics and structure, and of course the whole mix and production."
"The real benefit is coming up with sequences that aren't expected, and that lead to musically interesting ideas," said Bob Sturm, a lecturer in digital media at Queen Mary University of London who worked on folk-rnn. "We want the system to create mistakes, but the right kind of mistakes."
“One way in which AI people think about music is as a sequence of notes, and that’s a mistake,” said Oded Ben-Tal, a senior lecturer in music technology at Kingston University and a researcher for folk-rnn. "Music is a social activity, music is a cultural activity, and I think that’s part of the thing of what interests us."
With that in mind, the pair invited a number of musicians to come together for a show called "Partnerships," a reference to the relationship between human and machine.
In the Stepney district of East London, an audience gathered for two hours of traditional Irish music. The show featured a mix of compositions, all performed by humans, with varying levels of input from the AI. Some compositions took the computer's work as a starting point, some took the project as an inspiration, while others were the generated work played directly.
The performance was met with a mixture of fascination, awe, and unease. It challenged what has long been considered the exclusive domain of human intelligence, showing more ways that man and machine can cooperate.
folk-rnn is open source, but the project isn't meant to replace composers. What Ben-Tal and Sturm want is to develop a tool that musicians can use in the creative process, a new source of inspiration.