The new artificial composers

In recent years, artificial intelligence (AI) has increasingly influenced music, for example in music production: while a human writes the main melody, the machine can generate the background arrangement using AI.

“However, music produced by AI today is often not very surprising, because surprise is not the priority of AI. Rather, it is about giving you what you ask for,” Cagri Erdem said.

As a researcher at the RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo, he wanted the machine to become a partner.

“I wanted us to make music together. So the machine must have some sort of agency, an ability to act on its own,” he explains.

The machine as a partner

He recently completed a PhD in which he studied what he calls “shared control”, developing several interactive systems along the way.

One of the systems he named CAVI. It is an instrument controlled by both a musician and a machine, where both actors are able to make choices.


Cağrı Erdem performs with six self-playing guitars, assisted by CAVI. (Video: YouTube)

“In this clip, six ‘self-playing guitars’ listen and react to what they hear. It becomes a kind of guitar chorus, where the guitars make their own contribution to the piece,” says Erdem.

In November, Erdem will organize a workshop on musical AI.

Body movements

He also studied how our body can collaborate with AI.

Cağrı Erdem defended his thesis Control or be controlled? Exploring embodiment, agency and artificial intelligence in interactive music performance in May 2022.

“If you look at the AI systems in music today, they don’t collaborate with the human body. In general, they base their actions primarily on sound. However, when humans play together, they communicate through both sound and movement,” explains Erdem.

He invited 36 guitarists to his lab and collected a dataset that gave insight into how their movements related to the music being played.

“I discovered that there is a close connection between sound and movement, especially the movements of the right hand. The muscle force we exert when we hit the guitar strings almost perfectly mirrors the sound,” he says.

Erdem used machine learning algorithms to encode motion and sound. Later, he could give the machine motion-only information, and the machine would produce sound based on that input.
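The idea of learning a motion-to-sound mapping and then driving it with motion alone can be illustrated with a minimal regression sketch. This is not Erdem’s actual model; the motion features, the linear mapping, and all variable names here are hypothetical, chosen only to show the train-on-pairs, predict-from-motion-only pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training" data: 200 frames, each with 3 motion features
# (e.g. a hypothetical muscle-force envelope sampled over time).
motion = rng.random((200, 3))

# Pretend the sound's amplitude envelope is a fixed combination of
# the motion features (a toy stand-in for the real sound data).
true_weights = np.array([0.7, 0.2, 0.1])
amplitude = motion @ true_weights

# "Training": fit the motion -> amplitude mapping by least squares.
weights, *_ = np.linalg.lstsq(motion, amplitude, rcond=None)

# "Performance" time: motion-only input, no sound needed --
# the learned mapping produces an amplitude envelope from movement.
new_motion = rng.random((5, 3))
predicted = new_motion @ weights
```

A real system would use far richer motion and audio representations and a nonlinear model, but the workflow is the same: learn from paired motion and sound, then generate sound from motion alone.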


His ‘Playing in the Air’ system is now able to make music while he plays air guitar. (Video: YouTube)

In the longer term, Erdem believes this type of technology can facilitate collaboration between humans and machines.

Instruments for the future

Erdem works in a niche field, but he thinks his research is important.

“In the history of music, there are many examples of new instruments that influenced current music,” he says.

When the piano was invented, it was originally called “pianoforte” because it allowed you to play both quietly (Italian: piano) and loudly (Italian: forte).

“You see the effect of that in the pieces written afterwards. Tools affect the music we make. New instruments can lay the foundation for the music of the future,” says Erdem.

A more creative AI

The field of technology is led by engineers, not artists, and this is very visible in today’s AI systems, he adds.

“If you type the word ‘cat’ into a search engine, you get images based on other images of cats. That makes sense, but it also means the algorithm often doesn’t show you rare cat photos,” says Erdem.

For AI to create broader musical expressions, more artists need to get their hands dirty with AI technologies, he says.

“People working at the intersection of art, technology and science, like me, need to explore how algorithms can help expand the art and music of the future,” he says.

AI in all phases of music production

As a tool, AI already contributes enormously to people’s everyday musical experiences. For example, when your streaming platform suggests artists to you, it uses AI. When you listen to film music with large orchestral arrangements, it may have been created by AI, perhaps based on a simple melody.

Erdem has no doubt that in 50 years, AI will be very present in music production.

“I think it will be indispensable in all phases of music production, such as sound synthesis, writing/composition, arrangement, recording, mixing, mastering, distribution/streaming, promotion, as well as live performances,” he says.

He also thinks we will see AI musicians with huge fan bases. The robot Shimon is already out there. He doesn’t have a fan base yet, but you can book him for your event.

“But what will happen to copyright? We do not know yet. For example, after the premiere of CAVI, we couldn’t figure out how to deal with this in the Norwegian system,” Erdem said.

Reference:

Cagri Erdem: Control or be controlled? Exploring embodiment, agency and artificial intelligence in interactive music performance. Doctoral thesis, University of Oslo, 2022.
