. . . And the Computer Plays Along
A concert held at the Massachusetts Institute of Technology (MIT) in the fall to celebrate the opening of the university's new museum included a performer that was invisible to the audience but played a key role in shaping the melodic sound: an artificial intelligence (AI) system that responded to the musicians and improvised in real time.

In a piece from "Brain Opera 2.0," the system starts by growling to the trumpet, then finds pitches with the trombone, becomes melodic with the sax, and ultimately syncs with the instruments by the time everyone comes in, explains Tod Machover, a music and media professor at MIT and head of the MIT Media Lab, who served as composer/conductor of the two-night concert event.

The "living, singing AI" system was designed by Manaswi Mishra, one of Machover's Ph.D. students. "We developed a machine learning-based model that could react to musician input in real time, and then 'fed' this model with a vast amount of music from many countries, styles, and historic periods, as well as with all kinds of human voices making every conceivable kind of vocal sound," Machover said. The system also drew from a vast library of percussive instruments and sounds from around the world to improvise with the performers.
Mar-24-2023, 09:00:16 GMT