Mind-reading A.I. algorithm can work out what music is playing in your head
Most of us have used apps like Shazam, which can identify a song when we hold our phone up to a speaker. But what if it were possible for an app to identify a piece of music based on nothing more than your thought patterns? That may not be as far-fetched as it sounds, according to a new piece of research carried out by investigators at the University of California, Berkeley.

In 2014, researcher Brian Pasley and colleagues used a deep-learning algorithm and brain activity, measured with electrodes, to turn a person's thoughts into digitally synthesized speech. They achieved this by analyzing the person's brain waves while they were speaking, in order to decode the link between speech and brain activity.
Apr-19-2018, 21:10:46 GMT