Asking machines to make music by themselves is kind of a strange notion. They don't feel happy or hurt, and as far as we know, they don't long for the affections of other machines. Humans like to think of music as being a strictly human thing, a passionate undertaking so nuanced and emotion-based that a machine could never begin to understand the feeling that goes into the process of making music, or even the simple enjoyment of it. The idea of humans and machines having a jam session together is even stranger. But oddly enough, the principles of the jam session may be exactly what machines need to begin to understand musical expression.
Last year, Google released an artificial intelligence kit aimed at makers, with two different flavors: Vision to recognize people and objects, and Voice to create a smart speaker. Now, Google is back with a new version to make it even easier to get started. While this might not be very useful to most Hackaday readers, who probably have a spare Pi (or 5) lying around, this is invaluable for novice makers or the educational market. These audiences now have access to an all-in-one solution to build projects and learn more about artificial intelligence. We've previously seen toys, phones, and intercoms get upgrades with an AIY kit, but would love to see more! [Mike Rigsby] has used one in his robot dog project to detect when people are smiling.
Alasdair Allan is a scientist and researcher who has authored more than 80 peer-reviewed papers and eight books and has been involved with several standards bodies. Originally an astrophysicist, Alasdair now works as a consultant and journalist, focusing on open hardware, machine learning, big data, and emerging technologies, with expertise in electronics, especially wireless devices and distributed sensor networks, mobile computing, and the internet of things. He runs a small consulting company and has written for Make: magazine, Motherboard/VICE, Hackaday, and Hackster.io. In the past, he has mesh-networked the Moscone Center, caused a US Senate hearing, and contributed to the detection of what was at the time the most distant object yet discovered.
Back in the early days of the Nintendo Game Boy, it was revolutionary that a full-fledged gaming experience could fit easily into a backpack. The latest evolution of the Game Boy, which was unveiled during a recent Hackaday presentation, doesn't just fit in a bag or a pocket -- it's small enough to clip right onto your keyring. The hacker created the tiny device using a color OLED screen and an ESP32 microcontroller. The setup includes just about everything you got with an old-school Game Boy: a functional direction pad, buttons, and even speakers for in-game audio. Even in such a tiny package, the 2016 tech used to create the keychain Game Boy (or as I'd like to call it, the Key Boy) makes it far cooler than its much bigger predecessors.
I am a robotics engineer with 20 years of experience in the field, both as an industry project lead and an award-winning researcher at MIT and Rice University. I'm a thought leader in multi-robot systems, having led engineering teams in building 2 different swarms with 100 robots each. You can learn more about my previous work as director of the Rice Multi-Robot Systems Lab here: http://mrsl.rice.edu/ Recent research projects include using distributed computational geometry for multi-robot configuration estimation and control, distributed multi-robot manipulation, and deploying physical data structures for computation on multi-robot systems. Previous positions include lead research scientist at iRobot Corporation, where I was the manager of the DARPA-funded Swarm project.