By now, most people who run small and midsize businesses know that they ought to take advantage of artificial intelligence to make their companies competitive in the digital age. But many don't know how to go about it. To meet this need, Northeastern, which invested $50 million in a new artificial intelligence research institute last year, will lead a new Massachusetts program, AI Jump Start, to connect small business owners in the state with academic faculty experts to learn how machine learning can grow their companies. The initiative is aimed at a broad range of small and midsize enterprises in defense, manufacturing, health, and other industries whose leaders would like to incorporate artificial intelligence but aren't quite sure where to turn. It is also open to companies that want to upgrade data-driven computing to glean new insights into customers, suppliers, and competitors.
Dr Kate Darling is a research specialist in human-robot interaction, robot ethics and intellectual property theory and policy at the Massachusetts Institute of Technology (MIT) Media Lab. In her new book, The New Breed, she argues that we would be better prepared for the future if we started thinking about robots and artificial intelligence (AI) like animals. What is wrong with the way we think about robots? So often we subconsciously compare robots to humans and AI to human intelligence. The comparison limits our imagination.
Let's say, just hypothetically, that a surveillance robot styled after a dog was giving you a hard time. In this situation, you'd want to shut the thing down, and quickly. Thankfully, when it comes to Boston Dynamics' Spot robot, there are several ways to do just that. The robots, marketed for industrial use and used for viral hijinks, evoke a robot dystopia in the public imagination -- a fact compounded by an April viral video of the NYPD trotting out its very own customized Spot. The first reported instance of police using Spot was in November of 2019, when the Massachusetts State Police leased at least one of the robots for a three-month trial period.
Microsoft is to buy the artificial intelligence and speech technology firm Nuance Communications for about $16bn (£12bn), as it builds up its cloud-computing operation for healthcare and business customers. Nuance, known for pioneering speech technology and helping to launch Apple's virtual assistant, Siri, operates in 28 countries and reported revenues of $1.5bn in its last full financial year. The Massachusetts-based company said it served 77% of US hospitals, providing services including clinical speech recognition, medical transcription and medical imaging. The deal comes after the companies went into partnership in 2019 to automate clinical administrative work such as documentation. Microsoft's offer of $56 a share represents a premium of 22.86% on Nuance's most recent closing price.
Scientists in the US have brought the structure of a spider web to life by translating it into music -- a technique that could help us communicate with spiders, they say. They assigned different frequencies of sound to strands of the web, creating 'notes' that they combined in patterns, based on the web's 3D structure, to generate melodies. The piece of music, which lasts just over a minute, sounds like the soundtrack to an eerie dystopian sci-fi horror film. It was created by researchers at the Massachusetts Institute of Technology (MIT) with laser scanning technology and image processing tools. The experts say spider webs could provide a new source of musical inspiration and a form of cross-species communication.
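The idea of assigning a frequency to each strand can be illustrated with a toy sketch. The mapping rule below (shorter strand, higher pitch, like a vibrating string) and all the numbers are illustrative assumptions, not the MIT researchers' actual model:

```python
# Hypothetical sketch: turn spider-web strand lengths into audible "notes".
# Assumption: pitch is inversely proportional to strand length, loosely
# analogous to a vibrating string; the reference values are made up.

def strand_to_frequency(length_mm, ref_length_mm=10.0, ref_freq_hz=440.0):
    """Shorter strands map to higher pitches, inversely proportional to length."""
    return ref_freq_hz * (ref_length_mm / length_mm)

# A toy "web": four strand lengths in millimetres, played in sequence.
web_strands = [5.0, 10.0, 20.0, 40.0]
melody_hz = [strand_to_frequency(length) for length in web_strands]
print(melody_hz)  # [880.0, 440.0, 220.0, 110.0]
```

The real work scanned webs with lasers to recover the 3D structure before sonifying it; this sketch only shows the length-to-pitch step.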
Last fall, the MIT Stephen A. Schwarzman College of Computing embarked on a project to design and construct a new building on Vassar Street in Cambridge, at the former site of Building 44. Working with Skidmore, Owings & Merrill (SOM), the design for the new building is taking shape, with plans for the exterior façade now complete. The proposed project will establish a home for the MIT Schwarzman College of Computing, providing state-of-the-art space for computing research and education. The building's central location in the Vassar Street block between Main Street and Massachusetts Avenue will help form a new cluster of connectivity, and will enable the space to have a multifaceted role. The project has been reviewed extensively with city planning staff and will be presented to the Cambridge Planning Board for review and approval.
The Massachusetts College of Art and Design is holding a virtual auction to help support student scholarships. The silent auction portion of the fundraising event ends at noon on April 11, while a live auction takes place online on the evening of April 10. The 32nd annual MassArt Auction is the second conducted online due to the pandemic. Artists who have had their works juried into the auction donate either 50% or 100% of the sale price to support MassArt scholarships. The two auctions will feature over 300 works from MassArt students, graduates, members of the faculty and others.
At some point in your life, you've probably used a combination of sight and touch to find something hidden beneath your couch cushions. And for a while now, robotics researchers have tried to give their creations that same capability. Back in 2019, a team of scientists from the Massachusetts Institute of Technology (MIT) used a combination of tactile sensors and AI to allow a robot to identify objects by touch. A separate group of scientists from MIT has now built a machine that can find things it can't see initially. The aptly named RF Grasp depends on a wrist-mounted camera and an RF reader to home in on and pick up an object.
Prof. Newton Howard, a Brain and Cognitive Scientist, the former Director of the MIT Mind Machine Project at the Massachusetts Institute of Technology, and currently a Professor of Computational Neuroscience and Functional Neurosurgery at the University of Oxford, where he directs the Oxford Computational Neuroscience Laboratory, participates in Risk Roundup to discuss "Mind Control Technology". Since the beginning of time, we humans have been creating tools to help us interact with the world around us. Now we are moving inwards and developing tools to help us communicate with the world inside us. While the nature of these tools has evolved from physical to digital, and now to neural, our brain is effectively becoming the tool for interaction, communication, collaboration, and control. Technologies ranging from electrodes of many different shapes implanted in the human brain to transmit and receive signals, to non-invasive devices that translate brain waves into commands that control not only computers but also body parts, are already becoming a reality.
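Translating brain waves into commands, as non-invasive devices do, can be sketched in miniature. The toy decoder below measures power at a single frequency (the alpha band around 10 Hz) with the Goertzel algorithm and maps strong activity to a command; the command names, threshold, and sampling rate are all made-up assumptions, and real brain-computer interfaces are far more sophisticated:

```python
import math

# Illustrative sketch only: a toy "brain-computer interface" that converts a
# sampled brain-wave signal into a command by measuring alpha-band (~10 Hz)
# power. Every name and number here is a hypothetical assumption.

def goertzel_power(samples, sample_rate, target_hz):
    """Power of one frequency component, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def decode_command(samples, sample_rate=256, alpha_hz=10.0, threshold=1000.0):
    """Map strong alpha-band activity to a hypothetical 'relax' command."""
    if goertzel_power(samples, sample_rate, alpha_hz) > threshold:
        return "relax"
    return "idle"

# Synthetic one-second recording dominated by a 10 Hz wave.
rate = 256
signal = [math.sin(2 * math.pi * 10 * t / rate) for t in range(rate)]
print(decode_command(signal, rate))  # relax
```

A flat signal decodes to "idle"; only the presence of the target rhythm triggers the command, which is the basic contract such a translator has to honour.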
A number of artificial intelligence-powered tools today help spot grammatical and factual errors in online books, webpages and news articles. In June 2020, Facebook said it would display warning labels if users chose to share COVID-19 articles older than three months on the platform. Now, researchers at the Massachusetts Institute of Technology (MIT) have devised a machine-learning (ML) model that will monitor updates to news articles and suggest edits to irrelevant and unverified information. It uses deep learning to verify edits and update related texts, the team noted in a study titled 'Get Your Vitamin C! Robust Fact Verification for Contrastive Evidence'. They examined edits to popular Wikipedia pages.