If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
The Imagination, Computation, and Expression Laboratory at MIT's Computer Science and Artificial Intelligence Laboratory has released a new video game called Grayscale, which is designed to sensitize players to problems of sexism, sexual harassment, and sexual assault in the workplace. D. Fox Harrell, the lab's director, and students in his course CMS.628 (Advanced Identity Representation) completed the initial version of the game more than a year ago, and the ICE Lab has been working on it consistently since. So although the game predates the recent #MeToo movement, it addresses many of the themes that movement has brought to the fore. The game is built atop the ICE Lab's Chimeria computational platform, which was designed to give computer systems a more subtle, flexible, and dynamic model of how humans categorize members of various groups. MIT News spoke to Harrell, a professor of digital media and artificial intelligence, about Grayscale (or to give it its more formal name, Chimeria:Grayscale). Q: How does the game work?
Working memory is a sort of "mental sketchpad" that allows you to accomplish everyday tasks such as calling in your hungry family's takeout order and finding the bathroom you were just told "will be the third door on the right after you walk straight down that hallway and make your first left." It also allows your mind to go from merely responding to your environment to consciously asserting your agenda. "Working memory allows you to choose what to pay attention to, choose what you hold in mind, and choose when to make decisions and take action," says Earl K. Miller, the Picower Professor in MIT's Picower Institute for Learning and Memory and the Department of Brain and Cognitive Sciences. "It's all about wresting control from the environment to your own self. Once you have something like working memory, you go from being a simple creature that's buffeted by the environment to a creature that can control the environment."
The Department of Electrical Engineering and Computer Science (EECS) has announced the appointment of two new associate department heads, and the creation of the new role of associate department head for strategic directions. Professors Saman Amarasinghe and Joel Voldman have been named as new associate department heads, effective immediately, says EECS Department Head Asu Ozdaglar. Ozdaglar became department head on Jan. 1, replacing Anantha Chandrakasan, who is now dean of the School of Engineering. Professor Nancy Lynch will be the inaugural holder of the new position of associate department head for strategic directions, overseeing new academic and research initiatives. "I am thrilled to be starting my own new role in collaboration with such a strong leadership team," says Ozdaglar, who is also the Joseph F. and Nancy P. Keithley Professor of Electrical Engineering and Computer Science.
This spring, the MIT School of Science welcomes three new professors in the departments of Brain and Cognitive Sciences, and Earth, Atmospheric and Planetary Sciences. Michael Halassa aims to understand the neural basis of cognitive control and flexibility, particularly as it relates to attention and decision making. To study these questions, he has developed behavioral models of cognitive function in mice, allowing him to probe the underlying neural circuits and computations using parametric behavior, electrophysiological recordings, and causal manipulations. His major current focus is understanding the function of the thalamus, traditionally considered a relay station for sending sensory information to the cortex. Halassa is also a board-certified psychiatrist with fellowship training in psychotic disorders.
Using electrodes made of carbon nanotubes (CNTs) can significantly improve the performance of devices ranging from capacitors and batteries to water desalination systems. But figuring out the physical characteristics of vertically aligned CNT arrays that yield the most benefit has been difficult. Now an MIT team has developed a method that can help. By combining simple benchtop experiments with a model describing porous materials, the researchers have found they can quantify the morphology of a CNT sample, without destroying it in the process. In a series of tests, the researchers confirmed that their adapted model can reproduce key measurements taken on CNT samples under varying conditions.
When you enter a room, your brain is bombarded with sensory information. If the room is a place you know well, most of this information is already stored in long-term memory. However, if the room is unfamiliar to you, your brain creates a new memory of it almost immediately. MIT neuroscientists have now discovered how this occurs. A small region of the brainstem, known as the locus coeruleus, is activated in response to novel sensory stimuli, and this activity triggers the release of a flood of dopamine into a certain region of the hippocampus to store a memory of the new location.
For the past 10 years, the Camera Culture group at MIT's Media Lab has been developing innovative imaging systems -- from a camera that can see around corners to one that can read text in closed books -- by using "time of flight," an approach that gauges distance by measuring the time it takes light projected into a scene to bounce back to a sensor. At a range of 2 meters, existing time-of-flight systems have a depth resolution of about a centimeter. In a new paper appearing in IEEE Access, members of the Camera Culture group present a new approach to time-of-flight imaging that increases that depth resolution 1,000-fold -- the type of resolution that could make self-driving cars practical. The new approach could also enable accurate distance measurements through fog, which has proven a major obstacle to the development of self-driving cars.
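The basic time-of-flight principle described above can be sketched in a few lines: distance is inferred from the round-trip travel time of light. This is a minimal illustration of the general idea only, not the Camera Culture group's actual system; the numbers are assumptions chosen to match the 2-meter range mentioned in the article.

```python
# Minimal sketch of the time-of-flight principle: a pulse of light
# travels to an object and back, so the one-way distance is half the
# round-trip time multiplied by the speed of light.

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Convert a measured round-trip time into a one-way distance (m)."""
    return C * t_seconds / 2.0

# Illustrative example: a 2 m range corresponds to a ~13.3 ns round trip.
t = 2 * 2.0 / C  # round-trip time for a target 2 m away
print(f"{t * 1e9:.2f} ns round trip -> {distance_from_round_trip(t):.3f} m")
```

Resolving centimeters at this range means timing the return pulse to tens of picoseconds, which is why depth resolution is such a hard problem for these systems.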
Whether it's tracking brain activity in the operating room, seismic vibrations during an earthquake, or biodiversity in a single ecosystem over a million years, measuring the frequency of an occurrence over a period of time is a fundamental data-analysis task that yields critical insight in many scientific fields. But when it comes to analyzing these time series data, researchers have been limited to examining pieces of the data one at a time and then assembling the big picture, rather than seeing the big picture all at once. In a new study, MIT researchers present a novel approach to analyzing time series data sets, a new algorithm termed state-space multitaper time-frequency analysis (SS-MT). SS-MT provides a framework for analyzing time series data in real time, enabling researchers to work in a more informed way with large data sets that are nonstationary, i.e., whose characteristics evolve over time. It allows researchers not only to quantify the shifting properties of the data but also to make formal statistical comparisons between arbitrary segments of the data.
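The "multitaper" half of SS-MT builds on a classical technique: averaging periodograms computed with orthogonal Slepian (DPSS) tapers to reduce the variance of a spectral estimate. The sketch below illustrates only that classical multitaper step, as an assumption about the building block involved; it does not implement the state-space machinery that distinguishes SS-MT, and the parameter values are illustrative.

```python
# Sketch of a classical multitaper power spectral density estimate:
# window the signal with K orthonormal DPSS tapers, compute a
# periodogram for each, and average to reduce estimator variance.
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, NW=4.0, K=7):
    """Average K tapered periodograms of signal x sampled at fs Hz."""
    n = len(x)
    tapers = dpss(n, NW, Kmax=K)                  # shape (K, n)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    psd = spectra.mean(axis=0) / fs               # average across tapers
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

# Example: a 10 Hz sinusoid buried in noise; the estimate should show
# a clear peak near 10 Hz.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) \
    + 0.5 * np.random.default_rng(0).standard_normal(len(t))
freqs, psd = multitaper_psd(x, fs)
print(f"spectral peak near {freqs[np.argmax(psd)]:.1f} Hz")
```

For nonstationary data, this estimate would be computed on successive short windows; SS-MT's contribution is linking those windows through a state-space model rather than treating each one independently.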
Last month, three MIT materials scientists and their colleagues published a paper describing a new artificial-intelligence system that can pore through scientific papers and extract "recipes" for producing particular types of materials. That work was envisioned as the first step toward a system that can originate recipes for materials that have been described only theoretically. Now, in a paper in the journal npj Computational Materials, the same three materials scientists, with a colleague in MIT's Department of Electrical Engineering and Computer Science (EECS), take a further step in that direction, with a new artificial-intelligence system that can recognize higher-level patterns that are consistent across recipes. For instance, the new system was able to identify correlations between "precursor" chemicals used in materials recipes and the crystal structures of the resulting products. The same correlations, it turned out, had been documented in the literature.
Researchers at Lincoln Laboratory have been using silicon and compound semiconductor substrates to build photonic integrated circuits, or PICs, that enable devices such as optical communication receivers, wideband ladar transmitters, interconnects for trapped-ion quantum computers, inertial sensors, and microwave signal processors. Now, a recently awarded state grant will fund a germanium deposition reactor that will allow the researchers to exploit germanium as a key optoelectronic material in the fabrication of PICs operating at nontraditional wavelengths and under harsh environmental conditions. Photonic integrated circuits are in demand for routing the enormous volume of traffic passing through data centers today. In addition, high-speed, reliable photonic circuits that lessen systems' electrical power requirements can improve the performance of quantum and all-optical computing systems, as well as the throughput of the advanced microprocessors embedded in highly sensitive sensors and increasingly capable autonomous vehicles. "The addition of the reactor will allow Lincoln Laboratory to manufacture trusted silicon photonic integrated circuits," says Daniel Pulver, the manager of Lincoln Laboratory's Microelectronics Laboratory.