Despite what you might see in movies, today's robots are still very limited in what they can do. They can be great for many repetitive tasks, but their inability to understand the nuances of human language makes them mostly useless for more complicated requests. For example, if you put a specific tool in a toolbox and ask a robot to "pick it up," it would be completely lost. Picking it up means being able to see and identify objects, understand commands, recognize that the "it" in question is the tool you put down, recall the moment when you put the tool down, and distinguish that tool from others of similar shapes and sizes. Recently, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have gotten closer to making this type of request easier: in a new paper, they present an Alexa-like system that allows robots to understand a wide range of commands that require contextual knowledge about objects and their environments.
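The subtasks listed above can be illustrated with a minimal sketch of the pronoun-resolution step: resolving "it" against a remembered history of observed objects. The `Observation` record, the sample memory, and the most-recent heuristic are all hypothetical illustrations, not the CSAIL system's actual method.

```python
from dataclasses import dataclass


@dataclass
class Observation:
    """A remembered event: an object seen at a place and time."""
    name: str
    location: str
    timestamp: float


# Hypothetical object memory, in the order events were observed.
memory = [
    Observation("screwdriver", "workbench", 10.0),
    Observation("wrench", "toolbox", 42.0),
]


def resolve_it(memory):
    """Resolve the pronoun 'it' to the most recently handled object."""
    if not memory:
        return None
    return max(memory, key=lambda obs: obs.timestamp)


target = resolve_it(memory)
print(target.name)  # -> wrench
```

A real system would combine this temporal cue with perception (matching the remembered object against what the camera currently sees), but the recency heuristic captures the "go back in time" step the article describes.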
In this paper, we discuss the motivation, approach, and status of a new NSF ITR project in which an adaptive working memory is investigated for robot control and learning. There is much evidence for the existence of such a memory structure in primates. Such memory is closely tied to the learning and execution of tasks, as it contributes to decision-making capabilities by focusing on essential task information and discarding distractions. We will integrate the adaptive working memory structure into a robot to explore the issues of task learning in a physical embodiment. This leads to a complex but realistic system involving perceptual systems, actuators, reasoning, and short-term and long-term memory structures. We also discuss planned experiments intended to evaluate the utility of the adaptive working memory.
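The core idea of a working memory that "focuses on essential task information and discards distractions" can be sketched as a fixed-capacity store that evicts the least relevant item when full. The class name, the relevance scores, and the eviction policy below are assumptions for illustration, not the project's actual design.

```python
import heapq


class WorkingMemory:
    """Fixed-capacity store keeping only the most task-relevant items."""

    def __init__(self, capacity=4):
        self.capacity = capacity
        self._items = []   # min-heap of (relevance, tiebreak, item)
        self._counter = 0  # tiebreak so equal relevances never compare items

    def attend(self, item, relevance):
        """Admit an item; evict the least relevant one when over capacity."""
        heapq.heappush(self._items, (relevance, self._counter, item))
        self._counter += 1
        if len(self._items) > self.capacity:
            heapq.heappop(self._items)  # discard the distraction

    def contents(self):
        return {item for _, _, item in self._items}


wm = WorkingMemory(capacity=2)
wm.attend("target object", relevance=0.9)
wm.attend("passing person", relevance=0.1)
wm.attend("goal location", relevance=0.8)
print(wm.contents())  # the low-relevance distraction is gone
```

An adaptive version would learn the relevance scores from task outcomes rather than receive them as fixed inputs; this sketch only shows the capacity-limited, distraction-discarding behavior the abstract describes.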
Memory speed isn't always something one would pay attention to when buying a smartphone; at the very least, you'd expect the latest flagships to come with the fastest options available at the time, but it turns out that this isn't necessarily true. Recently, some Huawei P10 and P10 Plus users in China noticed that they were only getting eMMC 5.1 memory speeds on their devices. For instance, the sequential read speeds were in the ballpark of 250MB/s on AndroBench, whereas the luckier folks who got UFS 2.0 or 2.1 chips on their phones managed to hit around 550MB/s or even 750MB/s (our very own international unit got 786.67MB/s). Indeed, Huawei never specified the type of flash memory on its P10 spec sheets, which led to speculation that the mobile giant was intentionally misleading consumers. To address this, Huawei Business Group CEO Richard Yu took to Weibo to explain what was going on.
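For readers curious what a "sequential read speed" number like those above actually measures, here is a very rough sketch: read a file front to back in large chunks and divide bytes by elapsed time. This is not how AndroBench works internally; a real storage benchmark also controls for page-cache effects, I/O queue depth, and file placement, which this sketch ignores.

```python
import os
import tempfile
import time


def sequential_read_mb_s(path, chunk_size=1024 * 1024):
    """Rough sequential-read throughput in MB/s.

    Ignores page-cache and filesystem effects that real benchmarks
    such as AndroBench control for, so treat the number as a sketch.
    """
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while f.read(chunk_size):
            pass  # discard data; we only time the reads
    elapsed = time.perf_counter() - start
    return size / elapsed / 1e6


# Demo on a throwaway 8 MB file of random bytes.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(8 * 1024 * 1024))
print(f"{sequential_read_mb_s(tmp.name):.1f} MB/s")
```

On a desktop the file will likely be served from the page cache, so the printed figure will be far higher than raw flash speeds; the eMMC-vs-UFS gap in the article comes from measuring the storage chips themselves.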
Reactions are often influenced by mentally recognizing objects, in which case the associated features of those objects are recalled. This can be a lengthy process. In humans, distinct features are associated with objects as adjectives. In this paper, we present preliminary research on enabling robots or software agents to exploit this fact.
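The object-to-adjective association described above can be sketched as a simple two-way lookup: recall an object's features on recognition, or find objects by a feature. The sample objects, feature sets, and function names are hypothetical illustrations, not the paper's actual representation.

```python
# Hypothetical store associating objects with descriptive features
# ("adjectives"), so an agent can recall them on recognition.
features = {
    "apple": {"red", "round", "edible"},
    "brick": {"red", "heavy", "rough"},
}


def recall_features(obj):
    """Recall the adjectives associated with a recognized object."""
    return features.get(obj, set())


def objects_with(adjective):
    """Inverse lookup: which known objects carry this feature?"""
    return {obj for obj, feats in features.items() if adjective in feats}


print(sorted(recall_features("apple")))  # features recalled on recognition
print(objects_with("red"))               # -> {'apple', 'brick'}
```

The inverse lookup is what would let an agent react quickly: a single salient adjective narrows the candidate objects without replaying the full recognition process.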