Real-time Addressee Estimation: Deployment of a Deep-Learning Model on the iCub Robot

Mazzola, Carlo, Rea, Francesco, Sciutti, Alessandra

arXiv.org Artificial Intelligence 

Aiming at implementing AE skills in robots to let them interact in unstructured scenarios, this paper 1) describes the development of an AE deep-learning model trained on a human-robot interaction (HRI) dataset, as already described in [16], 2) illustrates its first deployment on the humanoid robot iCub, and 3) reports the results of an HRI pilot experiment to evaluate the performance of the model deployed on the iCub compared to previous tests made on the training dataset.

Focusing on the perceptual domain, i.e., a passive agent listening to humans, artificial agents must be able to detect voices (Sound Detection and Voice Recognition), recognize who is talking (Speaker Recognition and Speaker Localization), and what they are saying (Natural Language Understanding). But even considering optimal performances in all these tasks, an artificial agent endowed with such abilities
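The perceptual stages listed above (voice detection, speaker recognition and localization, language understanding) feed an addressee-estimation (AE) decision. The sketch below is purely illustrative: the paper's AE model is a deep network trained on an HRI dataset, whereas here a placeholder heuristic stands in for it, and all names (`Percept`, `estimate_addressee`, the 30-degree threshold, the gaze cue) are assumptions introduced only to show where AE sits in such a pipeline.

```python
from dataclasses import dataclass

@dataclass
class Percept:
    """Outputs of the upstream perceptual stages (all fields hypothetical)."""
    voice_detected: bool      # Sound Detection / Voice Recognition
    speaker_id: str           # Speaker Recognition
    speaker_angle_deg: float  # Speaker Localization, relative to the robot
    utterance: str            # input to Natural Language Understanding

def estimate_addressee(p: Percept, gaze_on_robot: bool) -> str:
    """Toy addressee decision: returns who the speaker is assumed to address.

    The real system replaces this heuristic with a trained deep model; this
    placeholder only shows the stage's inputs and output.
    """
    if not p.voice_detected:
        return "NONE"
    # Assume a speaker roughly facing the robot and gazing at it addresses
    # the robot; otherwise assume another human is being addressed.
    if gaze_on_robot and abs(p.speaker_angle_deg) < 30.0:
        return "ROBOT"
    return "OTHER"

percept = Percept(True, "user_1", 10.0, "can you pass me the cup?")
print(estimate_addressee(percept, gaze_on_robot=True))   # prints "ROBOT"
print(estimate_addressee(percept, gaze_on_robot=False))  # prints "OTHER"
```

In a deployment on the robot, the heuristic body would be swapped for a forward pass of the trained AE network while the surrounding interface stays the same.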
