Invited Talks

Hamilton, Carol (Association for the Advancement of Artificial Intelligence)

AAAI Conferences 

Most approaches to semantics in computational linguistics represent meaning in terms of words or abstract symbols. Grounded-language research bases the meaning of natural language on perception and/or action in the (real or virtual) world. Machine learning has become the most effective approach to constructing natural-language systems; however, current methods require a great deal of laboriously annotated training data. Ideally, a computer would be able to acquire language like a child, by being exposed to language in the context of a relevant but ambiguous environment, thereby grounding its learning in perception and action. We will review recent research in grounded language learning and discuss future directions.
