Relax, it doesn't matter how you get there: A new self-supervised approach for multi-timescale behavior analysis
Azabou, Mehdi, Mendelson, Michael, Ahad, Nauman, Sorokin, Maks, Thakoor, Shantanu, Urzay, Carolina, Dyer, Eva L.
arXiv.org Artificial Intelligence
Natural behavior consists of dynamics that are complex and unpredictable, especially when trying to predict many steps into the future. While some success has been found in building representations of behavior under constrained or simplified task-based conditions, many of these models cannot be applied to free and naturalistic settings where behavior becomes increasingly hard to model. In this work, we develop a multi-task representation learning model for behavior that combines two novel components: (i) an action-prediction objective that aims to predict the distribution of actions over future timesteps, and (ii) a multi-scale architecture that builds separate latent spaces to accommodate short- and long-term dynamics. After demonstrating the ability of the method to build representations of both local and global dynamics in realistic robots in varying environments and terrains, we apply our method to the MABe 2022 Multi-agent behavior challenge, where our model ranks first overall (top rank across all 13 tasks), first on all sequence-level tasks, and first or second on 7 of the 9 frame-level tasks. In all of these cases, we show that our model can build representations that capture the many different factors that drive behavior and solve a wide range of downstream tasks.
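As a rough illustration of the two components named in the abstract, the sketch below pairs two causal convolutional encoders with different dilations, giving separate short- and long-timescale latent spaces, with a head trained to match the empirical distribution of discretized actions over a future window. This is a minimal PyTorch sketch under assumed design choices, not the authors' released code; all module names, dimensions, and the action-binning scheme are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the authors' code) of:
# (i) predicting a distribution over future actions rather than a single next step,
# (ii) separate short- and long-timescale latent spaces.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimescaleEncoder(nn.Module):
    """Causal 1D conv encoder; dilation controls the temporal receptive field."""
    def __init__(self, in_dim, latent_dim, dilation):
        super().__init__()
        self.pad = 2 * dilation  # left-pad so the convolution stays causal
        self.conv = nn.Conv1d(in_dim, latent_dim, kernel_size=3, dilation=dilation)

    def forward(self, x):                      # x: (batch, time, in_dim)
        h = F.pad(x.transpose(1, 2), (self.pad, 0))
        return self.conv(h).transpose(1, 2)    # (batch, time, latent_dim)


class MultiTimescaleModel(nn.Module):
    """Separate latent spaces for short- and long-term dynamics, plus a head
    that predicts a histogram of actions over a window of future timesteps."""
    def __init__(self, in_dim=24, latent_dim=32, n_action_bins=16):
        super().__init__()
        self.short_enc = TimescaleEncoder(in_dim, latent_dim, dilation=1)   # fast dynamics
        self.long_enc = TimescaleEncoder(in_dim, latent_dim, dilation=16)   # slow dynamics
        self.action_head = nn.Linear(2 * latent_dim, n_action_bins)

    def forward(self, x):
        z_short = self.short_enc(x)
        z_long = self.long_enc(x)
        z = torch.cat([z_short, z_long], dim=-1)
        # Logits over discretized actions; softmax gives the predicted
        # distribution of actions over the future window.
        return self.action_head(z), (z_short, z_long)


def future_action_histogram(actions, horizon, n_bins):
    """Empirical distribution of discretized actions over the next `horizon` steps."""
    B, T = actions.shape
    targets = torch.zeros(B, T, n_bins)
    for t in range(T):
        window = actions[:, t + 1: min(t + 1 + horizon, T)]
        if window.numel() == 0:                # no future steps left: fall back to uniform
            targets[:, t] = 1.0 / n_bins
            continue
        for b in range(B):
            targets[b, t] = torch.bincount(window[b], minlength=n_bins).float()
    return F.normalize(targets, p=1, dim=-1)   # normalize counts to a distribution


# Toy usage with random features and discretized actions.
x = torch.randn(4, 100, 24)                    # (batch, time, features)
actions = torch.randint(0, 16, (4, 100))       # discretized actions per timestep
model = MultiTimescaleModel()
logits, _ = model(x)
target = future_action_histogram(actions, horizon=10, n_bins=16)
loss = F.kl_div(F.log_softmax(logits, dim=-1), target, reduction="batchmean")
loss.backward()
```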
Mar-15-2023
- Genre:
- Research Report (1.00)
- Industry:
- Health & Medicine > Therapeutic Area > Neurology (0.46)
- Technology:
- Information Technology > Artificial Intelligence
- Machine Learning > Neural Networks
- Deep Learning (0.93)
- Natural Language (1.00)
- Representation & Reasoning > Agents (0.69)
- Robots (1.00)