Transformer in Touch: A Survey
Jing Gao, Ning Cheng, Bin Fang, Wenjuan Han
The Transformer model, which initially achieved significant success in natural language processing, has recently shown great potential in tactile perception. This review comprehensively outlines the application and development of Transformers in tactile technology. We first introduce the two fundamental concepts behind the Transformer's success: the self-attention mechanism and large-scale pre-training. We then examine the application of Transformers to various tactile tasks, including but not limited to object recognition, cross-modal generation, and object manipulation, offering a concise summary of the core methodologies, performance benchmarks, and design highlights. Finally, we suggest potential directions for future research, aiming to generate more interest within the community, tackle existing challenges, and encourage the adoption of Transformer models in the tactile field.
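The abstract names self-attention as one of the two ingredients behind the Transformer's success. As a quick orientation for readers, the sketch below shows generic scaled dot-product self-attention applied to a toy batch of feature tokens; this is the standard textbook formulation, not code from the surveyed work, and the NumPy implementation, function name, and array shapes are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Generic scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, seq_q, seq_k)
    scores -= scores.max(axis=-1, keepdims=True)       # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                  # (batch, seq_q, d_v)

# Toy example: 1 batch of 4 hypothetical tactile tokens with model dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4, 8))
out = scaled_dot_product_attention(x, x, x)            # self-attention: Q = K = V
print(out.shape)                                       # (1, 4, 8)
```

In self-attention, queries, keys, and values all come from the same token sequence, so every token can weigh its relationship to every other token; tactile Transformers apply this same operation to sequences of tactile patches or sensor readings.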
arXiv.org Artificial Intelligence
May-21-2024
- Country:
  - Asia > China (0.28)
  - Europe > United Kingdom > England (0.14)
- Genre:
  - Research Report
    - Experimental Study (0.46)
    - New Finding (0.68)
- Technology:
  - Information Technology
    - Artificial Intelligence
      - Cognitive Science (1.00)
      - Machine Learning > Neural Networks > Deep Learning (1.00)
      - Natural Language > Large Language Model (1.00)
      - Representation & Reasoning (1.00)
      - Robots > Manipulation (0.68)
      - Vision (1.00)
    - Sensing and Signal Processing > Image Processing (1.00)