Pre-trained Transformer Uncovers Meaningful Patterns in Human Mobility Data
arXiv.org Artificial Intelligence
We empirically demonstrate that a transformer pre-trained on country-scale unlabeled human mobility data learns embeddings capable, through fine-tuning, of developing a deep understanding of the target geography and its corresponding mobility patterns. Utilizing an adaptation framework, we evaluate the performance of our pre-trained embeddings in encapsulating a broad spectrum of concepts directly and indirectly related to human mobility. This includes basic notions, such as geographic location and distance, and extends to more complex constructs, such as administrative divisions and land cover. Our extensive empirical analysis reveals a substantial performance boost gained from pre-training, reaching up to 38% in tasks such as tree-cover regression. We attribute this result to the ability of the pre-training to uncover meaningful patterns hidden in the raw data, beneficial for modeling relevant high-level concepts. The pre-trained embeddings emerge as robust representations of regions and trajectories, potentially valuable for a wide range of downstream applications.

[Figure 1: A transformer pre-trained from scratch on country-scale unlabeled human mobility data is adapted to model a variety of high-level concepts manifesting at different levels.]
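The adaptation framework described above can be illustrated with a minimal sketch: a linear probe fit on top of frozen pre-trained region embeddings to predict a target concept such as tree cover. All names, dimensions, and data here are hypothetical stand-ins; the paper does not publish its architecture or data in this listing.

```python
import numpy as np

rng = np.random.default_rng(0)

EMBED_DIM = 16    # hypothetical embedding size
N_REGIONS = 100   # hypothetical number of regions

# Stand-ins for pre-trained region embeddings and a target concept
# (e.g., tree-cover fraction per region); synthetic data for illustration.
embeddings = rng.standard_normal((N_REGIONS, EMBED_DIM))
true_weights = rng.standard_normal(EMBED_DIM)
tree_cover = embeddings @ true_weights + 0.1 * rng.standard_normal(N_REGIONS)

# Fit a linear regression probe by least squares on the frozen embeddings.
# A high R^2 would indicate the embeddings linearly encode the concept.
weights, *_ = np.linalg.lstsq(embeddings, tree_cover, rcond=None)
predictions = embeddings @ weights
ss_res = np.sum((tree_cover - predictions) ** 2)
ss_tot = np.sum((tree_cover - tree_cover.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"probe R^2: {r2:.3f}")
```

In the paper's setting the probe targets come from real geospatial labels (land cover, administrative divisions, distances) rather than synthetic data, and the gain from pre-training is measured against embeddings from a non-pre-trained model.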
Jun-6-2024
- Country:
- Asia
- China > Heilongjiang Province
- Daqing (0.04)
- Japan > Honshū
- Chūbu > Aichi Prefecture (0.04)
- Kansai
- Kyoto Prefecture > Kyoto (0.04)
- Shiga Prefecture (0.04)
- Kantō
- Kanagawa Prefecture (0.14)
- Tokyo Metropolis Prefecture > Tokyo (0.14)
- North America > United States
- New York (0.04)
- Genre:
- Research Report > New Finding (0.46)
- Industry:
- Information Technology (0.46)
- Technology: