arXiv Analytics

arXiv:2406.02578 [cs.LG]

Pretrained Mobility Transformer: A Foundation Model for Human Mobility

Xinhua Wu, Haoyu He, Yanchao Wang, Qi Wang

Published 2024-05-29 (Version 1)

Ubiquitous mobile devices are generating vast amounts of location-based service data that reveal in detail how individuals navigate and utilize urban spaces. In this study, we use these extensive, unlabeled sequences of user trajectories to develop a foundation model for understanding urban space and human mobility. We introduce the Pretrained Mobility Transformer (PMT), which leverages the transformer architecture to process user trajectories in an autoregressive manner, converting geographical areas into tokens and embedding spatial and temporal information within these representations. Experiments conducted in three U.S. metropolitan areas over a two-month period demonstrate PMT's ability to capture underlying geographic and socio-demographic characteristics of regions. The proposed PMT excels across various downstream tasks, including next-location prediction, trajectory imputation, and trajectory generation. These results support PMT's capability and effectiveness in decoding complex patterns of human mobility, offering new insights into urban spatial functionality and individual mobility preferences.

Related articles:
arXiv:2211.04878 [cs.LG] (Published 2022-11-09)
Foundation Models for Semantic Novelty in Reinforcement Learning
arXiv:2409.09894 [cs.LG] (Published 2024-09-15)
Estimating Wage Disparities Using Foundation Models
arXiv:2012.02825 [cs.LG] (Published 2020-12-04)
Deep Learning for Human Mobility: a Survey on Data and Models