arXiv Analytics

arXiv:1511.06425 [cs.CV]

First Step toward Model-Free, Anonymous Object Tracking with Recurrent Neural Networks

Quan Gan, Qipeng Guo, Zheng Zhang, Kyunghyun Cho

Published 2015-11-19, Version 1

In this paper, we propose and study a novel visual object tracking approach based on convolutional networks and recurrent networks. The proposed approach is distinct from existing approaches to visual object tracking, such as filtering-based and tracking-by-detection methods, in the sense that the tracking system is explicitly trained off-line to track anonymous objects in a noisy environment. The proposed visual tracking model is end-to-end trainable, minimizing any adverse effect from mismatches in object representation and between the true underlying dynamics and the learned dynamics. We empirically show that the proposed tracking approach works well in various scenarios by generating artificial video sequences with varying conditions: the number of objects, the amount of noise, and the match between training and test shapes.
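The abstract only sketches the architecture at a high level: a convolutional encoder per frame feeding a recurrent network, trained end-to-end to output the tracked object's location. Below is a minimal, hedged sketch of that idea in PyTorch; the layer sizes, the choice of a GRU cell, and the (x, y, w, h) box parameterization are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class ConvGRUTracker(nn.Module):
    """Illustrative CNN encoder + GRU that regresses a bounding box per frame."""

    def __init__(self, hidden_size=256):
        super().__init__()
        # Convolutional encoder: maps each frame to a fixed-length feature vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
            nn.Flatten(),
        )
        # Recurrent core: carries the tracked object's state across frames.
        self.rnn = nn.GRU(input_size=32 * 4 * 4, hidden_size=hidden_size,
                          batch_first=True)
        # Output head: predicts (x, y, w, h) for each frame (an assumed parameterization).
        self.head = nn.Linear(hidden_size, 4)

    def forward(self, frames):
        # frames: (batch, time, channels, height, width)
        b, t, c, h, w = frames.shape
        feats = self.encoder(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
        states, _ = self.rnn(feats)
        return self.head(states)  # (batch, time, 4) box predictions

if __name__ == "__main__":
    model = ConvGRUTracker()
    video = torch.randn(2, 8, 3, 64, 64)   # toy batch of two 8-frame clips
    boxes = model(video)
    print(boxes.shape)                      # torch.Size([2, 8, 4])
```

Because the whole pipeline is differentiable, a regression loss on the predicted boxes (e.g., against ground-truth boxes in the synthetic sequences) can be backpropagated through both the recurrent and convolutional parts, which is what "end-to-end trainable" refers to here.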

Related articles:
arXiv:2010.15740 [cs.CV] (Published 2020-10-29)
Recurrent Neural Networks for video object detection
arXiv:1312.4569 [cs.CV] (Published 2013-11-05, updated 2014-03-10)
Dropout improves Recurrent Neural Networks for Handwriting Recognition
arXiv:1704.04055 [cs.CV] (Published 2017-04-13)
Land Cover Classification via Multi-temporal Spatial Data by Recurrent Neural Networks