Multimodal Classification of Urban Micro-Events

Open Access
Authors
Publication date 2019
Book title MM'19
Book subtitle Proceedings of the 27th ACM International Conference on Multimedia: October 21-25, 2019, Nice, France
ISBN (electronic)
  • 9781450368896
  • 9781450367936
Event 27th ACM International Conference on Multimedia, MM 2019
Pages (from-to) 1455-1463
Publisher New York, NY: Association for Computing Machinery
Organisations
  • Faculty of Economics and Business (FEB) - Amsterdam Business School Research Institute (ABS-RI)
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
  • Faculty of Science (FNWI)
Abstract
In this paper we seek methods to effectively detect urban micro-events. Urban micro-events are events which occur in cities, have limited geographical coverage and typically affect only a small group of citizens. Because of their scale these events are difficult to identify in most data sources. However, by using citizen sensing to gather data, detecting them becomes feasible. The data gathered by citizen sensing is often multimodal and, as a consequence, the information required to detect urban micro-events is distributed over multiple modalities. This makes it essential to have a classifier capable of combining them. In this paper we explore several methods of creating such a classifier, including early, late and hybrid fusion as well as representation learning using multimodal graphs. We evaluate performance in terms of accurate classification of urban micro-events on a real-world dataset obtained from a live citizen reporting system. We show that a multimodal approach yields higher performance than unimodal alternatives. Furthermore, we demonstrate that our hybrid combination of early and late fusion with multimodal embeddings outperforms our other fusion methods.
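To illustrate the distinction the abstract draws between early and late fusion, here is a minimal sketch (not the paper's implementation; the function names, toy linear scorers, and thresholds are illustrative assumptions): early fusion concatenates modality features before a single classifier, while late fusion combines the per-modality classifier scores at decision level.

```python
# Hedged sketch of early vs. late fusion for two modalities (e.g. report
# text and an attached photo). Toy linear scorers stand in for real models.

def early_fusion(text_feats, image_feats, weights, bias):
    """Feature-level fusion: concatenate modality features,
    then apply one classifier over the joint vector."""
    fused = text_feats + image_feats  # concatenation of feature lists
    score = sum(w * x for w, x in zip(weights, fused)) + bias
    return 1 if score > 0 else 0

def late_fusion(text_score, image_score, alpha=0.5):
    """Decision-level fusion: each modality is classified separately;
    the per-modality scores are combined with a weighted average."""
    combined = alpha * text_score + (1 - alpha) * image_score
    return 1 if combined > 0.5 else 0

# Toy usage: 2-dim text features, 2-dim image features.
label_early = early_fusion([0.2, 0.9], [0.1, 0.4], [1.0, 1.0, 1.0, 1.0], -1.0)
label_late = late_fusion(0.8, 0.3)
```

A hybrid scheme, as evaluated in the paper, would use both: feed the concatenated features to one classifier and also combine its output with per-modality scores.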
Document type Conference contribution
Language English
Published at https://doi.org/10.1145/3343031.3350967