Title
Novel anomalous event detection based on human-object interactions
Date Issued
01 January 2018
Access level
open access
Resource Type
conference paper
Author(s)
Universidade Federal de Minas Gerais
Universidade Federal de Ouro Preto
Publisher(s)
SciTePress
Abstract
This study proposes a novel approach to anomalous event detection that collects information from a specific context and is flexible enough to work across different scenes (i.e., the camera does not need to be at the same location or in the same scene for the learning and test stages), allowing our approach to learn normal patterns (i.e., patterns that do not entail an anomaly) from one scene and to be employed in another, as long as both belong to the same context. For instance, our approach can learn normal behavior for a context such as the office environment by watching one particular office and then monitor behavior in another office, without being constrained by aspects such as camera location, optical flow, or trajectories, as required by current approaches. Our approach represents a paradigm shift in anomalous event detection: it exploits human-object interactions to learn normal behavior patterns from a specific context, and these patterns are then used to detect anomalous events in a different scene. The proof of concept presented in the experimental results demonstrates the viability of two strategies that exploit this novel paradigm to perform anomaly detection.
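As a rough illustration of the idea described in the abstract (this is not the authors' actual method; the interaction representation, frequency-based scoring rule, threshold, and class name below are all assumptions for the sketch), a context-level model of normal human-object interactions could look like this:

```python
from collections import Counter


class InteractionAnomalyDetector:
    """Toy sketch: learn normal (object, action) interaction frequencies
    from one scene and flag rare interactions in a different scene of the
    same context. In practice, interactions would come from an upstream
    vision pipeline; here they are plain tuples."""

    def __init__(self, min_normal_freq=0.01):
        # Interactions rarer than this fraction of training observations
        # are treated as anomalous (assumed decision rule).
        self.min_normal_freq = min_normal_freq
        self.counts = Counter()
        self.total = 0

    def fit(self, interactions):
        # interactions: iterable of (object, action) pairs observed in
        # the training scene, e.g. ("keyboard", "typing").
        for interaction in interactions:
            self.counts[interaction] += 1
            self.total += 1

    def is_anomalous(self, interaction):
        # An interaction never (or rarely) seen during training is
        # flagged, regardless of camera location or trajectory.
        if self.total == 0:
            raise RuntimeError("fit() must be called first")
        freq = self.counts[interaction] / self.total
        return freq < self.min_normal_freq


# Usage: train on one office scene, test on a different office scene.
detector = InteractionAnomalyDetector()
detector.fit([("keyboard", "typing"), ("cup", "drinking")] * 50)
print(detector.is_anomalous(("keyboard", "typing")))  # False: normal
print(detector.is_anomalous(("chair", "throwing")))   # True: anomalous
```

Because such a model keys on interaction categories rather than pixel coordinates, optical flow, or trajectories, it can in principle transfer between scenes that share the same context, which is the property the abstract emphasizes.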
Start page
293
End page
300
Volume
5
Language
English
OECD Knowledge area
Other engineering and technologies
Computer sciences
Scopus EID
2-s2.0-85047816697
Resource of which it is part
VISIGRAPP 2018 - Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications
ISBN of the container
978-989-758-290-5
Conference
13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, VISIGRAPP 2018
Sponsor(s)
The authors would like to thank the Brazilian National Research Council – CNPq (Grant #311053/2016-5), the Minas Gerais Research Foundation – FAPEMIG (Grants APQ-00567-14 and PPM-00540-17) and the Coordination for the Improvement of Higher Education Personnel – CAPES (DeepEyes Project).
Sources of information
Directorio de Producción Científica
Scopus