Action and intention recognition of pedestrians in urban traffic
2018 (English). In: 2018 14th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS) / [ed] Gabriella Sanniti di Baja, Luigi Gallo, Kokou Yetongnon, Albert Dipanda, Modesto Castrillón-Santana & Richard Chbeir, Piscataway, N.J.: IEEE, 2018, p. 676-682. Conference paper, Published paper (Refereed)
Abstract [en]
Action and intention recognition of pedestrians in urban settings is a challenging problem for Advanced Driver Assistance Systems, as well as for future autonomous vehicles, in maintaining smooth and safe traffic. This work investigates a number of feature extraction methods in combination with several machine learning algorithms to build knowledge on how to automatically detect the action and intention of pedestrians in urban traffic. We focus on motion and head orientation to predict whether the pedestrian is about to cross the street or not. The work is based on the Joint Attention for Autonomous Driving (JAAD) dataset, which contains 346 video clips of various traffic scenarios captured with cameras mounted behind the windshield of a car. In our experiments, we obtain an accuracy of 72% for head orientation estimation and 85% for motion detection.
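The approach described in the abstract, predicting crossing intention from motion and head-orientation cues with standard machine learning, can be sketched as follows. This is a minimal illustration only: the feature names, synthetic data, and logistic-regression classifier are assumptions for demonstration, not the authors' actual pipeline or the real JAAD annotations.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_synthetic_clips(n=200):
    """Synthetic stand-in for per-clip features such as
    [mean speed, speed variance, head-toward-traffic ratio].
    These feature choices are hypothetical, not from the paper."""
    crossing = rng.normal([1.2, 0.4, 0.8], 0.2, size=(n // 2, 3))  # label 1
    waiting  = rng.normal([0.1, 0.1, 0.3], 0.2, size=(n // 2, 3))  # label 0
    X = np.vstack([crossing, waiting])
    y = np.array([1] * (n // 2) + [0] * (n // 2))
    return X, y

def train_logreg(X, y, lr=0.5, epochs=300):
    """Minimal logistic regression trained by gradient descent,
    standing in for the 'several machine learning algorithms' tested."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability
        g = p - y                               # gradient of log-loss
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

X, y = make_synthetic_clips()
w, b = train_logreg(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = (pred == y).mean()
```

On well-separated synthetic clusters like these, the classifier recovers the crossing/waiting split almost perfectly; the paper's reported 72% and 85% accuracies come from the much harder real-video setting.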
Place, publisher, year, edition, pages
Piscataway, N.J.: IEEE, 2018. p. 676-682
Keywords [en]
Action Recognition, Intention Recognition, Pedestrian, Traffic, Driver Assistance
National Category
Signal Processing
Identifiers
URN: urn:nbn:se:hh:diva-38504
DOI: 10.1109/SITIS.2018.00109
ISI: 000469258400098
Scopus ID: 2-s2.0-85065906502
ISBN: 978-1-5386-9385-8 (electronic)
ISBN: 978-1-5386-9386-5 (print)
OAI: oai:DiVA.org:hh-38504
DiVA, id: diva2:1268689
Conference
The 14th International Conference on Signal Image Technology & Internet Based Systems (SITIS), Hotel Reina Isabel, Las Palmas de Gran Canaria, Spain, 26-29 November, 2018
Projects
SIDUS AIR
Funder
Knowledge Foundation, 20140220
Swedish Research Council
Vinnova
Note
Funding: This work is financed by the SIDUS AIR project of the Swedish Knowledge Foundation under grant agreement number 20140220. Author F. A.-F. also thanks the Swedish Research Council (VR) and Sweden's innovation agency (Vinnova).
Available from: 2018-12-06. Created: 2018-12-06. Last updated: 2020-02-03. Bibliographically approved