A Framework For Automated Analysis of Surrogate Measures of Safety from Video using Deep Learning Techniques

Morten Bornø Jensen, Martin Ahrnbom, Maarten Kruithof, Kalle Åström, Mikael Nilsson, Håkan Ardö, Aliaksei Laureshyn, Carl Johnsson, Thomas B. Moeslund

Publication: Contribution to journal › Conference article in journal › Research › peer review


Traffic surveillance and monitoring are gaining attention as a result of the
increasing number of vehicles on the road and a desire to minimize accidents. In order
to minimize accidents and near-accidents, it is important to be able to judge the safety
of a traffic environment. Traffic analysis can be performed using large quantities
of video data, and computer vision is a powerful tool for reducing the data so that only
sequences of interest are analyzed further. In this paper, we propose a cross-disciplinary
framework for performing automated traffic analysis from both a computer vision researcher's
and a traffic researcher's point of view. Furthermore, we present STRUDL,
an open-source implementation of this framework that computes trajectories of road
users. We use these trajectories to automatically find sequences containing critical events
between vehicles and vulnerable road users in a traffic intersection, an otherwise
time-consuming task.
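The abstract does not specify how critical events are detected from the computed trajectories. As an illustration only, a common surrogate measure of safety is time-to-collision (TTC): the gap between two road users divided by the rate at which that gap is closing, with small values flagging a near-accident. The sketch below (hypothetical helper `min_ttc` and threshold, not taken from the paper) shows how such a measure could be evaluated on a pair of trajectories:

```python
import numpy as np

def min_ttc(traj_a, traj_b, dt=0.1):
    """Approximate minimum time-to-collision between two trajectories.

    traj_a, traj_b: (N, 2) arrays of positions sampled at the same times,
    dt seconds apart. Returns the smallest TTC (in seconds) over frames
    where the two users are closing in on each other, or None if they
    never approach.
    """
    a = np.asarray(traj_a, dtype=float)
    b = np.asarray(traj_b, dtype=float)
    dist = np.linalg.norm(a - b, axis=1)        # gap at each frame
    closing = -np.diff(dist) / dt               # positive when approaching
    # TTC = gap / closing speed; infinite where the gap is not shrinking.
    ttc = np.where(closing > 0,
                   dist[:-1] / np.maximum(closing, 1e-9),
                   np.inf)
    best = ttc.min()
    return None if np.isinf(best) else float(best)

# Illustrative trajectories on crossing paths, sampled at 10 Hz:
car = np.stack([np.linspace(0, 10, 11), np.zeros(11)], axis=1)
bike = np.stack([np.full(11, 10.0), np.linspace(5, 0, 11)], axis=1)
ttc = min_ttc(car, bike, dt=0.1)
critical = ttc is not None and ttc < 1.5        # hypothetical threshold
```

In a video-based pipeline, a filter like this would be applied to every pair of tracked road users so that only the few sequences with low TTC values are kept for manual review, which is the data-reduction role the abstract describes.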
Keywords: Computer vision, data reduction, computer aided analysis, deep learning,
surveillance, tracking, detection, traffic analysis
Journal: Transportation Research Board. Annual Meeting Proceedings
Pages (from-to): 281-306
Status: Published - Jan. 2019
Event: Transportation Research Board (TRB) 98th Annual Meeting - Walter E. Washington Convention Center, Washington, D.C., USA
Duration: 13 Jan. 2019 - 17 Jan. 2019


