
Title: Deep Learning Based Video Analytics For Person Tracking
Publication Type: Conference Paper
Year of Publication: 2020
Authors: Kanna, J. S. Vignesh, Raj, S. M. Ebenezer, Meena, M., Meghana, S., Roomi, S. Mansoor
Conference Name: 2020 International Conference on Emerging Trends in Information Technology and Engineering (ic-ETITE)
Date Published: Feb. 2020
Publisher: IEEE
ISBN Number: 978-1-7281-4142-8
Keywords: convolutional neural network, criminal activity, Deep Learning, deep video, face recognition, Facial features, feature extraction, Gender, identification marks, image classification, learning (artificial intelligence), Metrics, neural nets, object tracking, Pattern recognition, person tracking, police data processing, pubcrawl, resilience, Resiliency, Scalability, search process, security, shirt pattern, spectacle status, surveillance cameras, surveillance video frames, video analytics, video cameras, video frame, video log, video signal processing, video surveillance
Abstract

As people's assets grow, security and surveillance have become matters of great concern. When a criminal activity takes place, the witness plays a major role in nabbing the criminal, usually describing the criminal's gender, dress pattern, facial features, and other identification marks. Based on the identification marks provided by the witness, the criminal is searched for in surveillance footage. Surveillance cameras are ubiquitous, and finding a criminal in a huge volume of surveillance video frames is a tedious process. To automate this search, a novel smart methodology using deep learning is proposed. The method takes gender, shirt pattern, and spectacle status as input to locate the person of interest in the video log. It achieves an accuracy of 87% in identifying the person in the video frame.
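Note: the paper does not publish code. As a rough illustration only, the following is a minimal Python sketch of the attribute-matching idea the abstract describes, where witness-supplied attributes (gender, shirt pattern, spectacle status) are compared against per-frame predictions to narrow the search. The Query and Detection classes, the search function, and the toy data are hypothetical names introduced here, not from the paper, and the underlying CNN classifiers are abstracted away entirely.

    # Hedged sketch: attribute-based person search over surveillance frames.
    # The deep-learning attribute classifiers are assumed to have already
    # produced per-detection labels; only the matching step is shown.

    from dataclasses import dataclass
    from typing import Iterable, List

    @dataclass
    class Query:
        gender: str          # e.g. "male" / "female", as stated by the witness
        shirt_pattern: str   # e.g. "striped", "plain", "checked"
        spectacles: bool     # whether the person wears spectacles

    @dataclass
    class Detection:
        frame_index: int     # frame in the video log where the person appears
        gender: str
        shirt_pattern: str
        spectacles: bool

    def search(detections: Iterable[Detection], query: Query) -> List[Detection]:
        """Return detections whose predicted attributes match the witness query."""
        return [
            d for d in detections
            if d.gender == query.gender
            and d.shirt_pattern == query.shirt_pattern
            and d.spectacles == query.spectacles
        ]

    if __name__ == "__main__":
        # Toy detections standing in for per-frame CNN predictions.
        log = [
            Detection(3, "male", "striped", True),
            Detection(7, "female", "plain", False),
            Detection(12, "male", "striped", True),
        ]
        hits = search(log, Query(gender="male", shirt_pattern="striped", spectacles=True))
        print([d.frame_index for d in hits])  # frames containing a matching person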

URL: https://ieeexplore.ieee.org/document/9077766
DOI: 10.1109/ic-ETITE47903.2020.173
Citation Key: kanna_deep_2020