
Gait recognition using FMCW Radar and Temporal Convolutional Deep Neural Networks

Bernardi M. L.;
2020-01-01

Abstract

The capability of identifying humans in specific scenarios, quickly and accurately, is a critical aspect of various surveillance applications. In this context, classical surveillance systems are based on video cameras, which require high computational and storage resources and are very sensitive to light and weather conditions. In this paper, an efficient deep-learning classifier is used to identify individuals from micro-Doppler data extracted from low-power frequency-modulated continuous-wave (FMCW) radar measurements. Results obtained through the application of a deep temporal convolutional neural network confirm the applicability of deep learning to the problem at hand. The best identification accuracy obtained is 0.949, with an F-measure of 0.88, using a temporal window of four seconds.
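The abstract's pipeline can be illustrated with a minimal sketch: segment a micro-Doppler spectrogram into fixed-length temporal windows (the paper uses four seconds) and apply a dilated causal convolution along the time axis, the core operation of a temporal convolutional network. This is not the paper's implementation; the frame rate, number of Doppler bins, and kernel size below are assumptions for illustration only.

```python
import numpy as np

# Assumed parameters (not from the paper), for illustration only.
FRAME_RATE_HZ = 50      # hypothetical spectrogram frame rate
WINDOW_SECONDS = 4      # temporal window length reported in the abstract
DOPPLER_BINS = 64       # hypothetical number of Doppler bins

def segment_windows(spectrogram, frame_rate=FRAME_RATE_HZ,
                    window_s=WINDOW_SECONDS):
    """Split a (time, doppler) spectrogram into non-overlapping windows."""
    win = frame_rate * window_s
    n = spectrogram.shape[0] // win
    return spectrogram[:n * win].reshape(n, win, spectrogram.shape[1])

def causal_dilated_conv1d(x, weights, dilation=2):
    """Causal dilated convolution along time for one window.

    x: (time, channels); weights: (kernel, channels). Left-padding keeps
    the output length equal to the input length and ensures no future
    frames leak into the output (the defining property of a TCN layer).
    """
    k = weights.shape[0]
    pad = (k - 1) * dilation
    xp = np.pad(x, ((pad, 0), (0, 0)))        # pad only the past side
    out = np.zeros(x.shape[0])
    for t in range(x.shape[0]):
        taps = xp[t : t + pad + 1 : dilation]  # k causal taps
        out[t] = np.sum(taps * weights)
    return out

rng = np.random.default_rng(0)
spec = rng.standard_normal((520, DOPPLER_BINS))  # ~10.4 s of frames
windows = segment_windows(spec)                  # shape (2, 200, 64)
w = rng.standard_normal((3, DOPPLER_BINS)) * 0.1
feat = causal_dilated_conv1d(windows[0], w)      # shape (200,)
```

In a full TCN, several such layers with exponentially increasing dilations are stacked so the receptive field covers the whole four-second window, followed by a classification head over the enrolled identities.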
2020
978-1-7281-6636-0
Deep Learning
Gait Recognition
Low-power radar
Micro-Doppler
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12070/60179
Citations
  • Scopus 13