Applying mmWave Radar Sensors to Vocabulary-Level Dynamic Chinese Sign Language Recognition for the Community With Deafness and Hearing Loss

To facilitate human-computer interaction (HCI) for the community with deafness and hearing loss (D&HL), this article explored the feasibility of recognizing a vocabulary of dynamic Chinese sign language (CSL) with millimeter-wave (mmWave) radar sensors within the scope of data science. Fundamental problems that challenge its application on computers and other electronic devices were addressed from multidisciplinary perspectives, including how to capture signs with an mmWave radar sensor, how to characterize signs from mmWave signals, and how to recognize signs based on the extracted features. Accordingly, this article proposed tentative solutions to the key concerns on this topic. A case study was then given by constructing a lightweight attentive augmented convolutional neural network (CNN) to classify 15 Chinese sign words from radar spectrograms. The network achieved a 98.82% weight reduction and a 3.7% accuracy improvement over the original ResNet-18. Furthermore, a deep convolutional generative adversarial network (DCGAN) was used for data augmentation to alleviate the conflict between the large number of network parameters and the small sample size. The authors look forward to further practical developments in this new area through interdisciplinary collaboration and an evolutionary process. © 2023 IEEE.
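The abstract describes the classifier only at a high level; the sketch below is a minimal, illustrative PyTorch example of a lightweight CNN with a simple channel-attention block classifying single-channel radar spectrograms into 15 sign words. It is not the authors' architecture: the layer sizes, the squeeze-and-excitation style attention, the input shape, and all names are assumptions made for illustration.

```python
# Minimal sketch (assumed architecture, not the paper's): a small CNN with
# channel attention classifying radar spectrograms into 15 sign-word classes.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (illustrative choice)."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # re-weight feature maps channel-wise


class LightweightAttentiveCNN(nn.Module):
    """Small CNN backbone + channel attention for spectrogram classification."""

    def __init__(self, num_classes: int = 15):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.BatchNorm2d(16), nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.BatchNorm2d(32), nn.ReLU(inplace=True),
            ChannelAttention(32),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


if __name__ == "__main__":
    # A batch of single-channel 128x128 Doppler spectrograms (shape is an assumption).
    logits = LightweightAttentiveCNN()(torch.randn(4, 1, 128, 128))
    print(logits.shape)  # torch.Size([4, 15])
```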
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Issue Date
2023-11
Language
English
Article Type
Article
Citation
IEEE SENSORS JOURNAL, v.23, no.22, pp.27273 - 27283
ISSN
1530-437X
DOI
10.1109/JSEN.2023.3324369
URI
http://hdl.handle.net/10203/316129