Deep Neural Network-based Speaker-Aware Information Logging for Augmentative and Alternative Communication
DOI: https://doi.org/10.37965/jait.2021.0017

Keywords: augmentative and alternative communication (AAC), outcome measures, visual logs, hand tracking, deep learning

Abstract
People with complex communication needs can use a high-technology augmentative and alternative communication (AAC) device to communicate with others. Currently, researchers and clinicians often use data logging from high-tech AAC devices to analyze AAC user performance. However, existing automated data logging systems cannot differentiate the authorship of the data log when more than one user accesses the device. This issue reduces the validity of the data logs and increases the difficulty of performance analysis. Therefore, this paper presents a solution that uses a deep neural network-based visual analysis approach to process videos and detect different AAC users in practice sessions. This approach has significant potential to improve the validity of data logs and ultimately to enhance AAC outcome measures.
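To make the general idea concrete, the sketch below shows one way a deep neural network-based hand tracker could be applied to practice-session video to attribute device selections to a speaker. It is a minimal illustration, not the authors' implementation: it assumes an off-the-shelf tracker (MediaPipe Hands), a fixed device region in the camera frame (DEVICE_ROI), and a simple seating arrangement in which the AAC user sits on the left and the communication partner on the right. The function name attribute_touches and the positional heuristic are hypothetical.

```python
"""Minimal sketch (assumed setup, not the paper's method): use a deep neural
network hand tracker to flag which person's hand is over the AAC device region
in a practice-session video, as a cue for data-log authorship."""
import cv2
import mediapipe as mp

# Assumed device location in the frame, normalized (x_min, y_min, x_max, y_max).
DEVICE_ROI = (0.30, 0.50, 0.70, 1.00)


def attribute_touches(video_path: str):
    """Yield (frame_index, presumed_author) whenever a tracked hand overlaps the device ROI."""
    hands = mp.solutions.hands.Hands(
        static_image_mode=False,
        max_num_hands=2,
        min_detection_confidence=0.5,
        min_tracking_confidence=0.5,
    )
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for landmarks in results.multi_hand_landmarks:
                wrist = landmarks.landmark[mp.solutions.hands.HandLandmark.WRIST]
                x_min, y_min, x_max, y_max = DEVICE_ROI
                if x_min <= wrist.x <= x_max and y_min <= wrist.y <= y_max:
                    # Crude positional heuristic based on the assumed seating layout.
                    author = "AAC user" if wrist.x < 0.5 else "communication partner"
                    yield frame_idx, author
        frame_idx += 1
    cap.release()
    hands.close()


if __name__ == "__main__":
    for frame_idx, author in attribute_touches("practice_session.mp4"):
        print(f"frame {frame_idx}: selection likely made by {author}")
```

In practice, the frame indices produced by such a tracker would be aligned with the timestamps in the device's automated data log so that each logged selection can be credited to the person whose hand was over the device at that moment.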