Sensors, Vol. 24, Pages 2519: Human Action Recognition and Note Recognition: A Deep Learning Approach Using STA-GCN

Sensors, doi: 10.3390/s24082519
Authors: Avirmed Enkhbat, Timothy K. Shih, Pimpa Cheewaprakobkit

Human action recognition (HAR) is a growing area of machine learning with a wide range of applications. One challenging aspect of HAR is recognizing human actions while playing music, further complicated by the need to recognize the musical notes being played. This paper proposes a deep learning-based method for simultaneous HAR and musical note recognition in music performances. We conducted experiments on performances of the Morin khuur, a traditional Mongolian instrument. The proposed method consists of two stages. First, we created a new dataset of Morin khuur performances, using motion capture systems and depth sensors to collect hand keypoints, instrument segmentation information, and detailed movement information. We then analyzed RGB images, depth images, and motion data to determine which type of data provides the most valuable features for recognizing actions and notes in music performances. The second stage uses a Spatial Temporal Attention Graph Convolutional Network (STA-GCN) to recognize musical notes as continuous gestures. The STA-GCN model is designed to learn the relationships between hand keypoints and instrument segmentation information, which are crucial for accurate recognition. Evaluation on our dataset demonstrates th...
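The abstract gives no implementation details, so the following is only a minimal sketch of the kind of building block the paper names: a spatial-temporal graph convolution over skeleton keypoints with a learnable attention mask on the joint graph, written in PyTorch. The tensor shapes, layer sizes, joint count, and placeholder adjacency are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a spatial-temporal attention
# GCN block for skeleton sequences shaped (batch, channels, frames, joints).
import torch
import torch.nn as nn

class STAttentionGCNBlock(nn.Module):
    def __init__(self, in_channels, out_channels, num_joints, adjacency):
        super().__init__()
        # Normalized adjacency over hand/instrument keypoints (assumed given).
        self.register_buffer("A", adjacency)
        # Learnable attention mask layered on top of the fixed graph structure.
        self.attention = nn.Parameter(torch.zeros(num_joints, num_joints))
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        # Temporal convolution over the frame axis (kernel 9 is a common choice).
        self.temporal = nn.Conv2d(out_channels, out_channels,
                                  kernel_size=(9, 1), padding=(4, 0))
        self.bn = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        # x: (N, C, T, V) -> aggregate joint features over the attended graph.
        A = self.A + torch.softmax(self.attention, dim=-1)
        x = torch.einsum("nctv,vw->nctw", x, A)
        x = self.spatial(x)    # mix channels per joint
        x = self.temporal(x)   # mix information across frames
        return self.relu(self.bn(x))

if __name__ == "__main__":
    V = 21                 # e.g., 21 hand keypoints (assumed count)
    A = torch.eye(V)       # placeholder adjacency; a real one encodes finger links
    block = STAttentionGCNBlock(3, 64, V, A)
    clip = torch.randn(2, 3, 100, V)   # 2 clips, xyz coords, 100 frames, V joints
    print(block(clip).shape)           # torch.Size([2, 64, 100, 21])
```

The attention mask lets the block re-weight joint pairs beyond the fixed skeleton topology, which is the general intuition behind the "attention" in STA-GCN; how the paper actually couples hand keypoints with instrument segmentation is not specified in the abstract.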