newsence

Towards Cross-Subject EMG Pattern Recognition via Dual-Branch Adversarial Feature Disentanglement

arXiv.org

Cross-subject electromyography (EMG) pattern recognition faces significant challenges due to inter-subject variability in muscle anatomy, electrode placement, and signal characteristics. Traditional methods rely on subject-specific calibration data to adapt models to new users, an approach that is both time-consuming and impractical for large-scale, real-world deployment. This paper presents an approach to eliminate calibration requirements through feature disentanglement, enabling effective cross-subject generalization. We propose an end-to-end dual-branch adversarial neural network that simultaneously performs pattern recognition and individual identification by disentangling EMG features into pattern-specific and subject-specific components. The pattern-specific components facilitate robust pattern recognition for new users without model calibration, while the subject-specific components enable downstream applications such as task-invariant biometric identification. Experimental results demonstrate that the proposed model achieves robust performance on data from unseen users, outperforming various baseline methods in cross-subject scenarios. Overall, this study offers a new perspective for cross-subject EMG pattern recognition without model calibration and highlights the proposed model's potential for broader applications, such as task-independent biometric systems.
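The abstract describes a shared representation split into pattern-specific and subject-specific branches, trained adversarially so the pattern branch becomes subject-invariant. The paper itself provides no code, so the following is only a minimal NumPy sketch of that dual-branch idea under assumed, hypothetical dimensions (channel count, feature sizes, class counts are illustrative, not from the paper); the adversarial element is shown as a gradient-reversal function, a common choice for this kind of disentanglement, though the authors' exact mechanism may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, W, b):
    return x @ W + b

# Hypothetical dimensions (not from the paper): EMG windows flattened to
# 64 features, 6 gesture patterns, 10 training subjects.
D_IN, D_FEAT, N_PATTERNS, N_SUBJECTS = 64, 16, 6, 10

# Shared encoder whose output is split into two feature branches,
# each feeding its own classifier head.
W_enc = rng.normal(0, 0.1, (D_IN, 2 * D_FEAT)); b_enc = np.zeros(2 * D_FEAT)
W_pat = rng.normal(0, 0.1, (D_FEAT, N_PATTERNS)); b_pat = np.zeros(N_PATTERNS)
W_sub = rng.normal(0, 0.1, (D_FEAT, N_SUBJECTS)); b_sub = np.zeros(N_SUBJECTS)

def forward(x):
    h = np.tanh(linear(x, W_enc, b_enc))
    # Disentangle: first half = pattern-specific, second half = subject-specific.
    z_pattern, z_subject = h[:, :D_FEAT], h[:, D_FEAT:]
    pattern_logits = linear(z_pattern, W_pat, b_pat)
    subject_logits = linear(z_subject, W_sub, b_sub)
    return z_pattern, z_subject, pattern_logits, subject_logits

def grad_reversal(grad, lam=1.0):
    # Adversarial training via gradient reversal: during backprop, the
    # subject-classifier's gradient flowing into the pattern branch is
    # negated, pushing z_pattern toward subject-invariance.
    return -lam * grad

x = rng.normal(size=(4, D_IN))          # batch of 4 EMG feature windows
z_p, z_s, p_logits, s_logits = forward(x)
print(p_logits.shape, s_logits.shape)   # (4, 6) (4, 10)
```

At inference on an unseen user, only the pattern branch would be used, which is what makes the calibration-free claim possible: nothing subject-specific needs to be re-estimated.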

Posted 15 days ago

AI-generated summary

This paper proposes an end-to-end dual-branch adversarial neural network that disentangles EMG features into pattern-specific and subject-specific components, achieving calibration-free cross-subject EMG pattern recognition. This enables robust recognition for new users and supports downstream applications such as biometric identification.


Computer Science > Computer Vision and Pattern Recognition

