|dc.description||Title from PDF of title page viewed June 13, 2019||
|dc.description||Thesis advisor: Reza Derakhshani||
|dc.description||Includes bibliographical references (pages 76-79)||
|dc.description||Thesis (M.S.)--School of Computing and Engineering. University of Missouri--Kansas City, 2018||
|dc.description.abstract||In the past, computers, whether personal or at work, required a mouse and keyboard for interaction, and these devices are still used to this day. Even video games require a physical tool (a controller) to interact with the gaming environment. Previously this was acceptable, since that was how these electronic devices were conceived, but with the recent boom in Virtual Reality (VR) and Augmented Reality (AR), that reality has already started to change. VR and AR have existed for well over two decades, yet only in the last five years have they begun to approach their true potential. With this new technology, we can visit virtual worlds, interact with creatures that never previously existed, and visualize information in ways never thought possible before.
With the emergence of VR came the need to change the way we interact with the virtual environment and, with that, the way we interact with technology as a whole. And what better controller for the job than the human hand? If the user can interact with technology through hand gestures, the whole process becomes intuitive, eliminating training time and giving the user a more natural experience. For this, Hand Gesture Recognition (HGR) systems will be needed.
HGR systems recognize the user's hand shape by means of a glove, cameras, or biosignals. One particularly useful biosignal for this task is the forearm electromyographic (EMG) signal, which reflects the contraction state of the forearm muscles. EMG signals are already used in prosthetics to give amputees more natural control over their prosthetic limbs. They can also be used for translating sign language, or more generally in Human-Machine Interaction (HMI).
This work proposes a method for interacting with computers using hand gestures, specifically for the Computer-Aided Design (CAD) software SolidWorks. To achieve this, a commercial EMG armband (the Myo by Thalmic Labs) was used to record 8-channel EMG signals from a group of volunteers over the span of three visits. The data set was then preprocessed and segmented. The resulting data set consisted of 10 hand gestures performed by 10 subjects, with 162 samples per gesture. A total of 11 feature sets were extracted and applied to 4 different machine learning models.
Nine-fold cross-validation and testing were performed, and the classifiers were evaluated and compared across all feature sets. The best validation performance was achieved by the Linear Discriminant Analysis (LDA) model, with an average Area Under the Curve (AUC) of 76.35% and an average Equal Error Rate (EER) of 29.73%.
In future work, we propose applying the HGR method developed in this thesis to multiple applications, such as mapping certain shortcut commands in SolidWorks (and other applications) to hand gestures.||eng
|dc.description.tableofcontents||Introduction -- Related work -- Data collection -- Signal preprocessing -- Machine learning models used -- Results and discussion -- Appendix A. Tables and figures -- Appendix B. IRB approval and consent forms||
|dc.format.extent||xiv, 80 pages||
|dc.publisher||University of Missouri -- Kansas City||eng
|dc.subject.other||Thesis -- University of Missouri--Kansas City -- Engineering||
|dc.title||Hand Gesture Recognition via Electromyographic (EMG) Armband for CAD Software Control||eng
|thesis.degree.discipline||Electrical Engineering (UMKC)||
|thesis.degree.grantor||University of Missouri--Kansas City||
|thesis.degree.name||M.S. (Master of Science)||