Abstract:
Humans learn about the environment by interacting with it. With an increasing
number of robotic and prosthetic devices, there is a need for intuitive Muscle-
Machine Interfaces (MuMIs) that allow the user to have an embodied interaction
with the devices they are controlling. For this reason, in this Ph.D. thesis, we
focus on the analysis, development, and evaluation of Muscle-Machine Interfaces for
the intuitive control of robotic and bionic devices. In particular, we propose three
different types of MuMIs that focus on: i) classification-based intention decoding,
ii) regression-based object motion decoding during the execution of dexterous in-hand
manipulation tasks, and iii) the formulation of shared control schemes that combine
electromyography (EMG) and other external sensors for facilitating the execution
of complex telemanipulation tasks. For the classification based interfaces, a framework
was proposed that combines an EMG based intention decoding scheme with
fiducial markers based pose tracking for intuitive telemanipulation of a dexterous
robot arm hand system. To make such a system completely portable, a new wearable
and portable MuMI that utilizes EMG and forcemyography (FMG) based
sensors was designed and developed. The applicability of this wearable MuMI was
assessed using a soft exoskeleton glove developed for rehabilitation and human capability
augmentation. To facilitate intuitive and dexterous control while manipulating
an object using MuMIs, an EMG-based object motion decoding framework
was proposed. To the best of our knowledge, this was the first study to focus
on the EMG-based decoding of object motion during dexterous manipulation, also
evaluating the relative importance of the muscles of the human hand and forearm.
The decoding models developed in this study were evaluated for their capability
to generalize across different motions, objects, and subjects. Finally, to improve the
intuitiveness of the MuMIs for complex task execution, two shared-control-based
frameworks were proposed that utilize external sensors along with EMG-based
interfaces to assist the user during task execution. The first framework utilizes a
MuMI that enables the user to take over myoelectric control of the robot platform
and perform complex tasks whenever autonomous execution is difficult to achieve
or even infeasible. The second framework combines EMG-based arm motion decoding
with a compliance controller to teleoperate a robotic arm-hand system in
the execution of complex tasks that require the exertion of forces and interaction with
the environment.