Abstract:
Body-centric interaction allows people to perform operations beyond the device's
screen, free from the constraints of a small viewport or input area. Previous
work exploring body-centric interaction has relied on commercial
sensors such as cameras, depth sensors, and Inertial Measurement Units (IMUs).
However, these sensors raise privacy concerns and perform poorly under noise
sources such as occlusion, varying illumination, and hand movement. To address these limitations, this thesis explores
how radar sensing can support body-centric interaction. The first part of
the thesis introduces RadarHand, an on-body interface using a wearable radar-based
system on the wrist for proprioceptive input gestures. I introduce a
gesture design grounded in hand topography, informed by a proprioceptive
gesture analysis. I then group these gestures and train deep learning
models to establish which gesture combinations are most suitable for our
use cases. Finally, I evaluate a real-time model and report its performance
and shortcomings. Next, I introduce RadarDesk, an around-body interface using
radar-based identification (ID) for tangible interactions. RadarDesk can track
and identify objects at tabletop scale. I classify different objects embedded
with low-cost radar reflectors in a tabletop setup. I also introduce Stackable
IDs, where objects can be stacked to produce unique IDs. As a result,
RadarDesk can accurately identify visually identical objects embedded with different
low-cost reflector configurations. Combined with radar's tracking
capability, this enables novel around-body interaction modalities.