I'm a 5th (final) year Ph.D. candidate at the Human-Computer Interaction Institute (HCII), School of Computer Science, Carnegie Mellon University, advised by Prof. Chris Harrison. I'm also a Qualcomm Innovation Fellow.
My research goal is to bring computing and interactivity closer to users by endowing computing devices with knowledge of the immediate physical world around them. Specifically, I have built sensors on mobile devices to enable natural interactions beyond touchscreens. I have also invented and deployed sensing technologies that create sensor feeds, such as the state, count, and rate of activities and events. These feeds enable a broad range of applications, including personal and environmental informatics, assistive and autonomous technologies, context-aware applications, and beyond.
I'm on the job market this year, open to both academic and industry positions. Please feel free to reach out and chat!
[Research focus diagram inspired by Professor Bjoern Hartmann]
Sozu is a low-cost sensing system that can detect a wide range of events wirelessly, through walls and without line of sight, at whole-building scale. Instead of using batteries, Sozu tags convert energy from activities that they sense into RF broadcasts, acting like miniature self-powered radio stations.
Honorable Mention Award
Vibrosight senses activities across entire rooms using long-range laser vibrometry. Unlike a microphone, our approach can sense physical vibrations at one specific point, making it robust to interference from other activities and noisy environments. This property enables detection of simultaneous activities, which has proven challenging in prior work.
Best Paper Award
Wall++ is a low-cost sensing approach that allows walls to become a smart infrastructure. Our wall treatment and sensing hardware can track users' touch and gestures, as well as estimate body pose if they are close. By capturing airborne electromagnetic noise, we can also detect what appliances are active and where they are located.
In this work, we explore the notion of general-purpose sensing, wherein a single, highly capable sensor can indirectly monitor a large context, without direct instrumentation of objects. Further, through what we call Synthetic Sensors, we can virtualize raw sensor data into actionable feeds, whilst simultaneously mitigating immediate privacy issues.
Electrick is a low-cost and versatile sensing technique that enables touch input on a wide variety of objects and surfaces, whether small or large, flat or irregular. This is achieved by using electric field tomography in concert with an electrically conductive material, which can be easily and cheaply added to objects and surfaces.
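To build intuition for how a resistive medium can localize touch, here is a minimal 1D sketch of the underlying idea: current driven across a conductive coating sets up a voltage gradient, and the voltage observed at a touch point reveals where along the strip the finger landed. The function names, drive voltage, and the simple voltage-divider model are illustrative assumptions, not the actual (2D, tomographic) Electrick method.

```python
# Illustrative 1D analogy for electric-field touch sensing on a
# resistive strip: drive one end with v_drive, ground the other,
# and recover touch position from the measured voltage.

def touch_voltage(position, v_drive=1.0):
    """Voltage at a touch point on a uniform resistive strip.
    position is normalized to [0, 1] from the driven end."""
    return v_drive * (1.0 - position)  # simple voltage divider

def estimate_position(measured_v, v_drive=1.0):
    """Invert the voltage-divider model to recover touch location."""
    return 1.0 - measured_v / v_drive

# A touch one quarter of the way along the strip:
v = touch_voltage(0.25)
print(round(estimate_position(v), 2))  # 0.25
```

The real system generalizes this principle to two dimensions by driving many electrode pairs around the periphery and solving a tomographic reconstruction.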
Tomo recovers the interior impedance geometry of a user's arm by measuring the cross-sectional impedances from surface electrodes resting on the skin. We integrated the technology into a prototype wristband, which can classify gestures in real time. Our approach is sufficiently compact and low-power that we envision this technique being integrated into future smartwatches, allowing hand gestures to work together with touchscreens.
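The gesture-classification step can be sketched as follows: each gesture produces a characteristic vector of cross-sectional impedance measurements, and an unseen vector is labeled by its closest match. The toy impedance values, gesture labels, and the nearest-centroid classifier below are all illustrative assumptions; the actual system uses a trained machine-learning model on real EIT data.

```python
# Hedged sketch: classifying gestures from impedance feature vectors.
import math

def nearest_centroid(train, sample):
    """train: {label: list of impedance vectors}. Returns the label
    whose per-class centroid is closest to sample (Euclidean)."""
    best_label, best_dist = None, float("inf")
    for label, vectors in train.items():
        n = len(vectors)
        centroid = [sum(v[i] for v in vectors) / n
                    for i in range(len(vectors[0]))]
        dist = math.dist(centroid, sample)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy 4-element impedance vectors (ohms) per hand gesture:
train = {
    "fist":  [[210, 180, 195, 220], [208, 182, 197, 218]],
    "pinch": [[240, 150, 170, 250], [242, 148, 172, 248]],
}
print(nearest_centroid(train, [209, 181, 196, 219]))  # fist
```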
We improved our prior work on wearable Electrical Impedance Tomography with higher sampling speed and resolution. In turn, this enables superior interior reconstruction and gesture recognition. More importantly, we use our new system as a vehicle for experimentation: we compare two EIT sensing methods and three different electrode resolutions.
We developed a sensing technique for paper to track finger input and also drawn input with writing implements. Importantly, for paper to still be considered paper, our method had to be very low cost. This necessitated research into materials, fabrication methods and sensing techniques. We describe the outcome of our investigations and show that our method can be sufficiently low-cost and accurate to enable new interactive opportunities with this pervasive and venerable material.
ActiTouch allows users to use their hands and arms as readily available touch input surfaces for AR and VR, opening a new interaction opportunity beyond conventional controllers and in-air gestures. We invented a powerful sensor fusion method that combines an electrical approach with computer vision. This enables precise on-skin touch segmentation, which in turn supports fine-grained touch interactions such as scrolling and swiping.
Honorable Mention Award
Interferi is an on-body gesture sensing technique using acoustic interferometry. We use ultrasonic transducers resting on the skin to create acoustic interference patterns inside the wearer’s body, which interact with anatomical features in complex, yet characteristic ways. We focus on two areas of the body with great expressive power: the hands and face.
Honorable Mention Award
The mobility of tablets affords interaction from various user-centric postures, including shifting hand grips, varying screen angles and orientations, and planting the palm while writing or sketching. We propose a Posture-Aware Interface that morphs to a suitable frame of reference, at the right time, and for the right (or left) hand.
Honorable Mention Award
SkinTrack is a wearable system that enables continuous touch tracking on the skin. It consists of a signal-emitting ring and a sensing wristband with multiple electrodes. Due to the phase delay inherent in a high-frequency AC signal propagating through the body, a phase difference can be observed between pairs of electrodes, which we use to compute a 2D finger touch coordinate.
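The phase-difference principle can be illustrated with a small numerical sketch: each electrode observes the same high-frequency tone, delayed by its travel time through the body, and I/Q demodulation against a reference recovers each channel's phase so the pair can be differenced. The sample rate, tone frequency, and channel simulation below are invented for illustration and do not reflect SkinTrack's actual signal parameters or hardware.

```python
# Hedged sketch of phase-difference estimation between two electrodes.
import math

FS = 1_000_000      # sample rate (Hz), assumed
F_TONE = 10_000     # tone frequency (Hz), assumed
N = 1000            # samples per window (an integer number of cycles)

def channel(phase):
    """Simulated electrode signal: the tone with a phase offset."""
    return [math.sin(2 * math.pi * F_TONE * n / FS + phase)
            for n in range(N)]

def measure_phase(samples):
    """I/Q demodulation: correlate with sin/cos references, then
    take the angle of the resulting (I, Q) pair."""
    i = sum(s * math.sin(2 * math.pi * F_TONE * n / FS)
            for n, s in enumerate(samples))
    q = sum(s * math.cos(2 * math.pi * F_TONE * n / FS)
            for n, s in enumerate(samples))
    return math.atan2(q, i)

# Two electrodes observing the tone 0.3 rad apart:
delta = measure_phase(channel(0.5)) - measure_phase(channel(0.2))
print(round(delta, 2))  # 0.3
```

In the real system, such a phase difference (observed across multiple electrode pairs) is mapped to a 2D touch coordinate on the skin.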
AuraSense enhances smartwatches with Electric Field Sensing to support multiple interaction modalities. We identified four electrode configurations that can support six well-known modalities of particular interest and utility, including gestures above the watchface and touchscreen-like finger tracking on the skin.
Pyro is a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning.
We propose an approach where users simply tap a smartphone to an appliance to discover and rapidly utilize contextual functionality. To achieve this, our prototype smartphone recognizes physical contact with uninstrumented appliances through EMI sensing, and summons appliance-specific interfaces.
LumiWatch is the first fully functional and self-contained projection smartwatch, containing the requisite compute, power, projection, and touch-sensing capabilities. Its projected interface offers an interactive area more than five times that of a typical smartwatch display. We demonstrate continuous 2D finger tracking with interactive, rectified graphics, transforming the arm into a touchscreen.