Yang Zhang is a 5th (final) year Ph.D. candidate at the Human-Computer Interaction Institute (HCII), School of Computer Science, Carnegie Mellon University, advised by Prof. Chris Harrison. He is also a Qualcomm Innovation Fellow.
He develops sensing techniques for next-generation human-computer interfaces that bring computing resources to users instead of forcing users to go to computing resources. Specifically, he builds sensors that enable natural interactions on mobile devices beyond touchscreens. He also creates sensors that detect user activities for personal and environmental informatics, which can lead to healthier and more efficient lives.
He publishes at CHI (ACM CHI Conference on Human Factors in Computing Systems) and UIST (ACM Symposium on User Interface Software and Technology), and has received five Best Paper (top 1%) and Honorable Mention (top 5%) awards.
Taxonomies of his completed research can be found below. More exciting projects are on the way.
[Research focus diagram inspired by Professor Bjoern Hartmann]
Y Zhang, Y Iravantchi, H Jin, S Kumar and C Harrison (to appear at UIST 2019) [PDF upon request]
Sozu is a low-cost sensing system that can detect a wide range of events wirelessly, through walls and without line of sight, at whole-building scale. Instead of using batteries, Sozu tags convert energy from the activities that they sense into RF broadcasts, acting like miniature self-powered radio stations.
Honorable Mention Award
Vibrosight senses activities across entire rooms using long-range laser vibrometry. Unlike a microphone, our approach can sense physical vibrations at one specific point, making it robust to interference from other activities and noisy environments. This property enables detection of simultaneous activities, which has proven challenging in prior work.
Best Paper Award
Wall++ is a low-cost sensing approach that allows walls to become a smart infrastructure. Our wall treatment and sensing hardware can track users' touch and gestures, as well as estimate body pose if they are close. By capturing airborne electromagnetic noise, we can also detect what appliances are active and where they are located.
In this work, we explore the notion of general-purpose sensing, wherein a single, highly capable sensor can indirectly monitor a large context, without direct instrumentation of objects. Further, through what we call Synthetic Sensors, we can virtualize raw sensor data into actionable feeds, whilst simultaneously mitigating immediate privacy issues.
Electrick is a low-cost and versatile sensing technique that enables touch input on a wide variety of objects and surfaces, whether small or large, flat or irregular. This is achieved by using electric field tomography in concert with an electrically conductive material, which can be easily and cheaply added to objects and surfaces.
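To illustrate the general idea (not Electrick's actual tomographic reconstruction), here is a minimal sketch of touch localization on a conductive surface: a finger shunts current to ground, lowering the voltages measured at nearby perimeter electrodes, and a weighted centroid over the voltage drops estimates the touch point. The electrode layout, voltages, and function names below are invented for this sketch.

```python
import numpy as np

# Perimeter electrode positions on a hypothetical 10 cm x 10 cm
# conductive surface (clockwise from the origin).
ELECTRODES = np.array([
    [0, 0], [5, 0], [10, 0], [10, 5],
    [10, 10], [5, 10], [0, 10], [0, 5],
], dtype=float)

def localize_touch(baseline_mv, touched_mv):
    """Estimate a touch point from per-electrode voltage deltas.

    A grounded finger shunts current, so voltages drop most at the
    electrodes nearest the touch. Weight each electrode position by
    its (positive) voltage drop and return the centroid, or None if
    nothing changed (no touch).
    """
    drop = np.clip(np.asarray(baseline_mv) - np.asarray(touched_mv), 0.0, None)
    if drop.sum() == 0:
        return None
    weights = drop / drop.sum()
    return weights @ ELECTRODES  # (x, y) in cm

# Simulated measurements: a touch near the (10, 10) corner.
baseline = np.full(8, 500.0)  # mV, untouched surface
touched = baseline - np.array([1, 2, 5, 20, 60, 20, 5, 2], dtype=float)
print(localize_touch(baseline, touched))  # point biased toward (10, 10)
```

A real tomography pipeline solves an inverse problem over many current-injection patterns rather than a single centroid, but the shunting intuition is the same.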
Tomo recovers the interior impedance geometry of a user's arm by measuring the cross-sectional impedances from surface electrodes resting on the skin. We integrated the technology into a prototype wristband, which can classify gestures in real-time. Our approach is sufficiently compact and low-powered that we envision this technique being integrated into future smartwatches to allow hand gestures to work together with touchscreens.
We improved our prior work on wearable Electrical Impedance Tomography with higher sampling speed and resolution. In turn, this enables superior interior reconstruction and gesture recognition. More importantly, we use our new system as a vehicle for experimentation -- we compare two EIT sensing methods and three different electrode resolutions.
We developed a sensing technique for paper to track finger input and also drawn input with writing implements. Importantly, for paper to still be considered paper, our method had to be very low cost. This necessitated research into materials, fabrication methods and sensing techniques. We describe the outcome of our investigations and show that our method can be sufficiently low-cost and accurate to enable new interactive opportunities with this pervasive and venerable material.
Y Zhang, W Kienzle, Y Ma, S S. Ng, H Benko, C Harrison (to appear at UIST 2019) [PDF upon request]
ActiTouch is a new electrical method that enables precise on-skin touch segmentation by using the body as an RF wave-guide. We combine this method with computer vision, enabling a system with both high tracking precision and robust touch detection. Our system can enable touchscreen-like interactions on the skin.
Honorable Mention Award
Interferi is an on-body gesture sensing technique using acoustic interferometry. We use ultrasonic transducers resting on the skin to create acoustic interference patterns inside the wearer’s body, which interact with anatomical features in complex, yet characteristic ways. We focus on two areas of the body with great expressive power: the hands and face.
Honorable Mention Award
The mobility of tablets affords interaction from various user-centric postures, including shifting hand grips, varying screen angle and orientation, and planting the palm while writing or sketching. We propose a Posture-Aware Interface that morphs to a suitable frame of reference, at the right time, and for the right (or left) hand.
Honorable Mention Award
SkinTrack is a wearable system that enables continuous touch tracking on the skin. It consists of a signal-emitting ring and a sensing wristband with multiple electrodes. Due to the phase delay inherent in a high-frequency AC signal propagating through the body, a phase difference can be observed between pairs of electrodes, which we use to compute a 2D finger touch coordinate.
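The phase-difference idea can be sketched in a few lines: a high-frequency carrier propagating through the body arrives at each electrode with a phase offset proportional to its path length, so the phase difference between an electrode pair encodes finger position. The carrier frequency, sample rate, and helper names below are invented for illustration and are not SkinTrack's actual parameters.

```python
import numpy as np

FS = 1_000_000      # sample rate in Hz (assumed)
F_CARRIER = 80_000  # ring's AC signal frequency in Hz (assumed)

def phase_at(signal, freq=F_CARRIER, fs=FS):
    """Extract the carrier phase (radians) via a single-bin DFT."""
    t = np.arange(len(signal)) / fs
    ref = np.exp(-2j * np.pi * freq * t)  # quadrature reference
    return np.angle(np.sum(signal * ref))

def phase_difference(sig_a, sig_b):
    """Wrapped phase difference between two electrode signals (radians)."""
    d = phase_at(sig_a) - phase_at(sig_b)
    return (d + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)

# Simulate: electrode B sees the carrier delayed by 0.5 radians,
# as if the finger touched closer to electrode A.
t = np.arange(1000) / FS
sig_a = np.sin(2 * np.pi * F_CARRIER * t)
sig_b = np.sin(2 * np.pi * F_CARRIER * t - 0.5)
print(phase_difference(sig_a, sig_b))  # ~0.5
```

Doing this for two orthogonal electrode pairs, and mapping each phase difference to a distance, yields the 2D touch coordinate.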
AuraSense enhances smartwatches with Electric Field Sensing to support multiple interaction modalities. We identified four electrode configurations that can support six well-known modalities of particular interest and utility, including gestures above the watchface and touchscreen-like finger tracking on the skin.
Pyro is a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning.
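As a toy illustration of the classification stage (not Pyro's actual pipeline), one can summarize the sensor's 1-D analog waveform with a few statistics and label gestures with a nearest-centroid classifier. The features, gesture names, and simulated waveforms below are all invented for this sketch.

```python
import numpy as np

def features(waveform):
    """Summarize a 1-D pyroelectric waveform with simple statistics."""
    w = np.asarray(waveform, dtype=float)
    return np.array([
        w.std(),                    # overall signal energy
        np.abs(np.diff(w)).mean(),  # average slope (gesture speed)
        np.count_nonzero(np.diff(np.sign(w - w.mean()))) / len(w),  # zero-crossing rate
    ])

class NearestCentroid:
    """Tiny nearest-centroid classifier over feature vectors."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {c: np.mean([x for x, l in zip(X, y) if l == c], axis=0)
                          for c in self.labels}
        return self
    def predict(self, x):
        return min(self.labels, key=lambda c: np.linalg.norm(x - self.centroids[c]))

# Simulated training data: slow "swipe" vs. fast "rub" thumb gestures.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
swipes = [np.sin(2 * np.pi * 2 * t) + rng.normal(0, 0.05, 200) for _ in range(5)]
rubs = [np.sin(2 * np.pi * 12 * t) + rng.normal(0, 0.05, 200) for _ in range(5)]
clf = NearestCentroid().fit([features(w) for w in swipes + rubs],
                            ["swipe"] * 5 + ["rub"] * 5)
print(clf.predict(features(np.sin(2 * np.pi * 12 * t))))  # "rub"
```

In practice one would use richer features and a stronger classifier, but the train-on-features, predict-on-new-waveform flow is the same.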
We propose an approach where users simply tap a smartphone to an appliance to discover and rapidly utilize contextual functionality. To achieve this, our prototype smartphone recognizes physical contact with uninstrumented appliances through EMI sensing, and summons appliance-specific interfaces.
LumiWatch is the first fully-functional and self-contained projection smartwatch implementation, containing the requisite compute, power, projection and touch-sensing capabilities. Our watch offers an interactive surface area more than five times that of a typical smartwatch display. We demonstrate continuous 2D finger tracking with interactive, rectified graphics, transforming the arm into a touchscreen.