Despite the rapid advancement of AI, computers' ability to comprehend human behaviors remains limited. For instance, commodity computing devices still struggle to understand even basic daily activities such as eating and drinking. The primary obstacle is the absence of sensing technologies capable of capturing and interpreting high-quality behavioral data in everyday settings. In this presentation, I will share my research on developing everyday wearables that are minimally obtrusive, privacy-aware, and low-power, yet capable of capturing and comprehending the various body movements and poses that humans employ in their everyday activities. First, I will show how these sensing technologies can empower a range of everyday wearable form factors, including wristbands, necklaces, earphones, headphones, and glasses, to track essential body postures such as facial expressions, gaze, finger poses, and limb poses, as well as gestures on the teeth and tongue. Then, I will demonstrate how, when paired with state-of-the-art AI, these everyday wearables can revolutionize how computers comprehend human behaviors. Specifically, I will focus on applications in activity recognition, accessibility, and health sensing. Finally, I will discuss the prospects and challenges of integrating AI and wearables to support users in the future of everyday computing.