Humans engage in a wide variety of daily activities by constantly interacting physically with their environment. Recording, modeling, and augmenting such physical interactions are fundamental to understanding human behaviors, promoting health monitoring and care delivery, and fostering human-centric intelligent system designs. However, challenges arise from the pervasive and diverse nature of physical interactions: they occur across the human body over extended durations, are subjectively perceived by individuals, and involve diverse input-output modalities. Practically deployable interfaces for physical interactions must therefore be scalable, seamlessly integrated, robust, and adaptable.
In this talk, I will present three integrated textile-based systems for recording, modeling, and augmenting tactile interactions. First, I will introduce digitally machine-knitted, full-sized tactile sensing garments for learning human-environment interactions. Then, I will briefly showcase the recording and modeling of tactile interactions in an ambient sensing scenario via an intelligent carpet. Lastly, I will describe adaptive tactile interaction transfer via digitally embroidered smart gloves. These innovations exemplify the opportunities created by combining digital fabrication and artificial intelligence, enabling seamless observation of human activities, in-depth analysis of interactions with the surroundings, and strategies to augment our behaviors and intelligent systems.