<video controls width="512" height="288" poster="http://streamod.cs.brown.edu:8801/J/Z/lores.jpg" > <source type="video/mp4" src="http://streamod.cs.brown.edu:8801/J/Z/lores.mp4" /> <source type="video/ogg" src="http://streamod.cs.brown.edu:8801/J/Z/lores.ogv" /> <applet code="com.fluendo.player.Cortado.class" archive="/cortado/cortado.jar" width="512" height="288"><param name="url" value="http://streamod.cs.brown.edu:8801/J/Z/lores.ogv"/></applet></video>
Assistant Professor of Computer Science at Dartmouth College
Thursday, February 16, 2017 at 4:00 PM
Room 368 (CIT- 3rd floor)
Unleash Wearable Interactions from the Disappearing Touchscreens
The ubiquitous touchscreen has become the primary mechanism through which users interact with small personal computing devices. As these devices trend smaller and smaller, the user interface itself (e.g., the touchscreen) becomes a primary constraint on miniaturization: screens need to be large enough to be seen, and keyboards need enough physical space to facilitate typing. Arbitrary hardware miniaturization may therefore lead to devices that are not usable. In this talk, I will present my work on extending the interaction space of wearable devices beyond the touchscreen into several new dimensions through 1) new sensing techniques, 2) novel device form factors, and 3) new input mechanisms. I will show examples of new sensing techniques we have developed, including a finger-worn device that allows touch input to be carried out on any surface available to the user, and a smartphone capable of ‘seeing’ a user’s input in the surrounding environment using a front-facing camera augmented with an omni-directional lens. I will also show how tangible interactions can be performed on a small wearable form factor using a dual-display smartwatch, and I will demonstrate several new smartwatch interactions enabled by an actuated watch face. Finally, I will discuss one-handed input on smartwatches, demonstrating that common touchscreen input on a smartwatch can be carried out using only one hand by whirling the wrist of the hand wearing the watch. I will conclude by presenting my vision of how future wearable devices can be developed to improve people’s daily activities.
Xing-Dong Yang is in his second year as an Assistant Professor of Computer Science at Dartmouth College. He completed his
Bachelor of Computer Science in 2005 at the University of Manitoba, Canada. He earned his Master of Computing Science with a specialization in Haptic Interfaces in 2008 from the University of Alberta, Canada, and his Doctorate in Computing Science with a specialization in Human-Computer Interaction in 2013 from the same university. During his graduate studies he was a research intern at Autodesk Research in Toronto and at Microsoft Research Asia in Beijing. His dissertation was awarded the 2013 Bill Buxton Best Canadian HCI Dissertation Award, given annually for the best doctoral dissertation completed at a Canadian university in the field of human-computer interaction. He has over twenty-five publications in top-tier HCI venues, including the ACM Conference on Human Factors in Computing Systems (ACM CHI) and the ACM Symposium on User Interface Software and Technology (ACM UIST). His work has been recognized with best paper nominations at ACM CHI and ACM MobileHCI, and has been featured in the popular press by Discovery News, NBC, and New Scientist.
Xing-Dong’s work is currently funded by Microsoft and the NSF.
Host: Professor Jeff Huang