Brown CS News

Jeff Huang Wins An NSF CAREER Award For Modeling User Touch And Motion Behaviors For Adaptive Interfaces In Mobile Devices


Whether we're balancing a tablet on our stomach to read in bed or holding a smartphone in one hand while pushing a stroller, why shouldn't the user interface be context-specific and ability-specific?

Even a few years ago, asking this question would have been unthinkable or at least highly optimistic, but the pervasiveness of today's mobile devices requires an answer. Assistant Professor Jeff Huang of Brown University's Department of Computer Science is eager to provide it, and he's just won a National Science Foundation (NSF) CAREER Award for his work on modeling user touch and motion behaviors for adaptive interfaces in mobile devices. 

"This award," says Department Chair Ugur Cetintemel, "is a well-deserved recognition of Jeff's exciting teaching and research agenda that integrates granular user modeling methods enabled through practical, non-disruptive human behavioral data capturing techniques." 

"An NSF CAREER award is the gold standard of professional standing for a junior faculty member," adds Professor Andy van Dam, "and it's a great honor for Jeff and for the department that he got one. I especially resonate to his research topic, customizing the User Interface of mobile platforms to particular use cases and users."

CAREER Awards are given in support of outstanding junior faculty teacher-scholars who excel at research, education, and the integration of the two within the context of an organizational mission. Jeff joins multiple previous Brown CS winners of the award, including (most recently) Tim Kraska, Erik Sudderth, and Ben Raphael. His research aims to analyze data captured through interaction with mobile devices, then use it for interface personalization, eventually leading to gains in both usability and accessibility for all users, and especially for underrepresented groups such as combat veterans.

To take just one example out of many, imagine the (perhaps very familiar) frustration of a user who always taps just to the right of a particular button. Simply put, Jeff's research group will address this through three objectives:

  1. investigating how to passively capture touch and motion data, then determine the user's environmental context and analyze their habits and mistakes
  2. incorporating sensors to train an eye-tracking model to detect the user's attention
  3. using information inferred from the first two objectives to improve usability and accessibility 

In the end, this could result in automatic changes to the hittable area of targets, to text size (based on a user's vision level), or to the layout of the interface to prevent inaccurate taps. Anyone who has ever mistakenly launched a banner ad while browsing the Web on a smart device is likely to see the benefit. The research will also be used in a sketching tool that will be deployed for user interface design activities in Jeff's CSCI 1300 User Interfaces course.
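As a rough illustration of how such an adjustment might work, here is a minimal Python sketch, not taken from Jeff's project, that estimates a user's habitual tap offset from a handful of logged touches and grows a button's hittable rectangle in that direction. All names and numbers below are hypothetical.

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Tap:
        x: float          # where the finger actually landed (screen pixels)
        y: float
        target_x: float   # center of the control the user meant to hit
        target_y: float

    def estimate_offset(taps):
        """Average horizontal and vertical miss distance over logged taps."""
        dx = mean(t.x - t.target_x for t in taps)
        dy = mean(t.y - t.target_y for t in taps)
        return dx, dy

    def adjusted_hit_area(rect, dx, dy):
        """Expand the hittable rectangle (left, top, right, bottom) toward the habitual miss."""
        left, top, right, bottom = rect
        return (left + min(dx, 0.0), top + min(dy, 0.0),
                right + max(dx, 0.0), bottom + max(dy, 0.0))

    # Hypothetical log for a user who tends to tap about 12 pixels to the right of targets.
    taps = [Tap(112, 50, 100, 50), Tap(213, 80, 200, 78), Tap(311, 42, 300, 40)]
    dx, dy = estimate_offset(taps)
    print(adjusted_hit_area((90, 40, 110, 60), dx, dy))  # the right edge extends toward the misses

In practice the same idea would have to account for context (walking versus lying down, one hand versus two) before deciding how much to adjust, which is exactly what the passively captured touch and motion data in the first objective is meant to provide.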

"I think we don't like to admit it," Jeff says, "but people are using mobile devices while they're mobile. They're checking directions while walking with groceries, and even swiping while driving. This research seeks to automatically figure out the posture and environment through sensors on the device so interfaces can be more usable and accessible."