Jeff Huang Wins NSF CRII Grant And Salomon Award
- Posted by Jesse Polhemus
- on March 17, 2015

Brown CS community members continue to win noteworthy grants and awards.
Less than two years after his arrival at Brown University’s Computer Science Department, Assistant Professor Jeff Huang has received a Richard B. Salomon Faculty Research Award from Brown’s Office of the Vice President for Research as well as a National Science Foundation (NSF) Computer and Information Science and Engineering (CISE) Research Initiation Initiative (CRII) grant.
CRII is a new program aimed at encouraging research independence among scientists in their first academic position, and Jeff is its first grant recipient at Brown CS. The Salomon Award, given annually, was established to support excellence in scholarly work by funding selected faculty research projects of exceptional merit, with preference given to junior faculty who are building their research portfolios. Jeff joins multiple previous Brown CS winners, including Stefanie Tellex, Rodrigo Fonseca, and Ugur Cetintemel.
“This research is about democratizing eye tracking,” Jeff explains. “It’s extraordinarily useful in applications ranging from human-computer interaction studies to medical research, but the tracking devices are highly specialized, can cost tens of thousands of dollars, and are difficult to calibrate and use. That’s restricted their availability mostly to on-site labs.”
“On the other hand,” he says, “many of us have one of the low-end webcams that are widely available around the world. I intend to provide one of the first opportunities for turning them into tools fit for professional study. This can be done by using user interactions to continuously calibrate the eye tracker during regular activity.”
Jeff’s earlier research reveals that when a user clicks on a web page, they first look where they intend to click, and that while typing, the eye is likely to rest two to four characters to the right of the last typed character on the screen. Webcam images captured during these interactions can be collected by the website and used as cues for what the user’s pupil looks like when the user interacts with a particular location. As the eye-tracking system accumulates this mapping from pupil features to eye-gaze locations on the page, future observations of the pupil can be matched against past instances with similar-looking pupils, allowing a model to infer the eye-gaze location even when the user is not interacting.
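To make the idea concrete, here is a minimal sketch of that interaction-driven calibration loop, written as illustrative Python rather than the project’s actual code. Every name in it is hypothetical: it assumes some upstream vision step has already reduced each webcam frame to a numeric pupil-feature vector, pairs each vector with the screen location of a click or keystroke, and infers gaze for new frames by averaging the locations of the most similar stored pupils.

```python
# Hypothetical sketch of interaction-calibrated gaze mapping.
# Assumes an upstream step (not shown) turns each webcam frame
# into a numeric pupil-feature vector.

import math


class InteractionCalibratedGaze:
    """Maps pupil-feature vectors to on-screen gaze points.

    Each click (or keystroke, offset a few characters right of the
    cursor) contributes one (features, screen_xy) calibration sample;
    later frames are matched to their nearest stored samples.
    """

    def __init__(self, k: int = 3):
        self.k = k
        self.samples: list[tuple[list[float], tuple[float, float]]] = []

    def add_interaction(self, pupil_features: list[float],
                        screen_xy: tuple[float, float]) -> None:
        """Record a calibration sample captured during an interaction."""
        self.samples.append((pupil_features, screen_xy))

    def infer_gaze(self, pupil_features: list[float]) -> tuple[float, float]:
        """Estimate gaze as the mean location of the k most similar pupils."""
        if not self.samples:
            raise ValueError("no calibration samples yet")
        nearest = sorted(
            self.samples,
            key=lambda sample: math.dist(sample[0], pupil_features),
        )[: self.k]
        xs = [xy[0] for _, xy in nearest]
        ys = [xy[1] for _, xy in nearest]
        return (sum(xs) / len(xs), sum(ys) / len(ys))


# Usage: calibrate from two clicks, then infer gaze for a new frame.
tracker = InteractionCalibratedGaze(k=2)
tracker.add_interaction([0.10, 0.80], (120.0, 340.0))  # pupil at click 1
tracker.add_interaction([0.90, 0.20], (900.0, 150.0))  # pupil at click 2
print(tracker.infer_gaze([0.15, 0.75]))                # near click 1
```

A nearest-neighbor average is just one plausible choice for the mapping; a regression or richer model would slot into the same collect-then-match loop, and the key property is that calibration accumulates continuously from ordinary clicks and keystrokes rather than from an explicit calibration session.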
“Collaborating with Jeff is an exceptional experience,” says PhD candidate Alexandra Papoutsaki, one of his collaborators. “He has a diverse background and can bring together concepts from different domains. As part of designing our first eye-tracking algorithm, we’re working with Professor James Hays and his Master’s student, Patsorn Sangkloy.”
“It’s a challenging problem,” Patsorn says, “especially when people’s eyes can be so different. I think we’ve made good progress, and I can’t wait to see where eye tracking could lead.”
“I believe it’s going to play an important role in the development of future technology,” says Alexandra. “Eye tracking is far more inclusive than touch or click interactions and can even be used by people with motor impairments. Once it reaches a stable phase, I think we’ll be surprised by the unexpected uses that appear.”
“This is special because it frees eye tracking from the confines of the lab,” Jeff says. “Sharing my work and source code will set this technique loose so it can be used in a broad range of applications. With no need to install additional software, eye tracking can go anywhere. Think of the possible output: everything from new types of online games to superior website navigation for the impaired to improved search engine results could become part of the natural Web experience of everyday users.”