Thesis Defense
"Shaping the Scientific Hypothesis Generation Process through the Design of Visual Analysis Tools"

Hua Guo

Monday, May 22, 2017 at 2:00 P.M.

Lubrano Room 477 (CIT 4th Floor)

I will present a set of findings and methods that help advance the design of visual analysis tools for scientific reasoning. It is well established that visual analysis tools can facilitate data analysis and insight generation in many data-intensive applications. However, relatively little work has explored how these tools could support more complex reasoning processes, such as the generation of hypotheses in scientific research.

In the first part of the talk, I will present results showing that we can improve the quality of analysis and decision making during hypothesis generation through appropriate choices of visual encodings and visual structures for representing the data. Specifically, I will discuss how the choice of visual variables influences the perception of uncertainty in graphs and how an augmented word tree visualization can help users overcome biases that arise during sequential processing of text data. These generalizable findings can lead to a deeper understanding of the interaction between the human mind and visual analysis tools during scientific reasoning. Such an understanding can be translated into design guidelines that enable visualization researchers and practitioners to design more effective visual analysis tools for other scientific domains and reasoning tasks.

A challenge that accompanies the design of visual analysis tools for supporting reasoning is the difficulty of evaluating their utility. Thus, in the second part of the talk, I will introduce a novel evaluation approach motivated by the benefits and limitations of standard insight-based evaluation. Our approach helps visualization designers pinpoint insight-generating analysis activities by extracting common interaction patterns and correlating them with quantified user insights. I will briefly discuss how we applied this approach to characterize how users accomplish analysis tasks with two visual interfaces, and how it can increase the pace of evaluating and improving such interfaces.

Host: Professor David Laidlaw