CSCI 2300: Human-Computer Interaction Seminar

Spring 2023

This seminar covers methods for conducting research in human-computer interaction (HCI). These topics will be pursued through independent reading, assignments, and class discussion. The seminar comprises four assignments that not only apply HCI research methods but also push beyond what has been done before. The assignments are designed to be meaningful and may even uncover something new in the field; students will also attend HCI faculty candidate talks this semester as part of the course. Readings will teach HCI methods and provide examples of research contributions, sometimes alongside the reviews those papers received when they were evaluated for publication.

The goal of this course is to give students the background necessary to perform research in HCI and the skills required to conduct human-centric research. Students who take this course should have a particular interest in HCI research, or wish to learn fundamental skills for a career in user interface design or usability evaluation. There will be little or no content in this course about interface design, but students will find other topics from CSCI 1300 (User Interfaces) relevant. Enthusiastic students who have not taken CSCI 1300 should have gained HCI experience independently or be graduate students studying a related topic, and should be able to manipulate software and data to investigate the research questions posed in class.

The course is capped at about 20 students, and no waitlist or additional enrollment is possible at this point. The Collaboration Policy should be read and signed in class on February 8.

Main Themes (Spring 2023)

1) HCI methods, especially empirical experimental design
2) social mechanisms in digital communication
3) self-experimentation
4) AI and crowdwork ethics
5) creatively learning and learning creativity

Course Staff

Instructor: Jeff Huang, 245 CIT, jeff at cs dot brown dot edu

Meeting Times

4:30pm-7pm Wednesday at 241 CIT. Office hours on Wed 2:30-3:30pm. The seminar is fully in-person, without a remote option.

Assignments

Evaluating (S)ocial Mechanisms: Compare and contrast fast-prototyped social apps that we design together, each applying different norms and constraints through a variety of mechanisms, to see how small changes affect wellness, trust, privacy, and enjoyment.

Self-E(x)periment Designs: Rather than conducting generalizable experiments on samples of the population, you will perform an N = 1 experiment (a self-experiment) to see how changing your behavior affects your own outcomes.

Constructing an (E)thical Framework: What should modern human subjects review look like for computing studies? Propose a change to the federal "common rule" from which institutional review boards derive their rules, to consider modern perspectives of labor, data ownership, power dynamics, and the risks of deanonymization.

Measuring (C)reativity: We'll review measures of creativity and inspiration that are typically used for human-generated content, and see how they perform on AI-generated content. What is creativity, and is there something about it that's unique to humans?

Grading

Readings should be completed before class on the date the reading is due. For each reading, post to the Slack channel a short, novel comment (not a rephrasing of what someone said earlier) about the research contribution or findings of the work, and a short, novel comment giving your own assessment of the work. Comments written in response to existing comments in the channel are encouraged. Additionally, each reading discussion will be led by two students who will work together to prepare a presentation for the class.

Assignments are due at the beginning of class on the date they are marked "in" in the schedule below, with a midpoint check-in on the dates marked "mid" where we'll discuss progress so far. Students should attend at least 4 faculty candidate talks and submit a combined review of the candidates' research quality and potential, with a final comparison of the HCI research work and visions presented by each candidate.

Grading is done solely by the instructor. The thresholds for A/B/C cutoffs are 90/80/70.

Time Allocation

Total time spent in and out of class for this course is estimated at 180 hours. Over the 15 weeks of the course, students will spend 2.5 hours in class each week (37.5 hours total). Although out-of-class time will vary for individual students, a reasonable estimate to support this course's learning outcomes is 145 out-of-class hours, or on average about 10 hours weekly over the 15-week term. Out-of-class preparation will regularly include about 1-2 hours per class of reading and writing comments on the readings (about 70 hours total), along with presentations. In addition to this ongoing preparation, students are expected to allocate about 65 hours over the term to the four assignments. Finally, approximately 5 hours will be spent attending HCI faculty candidate talks.

Accessibility and Accommodations Statement

Brown University is committed to the full inclusion of all students. Please inform me early in the term if you require accommodations or modification of any course procedures. You may speak with me after class, during office hours, or by appointment. If you need accommodations around online learning or in-classroom accommodations, please reach out to Student Accessibility Services (SAS) for assistance (sas@brown.edu, 401-863-9588). Undergraduates in need of short-term academic advice or support can contact an academic dean in the College by emailing college@brown.edu. Graduate students may contact one of the deans in the Graduate School by emailing graduate_school@brown.edu.

Classwork Schedule

Day | Topics / Readings Due | Assignment
Jan 25 Topic: overview, HCI research, and reading critically
Keshav - How to Read a Paper
Kostakos - The Big Hole in HCI Research
Feb 1 Topic: social relationships and norms
Gilbert - Predicting Tie Strength With Social Media
Wei - TikTok and the Sorting Hat, Seeing Like an Algorithm, and American Idle
(S) out
Feb 8 Topic: social prototyping and evaluation
Bernstein - The trouble with social computing systems research
Epstein - Revisiting Piggyback Prototyping: Examining Benefits and Tradeoffs in Extending Existing Social Computing Systems
(S) mid
Feb 15 Topic: validity
Ernala - Methodological Gaps in Predicting Mental Health States from Social Media: Triangulating Diagnostic Signals
Losh - Reliability, Validity, Causality, And Experiments + Norvig - Warning Signs in Experimental Design and Interpretation
Feb 22 Topic: experiment design
Gwern - Weather and My Productivity (or one of their other QS reports)
Daskalova - Lessons Learned from Two Cohorts of Personal Informatics Self-Experiments
(S) in, (X) out
Mar 1 Topic: interventions, causality
Discussions from online about A/B experiments (read in order) [1] [2] [3] [4]
Munson - The Importance of Starting With Goals in N-of-1 Studies
(X) mid
Mar 8 Topic: crowdwork ethics
Marcus - How I Learned to Stop Worrying and Love the Crowd
Alkhatib - Examining Crowd Work and Gig Work Through The Historical Lens of Piecework
Mar 15 Topic: HCI ethics
Brown - Five Provocations for Ethical HCI Research
Amershi - Guidelines for Human-AI Interaction
(X) analysis
Mar 22 No in-person class, share assignment midpoint (X) in, (E) out
Mar 29 Spring Break
Apr 5 Topic: ideation
Siangliulue - Providing Timely Examples Improves the Quantity and Quality of Generated Ideas
Tohidi - Getting the Right Design and the Design Right: Testing Many Is Better Than One
(E) in, (C) out
Apr 12 Topic: working with participants
Buchenau - Experience Prototyping
Dell - "Yours is Better!" Participant Response Bias in HCI
Apr 19 Topic: design research
Odom - Slow Interaction Design
Zimmerman - Research Through Design as a Method for Interaction Design Research in HCI
(C) mid1 on Apr 17
Apr 26 Class moved to next week (C) mid2
May 3 Topic: systems vs evaluation
Landay - I give up on CHI/UIST + Greenberg - Usability Evaluation Considered Harmful (Some of the Time)
CHI - Selecting a Subcommittee (read the descriptions of each subcommittee and check out abstracts from papers that catch your eye)
(C) in

* We will look at the corresponding reviews for some of these papers in class to see what the original reviewers had to say about them.