Putting Socially Responsible Computing At The Heart Of Our Undergraduate Experience
- Posted by Jesse Polhemus
- on Dec. 15, 2020
A little over a year ago, Brown CS launched the pilot version of the Socially Responsible Computing program, which puts societal and ethical issues at the heart of our undergraduate experience. For decades, we’ve been examining the computer scientist’s impact and role in a world of constant technological acceleration, and this is our most comprehensive effort yet.
It’s markedly different from offerings by our peers, leveraging the strengths of our first-of-its-kind Undergraduate Teaching Assistant (UTA) program, and thousands of students across fourteen courses have already taken part. Below, we situate the initiative among related Brown CS endeavors to date, explore its best practices and lessons learned through the eyes of its participants, and look ahead to its continued evolution in a world that’s seen profound societal change in just the past twelve months.
“We’re in a learning stage on the way to a robust program,” says Department Chair Ugur Çetintemel, “and if we do things well, the line between ethical content and technical content will eventually disappear – we want socially responsible computing to be part of how everyone at Brown CS does things every day.”
Science And Science Fiction
“Current notions of unethical and irresponsible computing were largely considered science fiction until recently,” says Professor Tom Doeppner. Although the concept of ethical computing dates back almost 80 years to work by Norbert Wiener of MIT, early examples aren’t easily found outside of imaginative portrayals like Asimov’s Three Laws of Robotics (1942) or Harlan Ellison’s AM (1967), a fiendish artificial intelligence that brings about the near-extinction of humanity.
“In the early days,” says Professor Andy van Dam, “computer science was about getting the best performance from old algorithms or creating better new ones. Anything new that improved the software ecosystem was perceived as good, and social impact was less important than bigger, cheaper, easier-to-use artifacts.”
“It was a utopian mindset,” adds Brown CS alum Sharon Lo, who now works on Microsoft’s Ethics team, “the idea that technology automatically betters us in all ways.”
One of the earliest Brown CS examples of responsible computing was van Dam’s own Computers for Poets course, which launched in the early 1970s. Taught in BASIC, it showcased applications of computer science in the humanities, with the content divided equally between computational thinking (conditional branching, loops, subprograms) and societal impact (misuse of textual analysis, computers masquerading as people, security versus privacy).
“There were fears of Big Brother,” Andy says, “of having your name in a government database, but we were also looking at the digital divide at the dawn of the online world. At a time when very few people had access to computers, ‘for whom is computing intended’ was a major ethical question of the day.”
As the 1980s gave way to the 1990s, discussions of computing’s influence and reach continued. Professor John Savage’s Computers and Society course, debuting in 1983, maintained a similar focus on societal impact. As Professor Steve Reiss explains, there was also growing attention to well-meaning software that occasionally had disastrous consequences due to poor design or unanticipated edge cases.
“I’d start CS 32 by talking about user interface design causing the plane crash that killed [Brown CS Professor] Paris Kanellakis,” he says. “Ethics in software engineering already had a long history, derived from engineering ethics – the responsibility to get software right so it doesn’t cause harm. Later, there was a big push for universal access and internationalization of web apps, but even that was very different from what we’re talking about now.”
Additional ethical questions were emerging simultaneously in other areas of the field. With the rise of Photoshop and similar applications, deceptive alteration of visual media had become an increasing concern, and Andy added the topic to his graphics courses in the early 2000s, expanding the content over the years that followed to cover topics such as facial recognition and today’s deep fakes.
The new millennium marked the arrival of Professor Roger Blumberg’s CS 9 Computers and Human Values in 2002. Part of Brown’s newly-formed First-Year Seminar Program, it looked at how technology has transformed not only the lives and practices of individuals and institutions, but also accepted ways of thinking about these practices. Roger organized his course around philosophical questions concerning humanity and the future of societies, and his method was to have students read both contemporary and classic texts by computer scientists, historians of science and technology, and philosophers. A primary aim of the course was to allow students to reflect on questions concerning technology by gaining an understanding of how those questions have been posed and answered in the past as well as the present.
“Introducing humanities texts in computer science is challenging,” Blumberg says, “because this kind of reading is not part of a typical CS curriculum. Sometimes computer science students confuse being critical with being cynical, but important texts help make the distinction clear. Conflict may seem like a harsh word to use these days, but the conflict of perspectives about the meaning and consequences of computing, and technology more generally, is essential for a thoughtful, responsible computer science.”
Despite everything that had come before, 2016 was different. Sharon Lo describes the Cambridge Analytica data breach scandal as a wakeup call about the necessity of responsible computing: “I think technology had reached a tipping point. Instead of computers being limited to people of a certain class, or one per household, we were seeing smart devices with us at all times, for a vast number of people worldwide. That had an impact on our culture at large, on the fabric of our society.”
Seeing what he describes as a direct threat to democracy, Andy van Dam added an “IT in the News” preamble to most of the lectures in CSCI 0150. His goal was to have students look at the societal implications of CS, juxtaposing ethical dilemmas like the Trolley Problem with recent advances like the groundbreaking AI work of the AlphaGo team, demonstrating not that technology is evil but that it poses moral quandaries.
Others were doing the same: in 2018, John Savage began including material on artificial intelligence and ethics in CSCI 1800 Cybersecurity and International Relations and also used EMCS 2600 The Future of Cybersecurity: Technology and Policy to pose questions of responsible computing to the first cohort of Brown’s Executive Master’s in Cybersecurity.
2016 was also notable for the creation of a new student group, CS for Social Change (http://cssc.cs.brown.edu), whose mission was to promote dialogue and action within the intersection of technology and social good. They soon found common cause with departmental goals for ethical and responsible computing, and their enthusiastic activism was a major contributor to the breaking of new ground. Thanks in part to their efforts, Brown CS joined the Swearer Center’s Engaged Scholars Program, which in the context of CS challenges students to consider the ethical implications and societal impact of emerging technologies, and also revised its Industry Partners Program to offer complimentary memberships to nonprofits and organizations working in the social good sector.
The group’s efforts also led to a new course. Students Elaine Jiang and Nikita Ramoji approached Ugur Çetintemel with material they’d compiled that united CS and social good, combining it with his own work in the same area to create CSCI 1951I CS for Social Change. The class focused on ongoing project collaborations (essentially, remote internships) with three nonprofits: Elaine’s students worked with Baker Ripley, building a web portal that aims to improve communications between the elderly and people who have been recently placed in the stressful, emotional role of caregivers; Nikita’s team worked with Uplift to build a Chrome extension to autofilter content and prevent online harassment; and UTA Yuta Arai led a group of students who worked with YGA, a Turkish non-profit organization, to build a web portal accessible by a mobile app. The app interfaces with kits containing electronic hardware to help Syrian refugee children learn STEM concepts.
“In hindsight,” van Dam says, “it seems completely unrealistic to think that technology could just be neutral, but it took us a long time to get to that mindset.”
2018 and 2019 also brought new courses with responsible computing at the fore: CSCI 1870 Cybersecurity Ethics, which Adjunct Professor Deborah Hurley describes as being driven by “overwhelming student interest” in how society was being impacted by security and ethical issues; Roger Blumberg’s DATA 0080 Data, Ethics, and Society; and Ugur Çetintemel’s CSCI 1951I CS for Social Change, described above.
Professor Seny Kamara launched CSCI 2950V Topics in Applied Cryptography: Crypto for Social Good in 2019 and CSCI 2952V Algorithms for the People the following year. He says that social responsibility has always been a personal issue, and a sense of obligation led him to integrate topics like algorithmic bias in risk assessment and predictive policing in CSCI 0160 Introduction to Algorithms and Data Structures starting in 2017. In 2018, with collaborators from the University of Utah and Haverford College, he was a winner of the Mozilla Foundation’s Responsible Computer Science Challenge, which is funding a multi-year effort to integrate additional responsible computing content in CSCI 0160.
“What’s now termed responsible computing,” he explains, “is something that was always important for me. When you’re part of a marginalized group, politics and policy affect you directly. It was always surprising to me that society put Silicon Valley on a pedestal without paying attention to its harms and that we, as a field, never made social impact an important goal of CS. Privacy and cryptography, my research areas, were one of Silicon Valley’s first obvious, major failures, so this hit close to home in multiple ways. It was clear we needed to do more.”
A Very Brown Approach
As shown above, the idea of ethics content in CS programs is hardly new. Since as early as the seventies, there have been significant efforts to augment curricula with courses on the ethics and social impacts of computing. “Unfortunately, the single-course model makes it seem that responsible computing is off to the side,” says Professor Kathi Fisler. “In practice, however, these issues should deeply affect the design of technologies in all areas of computing. With so many areas beyond IT now embedding computing into products and policy, we have to prepare students to see these issues within a rapidly-changing landscape.”
That constant change was a key motivation for the Socially Responsible Computing program, which debuted in Fall 2019. (See https://bit.ly/31sP0Rz for a CS News story about its launch.)
“We’d been addressing these issues in many ways over a period of decades,” says Ugur Çetintemel, “but we needed institutional support in order to scale, to do something coordinated and sustainable that could move forward year over year. CS is much more society-facing now, and we knew we had the perfect position to leverage Brown’s strengths and focus on societal impact.”
At the core of the initiative was a newly-formed group of UTAs, the STAs, chosen for their experience with relevant courses and their demonstrated interest in ethical and societal aspects of computing. Several were joint concentrators with backgrounds that included history, modern culture and media, and philosophy. Under the supervision of Brown CS faculty, the STAs worked with course staff to compile materials for use in CS courses as well as other contexts, plan public lectures and workshops, and create customized content embedded in courses that examined social responsibility issues in CS, tying them to real-world challenges.
Jessica Dai was one of two Head STAs that first year. “I was already hyper-aware of how technology was affecting the world,” she says, “but I wasn’t sure what we could offer the community in that area. When Ugur proposed the program, I saw some very clear and achievable steps that we could take.”
The pilot launched with ten STAs, with two also serving as Head STAs. To maximize early exposure, the number of students reached, and impact, the courses that took part were CSCI 0111 Computing Foundations: Data, CSCI 0150 Introduction to Object-Oriented Programming, CSCI 0170 Computer Science: An Integrated Introduction, CSCI 0190 Accelerated Introduction to Computer Science, CSCI 1300 User Interfaces and User Experience, CSCI 1470 Deep Learning, and CSCI 1730 Programming Languages.
“When I was a first-year,” says Hal Triedman, one of the first STAs, “CS ethics hadn’t really grabbed the world’s attention yet, and I think students didn’t realize what the department was doing in that area. But then there was the Engaged Scholars Program, and CS for Social Change, and then the Socially Responsible Computing program. I really liked the department’s decision to formalize things and fully engage. It’s important – these are deep questions, and we need to deal with them in an organized way, covering every topic from philosophical questions to best practices, and not in a repetitive way, or as something that’s just an add-on.”
The STAs developed the following four principles to organize the coverage of ethical/responsible material in courses. Application of the principles varied from course to course:
- It should be consistent with the curriculum and culture of the course.
- At least some component of it should be graded.
- It should be covered as early and as frequently as possible.
- It should be tied to real-world challenges, not abstractions, as much as possible.
One semester later, the program added thirteen more STAs and nine courses: CSCI 0112 Computing Foundations: Program Organization, CSCI 0160 Introduction to Algorithms and Data Structures, CSCI 0180 Computer Science: An Integrated Introduction, CSCI 0320 Introduction to Software Engineering, CSCI 1320 Creating Modern Web and Mobile Applications, CSCI 1420 Machine Learning, CSCI 1430 Computer Vision, CSCI 1660 Introduction to Computer Systems Security, and CSCI 1951A Data Science.
Having undergraduates play a central role in the initiative is a uniquely valuable strategy, says Kathi Fisler. “Other schools,” she notes, “have thrown money at the problem, hiring philosophy postdocs. But we’re taking a very Brown approach, centered on our undergrads – when you get them fired up about something, their energy is what makes us Brown CS. Other schools have added an ethics module here or there, but we say: these are design decisions, you’re about to write code! Focusing on real-world application is part of the shift from ethical computing to responsible computing, and that can’t be taught by philosophy students. We’re doing things in line with our culture, and we think that’ll have better results.”
“I think the Socially Responsible Computing program is special,” says Sharon Lo. “I wasn’t surprised that Brown CS would be the first department I’ve heard of with a comparable program. When I see gaps in my knowledge and think about all the ethical areas we need to learn about, that’s when I miss my time at Brown the most. There are no black and white answers, and we need to wrestle with these topics early on, in a comprehensive way.”
Really Thinking, Really Learning
In the last days of 2019, as each new course about to take part in the Socially Responsible Computing program readied itself for the semester ahead, STAs and faculty members established a wide variety of communication methods, workflows, and outputs. Steve Reiss reports that he talked with his STAs individually, and they all attended TA Camp as part of the general group of UTAs. Professor Doug Woos met with his STAs as a group to strategize where ethical content could be added. Kathi Fisler’s STAs for CSCI 0111 created reading and reflection assignments; conversations with his STAs led Professor Tim Nelson to add guest lecturers to CSCI 0320 who spoke about accessibility issues and what they described as “mission-oriented” careers. Other STAs modified lectures and labs.
Brown CS alum Kielan Donahue explains that responsible computing was part of the very first assignment in Professor James Tompkin’s CSCI 1430 Introduction to Computer Vision: “We had to find a recent image in the news that had been altered, whether by computers or not. I found one where a Ugandan climate activist had been cropped out of a photo where the other subjects were white. Starting the class with examples like that was powerful. Each time we turned in homework, James would review it in class and anonymously read our answers to the ethics questions – he and his STAs did a great job of demanding more in every assignment.”
In addition to her Head STA duties, Jessica Dai served with Hal Triedman as one of Deep Learning’s STAs. Describing herself as always self-educating, she says that they spent a great deal of time adding content to weekly labs. One of them, on removing bias from language models, was the first moment that she remembers walking through the CIT and hearing students enthusing about an ethics assignment.
“That was really special!” she says. “But by no means do I think just changing the curriculum is enough. The other important thing I hope students come away with is understanding the limitations of their knowledge and when to defer to other experts.”
Jessica says that she and a lot of other students have benefited from serving as UTAs, but helping implement the Socially Responsible Computing program was even more valuable for her: “Being an STA was one of the most exciting intellectual processes I’ve had, and it really goes beyond the typical UTA role: truly designing assignments, being formative in how we teach and learn.”
Brown CS alum Purvi Goel was one of Jessica and Hal’s students. She remembers questions attached to problem sets that asked students to look at the consequences of each release. “I really liked the assignment about classifying handwritten numbers,” she says, “and when we looked at the carbon footprint created by deep learning systems – so many students are building these models without thinking about that. I’d like to see more guest lectures on ethics in the real world, more seminars. Deep learning poses all kinds of ethical questions, but deep learning itself isn’t the problem.”
“Sometimes,” says Doug Woos, “students were divided by the discussions. In one, we looked at the protocols for how Google works with local law enforcement, and some students pushed back with ‘if I don’t make software like this, somebody else will’ arguments. But most told me that they really enjoyed the content and wanted to see ethics and responsibility featured in all of their STEM classes. The STAs were awesome, really helpful, and I’m proud of our work together. The assignments were practical, not theoretical – for the Google one, we were actually figuring out how to improve the process of getting to those protocols.”
Blaise Rebman, an Applied Mathematics-Economics concentrator, took 111 and 112. He says there were several classes where a significant amount of time was spent on responsible computing topics that ranged from privacy to algorithmic prediction of recidivism, and he thinks that undergraduate study is the perfect moment for the initiative: “College was my first exposure to CS, so it was an ideal time. The idea that certain algorithms help us be more productive as a society in some ways has led to a lot of ethical considerations being bulldozed, and when we had small group discussions, everyone was really engaged – maybe because they had a similar experience to mine.”
The opportunities to discuss responsible computing in CSCI 1420 Machine Learning are numerous, and their impacts can be massive, notes Professor Stephen Bach. To understand the implications of proper modeling, he tells us, look at the earthquake probability models that preceded the Fukushima nuclear disaster.
“Or think about fairness,” Stephen says. “We all agree that algorithms shouldn’t discriminate, but how do you quantify what’s fair? I was lucky to have three STAs who had taken the class before, and they helped add some extremely important content. One of the activities they created was about a job search platform where machine learning was deciding who got recommended for a job and who didn’t. They explained the dangers of training models on already-labeled data and reinforcing historically biased results. They also showed that an unaware approach, which doesn’t look at race or gender, can be biased. I was so pleased with the passion of my STAs and how they wanted to do this in a high-quality way. They were finding examples that I hadn’t found on my own – really thinking, really learning.”
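The failure of the “unaware” approach that Stephen describes is easy to demonstrate. Below is a minimal sketch using synthetic data (an illustrative example, not the STAs’ actual activity): a job-recommendation model is trained without ever seeing the protected attribute, yet a correlated proxy feature lets it reproduce the bias baked into historical labels.

```python
# A minimal sketch with synthetic data (hypothetical, not the course activity):
# "fairness through unawareness" fails because a proxy feature leaks the
# protected attribute that the model never sees.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)            # protected attribute (never an input)
proxy = group + rng.normal(0, 0.3, n)    # correlated feature, e.g. a zip-code-like signal
skill = rng.normal(0, 1, n)              # legitimately job-relevant feature

# Historical labels encode bias: at equal skill, group 1 was recommended less often.
label = (skill + 0.8 * (1 - group) + rng.normal(0, 0.5, n) > 0.5).astype(int)

# The "unaware" model trains only on skill and the proxy, yet reconstructs group.
X = np.column_stack([skill, proxy])
pred = LogisticRegression().fit(X, label).predict(X)

for g in (0, 1):
    print(f"group {g}: recommended {pred[group == g].mean():.1%} of the time")
```

The recommendation rates for the two groups diverge sharply even though group membership was never an input, and simply dropping the proxy rarely helps in realistic data, where many innocuous-looking features jointly encode group membership. That is why fairness has to be measured rather than achieved by omission.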
Lena Cohen was an STA for Kathi’s CSCI 0180. She explains that ethical content spanned the course, from an initial project on building a search engine to a reading assignment at the end of the semester where students looked at the threat of data voids and misinformation, proposing a design change or algorithmic change to address the issue. “Students were detailed, thoughtful, and took things very seriously,” she says.
But as the semester progressed, she found herself revisiting her expectations as an STA. Others were doing the same. “Seeing a student’s three-paragraph response to an assignment was amazing,” says Heila Precel, a CSCI 0320 STA, “but in retrospect, that’s only because we were expecting that they wouldn’t respond so thoroughly.”
“We learned to ask more follow-ups,” says Lena. “We saw how important it was to require students to justify their ideas.”
One of Kathi’s favorite moments, from CSCI 0111, was a multi-assignment activity that looked at ads being matched to users with increasing levels of sophistication. It was a small success, she says, but the Socially Responsible Computing program had some much bigger ones.
“The sheer fact that we launched the program is a triumph,” she explains. “We had ten faculty working closely with STAs, we got everything off the ground in a way that was tied in with our culture, and we really tapped a nerve among undergrads. Many courses (whether or not they had STAs) now have some great assignments: Shriram [Krishnamurthi] on accessibility and internationalization, Seny on data structures – these are all solidly technical problems. What we do well at Brown CS is think hard about these questions and then get that across to students.”
We’re Going To Equip Them
In early September of 2019, Lena Cohen found herself in Kathi Fisler’s office for an advising meeting. When the conversation took a turn to responsible computing, Fisler proposed the idea of collaborating on a new research project. Cohen was enthused: she enlisted classmates Heila Precel and Hal Triedman (all three were former or current STAs), and what began as creating a learning progression for tech ethics education became a 38-page critical analysis of the Socially Responsible Computing program.
Among other findings, the study revealed a widespread student belief that technological artifacts have an ethics of some kind, that they can serve as a source of injustice, and that computer scientists must consider the unintended ramifications of their work. However, course surveys showed that positive student reception of the initiative wasn’t unanimous.
“Some of the ethical material was rated very highly,” says a faculty member, “but in general, it was more mixed, not consistently rated as highly as other material. I’m not sure why that is yet.”
For Kielan Donahue, attitudinal change is one of the Socially Responsible Computing program’s biggest challenges. “The people who take responsibility most seriously,” she says, “are the people who already care. Some students went in not caring, wrote an answer to get a good grade, and just want to go off and work for a fintech company. It’s hard to reach someone who’s already made up their mind.”
“One of our assignments,” remembers Seny Kamara, “was about lowering the page rank of a web site that contained Flat Earth theories or other misinformation. Some students thought that was censorship and objected, but even that disagreement provoked a discussion – it might appear that page rank is an objective truth to be censored or not, but that isn’t the case. Just because page rank is decided by an algorithm doesn’t mean it’s unbiased.”
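Seny’s point, that an algorithmically computed ranking is not thereby neutral, is visible in the mechanics of PageRank itself. The toy power-iteration sketch below (a generic illustration, not the course’s assignment) makes the design choices explicit: the link graph, the damping factor, and the iteration scheme all shape the final ordering.

```python
# A toy PageRank via power iteration (a generic illustration, not the course
# assignment). Every quantity below is a design choice that shapes the ranking.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # hypothetical 4-page link graph
n, d = 4, 0.85                                 # d: the damping factor

M = np.zeros((n, n))                           # column-stochastic transition matrix
for src, dsts in links.items():
    for dst in dsts:
        M[dst, src] = 1.0 / len(dsts)

rank = np.full(n, 1.0 / n)
for _ in range(100):                           # iterate to (near) the fixed point
    rank = (1 - d) / n + d * M @ rank

print(rank)  # re-run with d=0.5 or one extra link: the ordering can change
```

Changing the damping factor or adding a single link can reorder the results, so demoting a misinformation site is a design intervention of the same kind as the choices already embedded in the algorithm.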
But Lena Cohen sees cause for optimism amid disagreement: “We found some clear connections between problems that some students had with the content and specific issues that the department can fix. That makes solving our growing pains easier.”
Below, we outline some of the most common recommendations that students and faculty provided for improving the Socially Responsible Computing program.
“To start off,” says Kathi Fisler, “it’s hard to understand STAs apart from the UTA program. Right or wrong, HTAs have active roles in managing (and sometimes creating) assignments. Every course already has a back-and-forth between faculty and HTAs, and now we’re adding STAs to the conversation – it’s not easy.”
Some other faculty members agree, with one reporting that they could have used more information at the start of the initiative: “For instance, the process for interviewing and hiring STAs wasn’t clear enough for me.” They suggest that it would be helpful to integrate STAs into the UTA program more fully, clarify their role, further elaborate the qualifications for becoming an STA, and add detail to interviewing and hiring policies.
“I don’t think I used my STAs particularly creatively,” says Tim Nelson. “Course development was where I saw the biggest need. If I could, I’d use them to lead discussion-based sections. More clarity about interviewing and hiring them would be really helpful.”
“My STAs weren’t entirely familiar with the course,” reports one faculty member, “so they had challenges adding content, which is understandable. I think STAs would be more valuable if they had to be UTAs for the course, but with additional training. I wasn’t as clear as I could have been, so there were communication problems when the STAs talked to the HTAs instead of me. None of us knew what to expect, so there was a learning curve.”
Seny Kamara’s experience with the Socially Responsible Computing program, he says, reminds him of the many strengths of the UTA program, but also its limitations: “Being a UTA is a specific and rare opportunity to engage in a very intense way with topics, and being an STA is even more so. There’s a sense of community around UTAing that I haven’t seen elsewhere – at other universities, I was just a number. On the other hand, we have to work hard not to overburden students who may be feeling like they’re spending their whole life in the CIT.”
After an improved integration of STAs into the UTA program, what next? “This time around,” says former Head STA Stanley Yip, “the STAs were almost like a temporary task force, but now we need to think critically. What does integrating this content mean? Courses like 15 and 17 are already so packed with information that it’s hard to ask them to add more content. I’d like to see a complete rethink of how responsible computing is covered across the entire spectrum of Brown CS classes.”
Kathi Fisler feels similarly, citing a lack of coordination across courses: “If we’re not careful, everyone ends up doing algorithmic bias!” It’s a recommendation that Hal Triedman, Heila Precel, and Lena Cohen heard repeatedly from their interviewees: coordinate learning goals across all courses to minimize repetition, with a single faculty member leading the initiative.
“In other words,” says Jessica Dai, “there needs to be a full set of learning goals that spans all classes. This lets the STAs strategize and start early. Otherwise, you have the UTAs sitting in TA camp for eight hours a day, plus weekends, just to keep a course in shape, and then the STAs come in at the last minute and change everything. It’s hard to contribute when all the course planning has already happened.”
Seny Kamara sees creating more structure for the program as a necessary next step. “We need to flesh that out,” he says. “Many of the issues we’re dealing with can be very nuanced, with no clear answers, and sometimes can even have conflicting answers. The STAs need a concrete structure to work from, and it’s our job to equip them with one.”
Structure and coordination across courses, says Doug Woos, are necessary for raising expectations of what students can produce: “It takes a lot of work to go beyond reading assignments and discussion questions, but we can do that by weaving assignments into labs and applications once we see the larger picture. That’s what really lets us make things real-world, which is so important. Right now, a lot of the readings we give students on responsible computing are articles written by journalists for a general audience – I’d like to see us incorporate more academic papers, or articles assuming a CS-fluent reader.”
And one of the initiative’s greatest challenges, says Jessica Dai, is that it relies largely on undergraduates creating content for other undergraduates. “Why should we know more than anyone else?” she asks. As one solution, she recommends training for both STAs and faculty: “We need more robust, more formal education. Brown CS should talk to professionals across various departments, gathering expertise and experience from all their areas.”
Heila Precel agrees: “At this point, students aren’t fully equipped to create content. As a next step, we should hire a trained professional, not just for understanding ethical issues but for teaching ethics pedagogy and content development.”
“When you look at Hal, Heila, and Lena’s report,” Kathi says, “the STAs were frustrated with faculty not having a background in teaching ethics. Right now, we’re not qualified to teach about racism or oppression, but there are people who are, right here on campus, so we need to learn from them.” She mentions that in many cases, faculty set STAs loose without guidelines, so the STAs weren’t sure about the changes they could or should make to courses.
As a response, Seny Kamara and others suggest that training for STAs and faculty should include a focus on establishing standards, defining pedagogy, and creating a framework for STA/faculty collaboration.
“Faculty need standard templates for how to work with STAs,” he says. “There could be three or four different scenarios, depending on the type of class. STAs need to know how to deal with controversies that arise, and faculty have to be involved. Ultimately, we’re responsible – my long-term view is that the Socially Responsible Computing program needs to be informed by faculty. A minimum training for everyone involved is extremely important, to make sure we all have the same knowledge of how to handle certain situations. And it could include prerequisites that can vary: if the course is about cryptography, there could be a surveillance prerequisite, or a social justice one from the Department of Africana Studies.”
“I don’t know if any of us are experts!” says Stephen Bach. “My STAs turned to scholarly sources, just as I would have done. More training is essential, but the pedagogy of this is very new. Next year, I’m interested in how we close the loop and evaluate students better, creating assignments with clear and objectively correct answers: ‘prove this theorem’ versus ‘what are the implications of using this data in your algorithm’. We’re still figuring out how to evaluate that, and we have to keep learning from people who have ethics as their primary focus. It’s hard to do, but I’d like to see the program keep pushing.”
“We’re already at fourteen courses and thousands of students taught,” says Ugur Çetintemel. “That alone is a success. Now there’s a lot of revamping to do, and we’re still answering the question of the most effective way forward. More coordination, more structure, better training, better pedagogy and metrics – all these are going to help us integrate responsible computing into our curriculum more naturally.”
One of Sharon Lo’s biggest hopes for the Socially Responsible Computing program is for students to see that some of the world’s largest companies can change: “It’s worth having these discussions to remind students that these are real-world issues and change is possible – for example, I work for a for-profit company, but we stopped selling facial recognition to law enforcement. We need to get students into an ethical practice. It’s the opposite of the ‘move fast and break things’ philosophy that ends up breaking our culture, the fabric of society. I’d also add law and policy to the curriculum. We really need people who can be a bridge between tech and policy.”
We Need To Sustain This
As this article goes to press, the heat of August has finally relented. The new semester is about to begin. One year into the Socially Responsible Computing program, the scope and seriousness of the initiative and its content are a common theme for Brown CS students and faculty.
“The Socially Responsible Computing program is necessary pushback,” says Tim Nelson, “against all the years that our field spent running blindly after what Oppenheimer called ‘technically sweet’ problems. I once sat in on a lecture by John Hughes where he argued that CS has had as much impact on humanity as the discovery of fire, and I agree with him – there’s nothing alarmist in taking a very serious look at tech that’s had so broad and sometimes so subtle an influence, and that’s what we’re doing.”
But two events, both only months old and both still unfolding, are giving the Brown CS community pause. One of them is the emergence of a pandemic that’s had an enormous impact not just on aspiring computer scientists but on learners everywhere, and of every age.
“A time-crunch due to COVID caused me to scrap some of the ethics content that my STAs and I created,” says one faculty member. “That was really unfortunate. My fear is that in times of crisis, ethics and responsibility can be seen as discretionary, something nice to have but not essential. We’re making progress, but we can’t lose our focus.”
In the year ahead, Kathi Fisler says that the Socially Responsible Computing program will make increasing use of peer review and anonymous feedback, two valuable tools to ensure that the initiative’s focus isn’t lost and its progress continues: “They’re going to be important methods to engage students with the content during times of remote learning. It’s a great way to get everyone talking.”
“The entire conversation of the Socially Responsible Computing program needs to get bigger,” recommends Purvi Goel. “Having a discussion is always better than just answering a question on an assignment, turning it in, and having things end there – it always adds something. I’d like to see discussions not just outside of class but across campus and beyond, making people think and engage even more.”
Lena Cohen says that she doesn’t know all the changes that COVID-19 might bring. “But my personal hope,” she says, “is that if there’s less opportunity for course development, it means more time to carefully and critically create a foundation for the initiative: build an overall structure, nail down learning goals, have more collaboration with external experts – all the things it needs.”
Hal Triedman agrees: “Large-scale curriculum changes with concrete learning objectives don’t happen overnight. You need five or ten years to change things that completely, but I’m blown away at the amount of cultural change I’ve already seen in my four years at Brown.”
The other event still resounding at Brown CS and elsewhere is the nationwide call for anti-racist action that’s followed the killings of Ahmaud Arbery, George Floyd, and Breonna Taylor. Although none were directly related to computer science, a common thread among them is the public’s unwillingness to settle for small or symbolic actions taken in response to any systemic injustice or ethical failure. “All eyes are on us,” says one faculty member, “and the bar has risen considerably.”
“With each of these tragedies,” says Hal Triedman, “it becomes more and more clear how technology is intimately involved with state systems and exerting power. As a group, the STAs can help provide the knowledge and moral clarity to lead, to call our field into account for not addressing these problems. We need to make sure they have the right kind of training to have those conversations, but we all need more facility for putting those conversations in motion. The complexity of this issue shouldn’t prevent us from engaging with it.”
Lena Cohen believes that these are human rights issues, not political ones: “They show us that there are challenges outside this initiative. For instance, I’m very interested in the question of how to increase the number of students and faculty from historically underrepresented and marginalized groups.”
Ugur Çetintemel explains that the Socially Responsible Computing program is still an experiment, but he’s not aware of any analogous program at the same scale. “We’re breaking ground,” he says. “Nobody else is doing this over a four-year curriculum. Our first year was about getting things going – now we have traction, so it’s time for a significant revamping, and I’m pleased that Kathi is going to take a larger role in that. Before, we had courses working individually, but now we’re working together for clear outcomes, metrics, more structure and training, more coordination. We want to make sure we’re doing the best job possible, working in the most meaningful and effective way. Seeing proof that this is worthwhile is what gives us the confidence to move forward.”
Ugur adds that in addition to having Kathi help with planning and learning progression, the initiative will also benefit from a Swearer Center representative who will help train faculty and STAs in writing and assessing reflective questions. An advanced PhD student from Brown’s Department of Modern Culture and Media will provide additional social sciences expertise.
“The past year,” says Kathi, “was an initial cut. Now, we’re agreeing on the set of learning goals, training our STAs and building an infrastructure for them, helping them communicate. We’re going to survive online learning, but we need to start thinking about learning progression and not lose the progress we’ve made with the Socially Responsible Computing program. The last thing in is the easiest thing to chop, and we need to sustain this.”
She pauses for a moment. “At this stage in my career, not much is daunting, but I’ll tell you what is. It’s knowing as the semester is about to begin that I’m not fully qualified to drive – but nobody in our department or almost any CS department is! Brown CS is stepping up, admitting that we’ll screw some things up but that we’ll keep going. Saying ‘but I’m not qualified to talk about this’ isn’t an excuse any more – we’re in the moment where you have to get out of your chair and make yourself qualified, and that’s what we’re doing.”