Since the earliest days of Brown, our students have been eager to combine the pursuit of societal change with rigorous academic study, then use what they’ve learned to benefit humankind at large. In today’s world, where computation has become ubiquitous, this obligation takes on new importance. Users of social media, owners of Internet-enabled appliances, and other inhabitants of the Big Data world are increasingly being confronted with the notion of responsibility for how they use technology or allow it to be used on their behalf.
Here at Brown CS, this responsibility requires acknowledging that computer science isn’t a solely technical discipline, and that its use must go beyond profit-making to address the societal issues of the day. One recent example is our participation in the Swearer Center’s Engaged Scholars Program, which in the context of CS challenges students to consider the ethical implications and societal impact of emerging technologies. Another is our Industry Partners Program, which now offers complimentary memberships to nonprofits and organizations working in the social good sector.
Last year saw yet another example when a group of CS undergraduates formed CS for Social Change, realizing that computation gave them the perfect tools to serve a greater good. Their interests found a common cause with Professor and Department Chair Ugur Çetintemel, and they worked with him over a period of months to create a new course that debuted last semester: CSCI 1951-I CS for Social Change.
This pioneering course now takes its place in a group of classes that includes DATA 0080 Data, Ethics and Society, a course offered by Brown’s Data Science Initiative; CSCI 1870 Cybersecurity Ethics; and several CS courses that include modules on responsible use of CS. Together, we can think of them as a growing emphasis within the Brown CS curriculum that demonstrates computer science’s potential to bring about greater social awareness and societal good. We’ll examine each course in more depth below and devote a series of future articles to the other programs above.
CSCI 1951-I CS for Social Change
“Even before many of us came to Brown,” says Head TA Nikita Ramoji, “we were thinking about the societal impact of CS. Software is very naturalized in the Western world, and it has the potential for reaching all levels of society equally.” Together with student Elaine Jiang, she started coming up with ideas for material that would unite computer science and social good soon after arriving in Providence.
Initially, Nikita and Elaine had thought that the content might work well as an extracurricular activity. But after talking to Brown seniors, the two students found that many of them wanted to be involved with social change but hadn’t found a clear avenue to pursue it. Many students who were anticipating careers in industry explained that if they had learned more about the societal impact of computer science, they might have chosen a different path.
And so they decided to approach Ugur, offering their assistance with material that he’d been working on for some time. The result was a course with student input at every level, from the syllabus to the assignments. “We didn’t know how things would work out,” Nikita says, “but Brown students are passionate about causes, and they want to use their privilege to make a positive impact. That’s really rare. We were so happy to find a cohort of students who agree that a fundamental part of CS education should be learning about the responsible use of products that will affect so many people.”
The course focused on ongoing project collaborations (essentially, remote internships) with three nonprofits. UTA Elaine Jiang’s group worked with Baker Ripley, building a web portal that aims to improve communication between the elderly and people who have recently been placed in the stressful, emotional role of caregiver; Head TA Nikita Ramoji’s team worked with Uplift to build a Chrome extension that automatically filters content to prevent online harassment; and UTA Yuta Arai led a group of students who worked with YGA, a Turkish nonprofit organization, to build a web portal accessible by a mobile app. The app interfaces with kits containing electronic hardware to help Syrian refugee children learn STEM concepts.
The class alternated between hands-on work and an ongoing dialogue in which students asked each other questions and examined issues surrounding taking responsibility for their creations. “We wanted the course to be intersectional,” explains Nikita, “because we wanted everyone to reflect on how applying what they’ve learned makes it real and concrete. We had incredibly talented students in the class, and they had a real interest in social impact.”
One of the most powerful moments, Nikita says, was looking at the midpoint presentations from each team. “It was amazing to see three separate products that will genuinely help people in the real world, outside of the classroom. As computer scientists, when we don’t remind ourselves about the rationale for what we’re doing, it’s very easy to forget the impact of our creations.”
Elaine, who will be one of the HTAs this spring, says, “This year, one of our primary goals is to ensure that all projects completed during the course will be immediately useful to the organizations we’re collaborating with.” Heila Precel, who took the course last spring and will also serve as an HTA this spring, adds, “We’re focusing on increasing the course size and constructing readings and discussions that have greater practical relevance to students.”
CSCI 1870 Cybersecurity Ethics
“Confluence” is the word that Adjunct Professor Deborah Hurley uses to describe the multiple factors that led her to create this class, but first among them was an “overwhelming student interest” in how society is being impacted by security and ethical issues. “Internet-enabled systems are ubiquitous,” she says, “even embedded in human beings. We’re training the future creators of these systems, and we need them to be built in service of humanity.”
Debuting this past summer, the course used a number of different formats, from lectures to discussions to student presentations, with carefully selected readings as accompaniment throughout. Students ranged from first-years through seniors, plus some high school students, and to Deborah’s delight (“Interdisciplinarity is one of my religions!”), they came with varied interests: computer science, materials engineering, applied math, and more. There were multiple students who hadn’t chosen a concentration yet and others from RISD. “Having this variety lets us cross-educate one another,” Deborah says, “and we need an interdisciplinary approach to solve societal problems. This was a cohesive group of students that mixed really well.”
One of them was Alec Fujii, an international student from Tokyo who’s concentrating in engineering. “We had people from all over the world,” he says, “with many different backgrounds. It was really great to see what their perspectives were, and we need that because these problems are so complicated. For me, the main idea of the course is that as technology continues to evolve, how do we cope? How do we keep our rights and values as human beings when bioanalytics and surveillance try to turn us into numbers or make it harder to keep our identities to ourselves?”
And surveillance and bioanalytics are just two of the topics covered in the course. Deborah explains that most students come to Brown with very little ethical or philosophical training, so the first part of the class is devoted to examining the dominant ethical frameworks of this century and the last: utilitarianism, Kantianism, and Aristotelian virtue ethics. With those under their belts, students move through a wide selection of issues at the intersection of technology and ethics, including privacy, access to government information, hate speech, fake news, encryption, and more.
To move the focus away from lecturing, student presentations make up a considerable portion of the course, which is structured around a series of milestones, with Deborah providing feedback after each. “The last thing I want is for students to just be inhaling and regurgitating information,” she says. “My big emphasis is making sure that students learn, that they have one of those aha moments of real insight. I want their projects to be memorable for them, and I was really impressed with the quality and depth that I saw.”
According to Alec, the format of the class had a huge impact. “Most of my engineering classes are lecture-based and this was really my first discussion-based class at Brown. The issues aren’t as black-and-white, so we had to analyze how we look at a problem. And there are always multiple solutions, so we had to think about which would benefit or affect the most people.”
Deborah will continue the class in the summer as well as during the academic year. “Brown really emphasizes the liberal arts as our crown jewel,” she says, “and students from across the university need this material as they choose concentrations. It’s eye-opening, it’s full of things they’d never even thought or heard about. I feel very strongly about ensuring that the next generation both lives happily and benefits society, and this is an important set of issues. It’s vital that students have exposure to them before they leave. Something’s missing in their education if they don’t have that.”
DATA 0080 Data, Ethics and Society
Early in the semester, Professor Roger Blumberg asked his class to write an answer to the following question: “Imagine your typical day at Brown, from the time you wake up to the time you go to sleep; and then imagine you were at Brown thirty years ago. What do you think would be the three most significant differences in your experience as a student at Brown?” What’s really striking, he says, is that if you posed this question in 1987, looking back to 1957, very few answers would have to do with communication. In 2018, nearly all of them do: students invariably mention their use of phones and laptops.
Designed as an undergraduate complement to a component of the Data Science Initiative’s Master’s program, DATA 0080 is what Roger describes as a liberal arts seminar: students read some of the great work that’s been done on technology and society, write responses to prompts, and discuss and debate issues in class.
“I like to start with something concrete,” Roger says, “because I want students to become critical of not just theory but everyday practices. Introducing humanities texts in computer science is challenging – getting people to suspend their disbelief and consider technology in different ways. Sometimes students confuse being critical with being cynical, but important texts help make the distinction clear.”
“I’d never taken any classes in formalized philosophy or morality,” says Erin Bugbee, one of Roger’s students. (She’s pursuing a double concentration in Statistics and Behavioral Decision Sciences.) “Personally, I really enjoyed reading texts in a completely new area. It’s not just about having more material, it’s about having a new perspective, and I wondered why more students in the hard sciences aren’t interested in these issues. I definitely feel like they’d benefit from learning about them.”
Roger notes that some of the older students, whose perspective has already been shaped by time spent in industry, had some interesting observations that differed from those of their younger peers. “It may seem like a harsh word to use these days, but the conflict of ideas is essential for the liberal arts. Is a smartphone a wonderful portal to a wider world or a vortex from which energy and attention will never emerge? The notion of conflict may be difficult for us, but if we minimize it, we eliminate the thresholds for understanding that are so important in education. My goal for the class is for everyone to develop or improve on a free relationship to technology, to take control of technology and not be unwittingly dominated by it.”
Ideas in conflict necessarily produce some discomfort, but Erin didn’t find that to be a problem. “I like how the course exposed you to both the good and the ugly,” she says. “I think our general consensus was that data science isn’t inherently dangerous, but it can be if certain things aren’t accounted for. Since I took the class, I look at projects I’m approaching in a different way, really considering their possible effects on society. When I was a first-year student, I wanted to be a data scientist, and I still do, but I see it differently.”
That change of perspective from someone who intends to use data in their professional life is exactly what Roger is hoping for. “This class looks at how professionals think about ethics,” he says, “and in the case of data science professionals, the consequences of their actions can be significant. Yes, the possibility of using computer science for social good is very real, but big questions need to be talked about between fields. Fortunately, we have wonderful students. The open curriculum encourages them to work across disciplines, and they do. Thirty years ago, I don’t think computer science departments had students of this caliber – they would’ve gone into medicine or law. Now they gravitate to CS.”
For Ugur, there’s a real thrill in being able to work with those students, and to see an interest in societal good shared by Brown, our faculty, and the student population. “I’m delighted to see us emphasize the use of technology to benefit society,” he says. “Our students are concerned about societal change, and so are we. In my course, and in all of these courses, we’re creating models for how to teach applied computer science and how to use CS to achieve tangible, meaningful results that can have a profound impact on people’s lives. We are planning to expand our offerings in this area and integrate these topics more deeply and comprehensively into our curriculum.”