Three Stanford professors want people to press control-alt-delete on how we think about our relationship to Big Tech. In a new book, they seek to empower all of us to create a technological future that supports human flourishing and democratic values.
By Melissa De Witte
Technology is such a ubiquitous part of modern life that it can often feel like a force of nature, a powerful tidal wave that users and consumers can ride but have little power to steer. It doesn't have to be that way.
Rather than just accept the idea that the effects of technology are beyond our control, we must recognize the powerful role it plays in our everyday lives and decide what we want to do about it, said Rob Reich, Mehran Sahami and Jeremy Weinstein in their new book System Error: Where Big Tech Went Wrong and How We Can Reboot (HarperCollins, 2021). The book integrates each of the scholars' unique perspectives - Reich as a philosopher, Sahami as a technologist and Weinstein as a policy expert and social scientist - to show how we can collectively shape a technological future that supports human flourishing and democratic values.
Reich, Sahami and Weinstein first came together in 2018 to teach the popular computer science class CS 181: Computers, Ethics and Public Policy. Their class morphed into the course CS 182: Ethics, Public Policy and Technological Change, which puts students in the roles of engineer, policymaker and philosopher to better understand the inescapable ethical dimensions of new technologies and their impact on society.
Now, building on the class materials and their experiences teaching the content both to Stanford students and professional engineers, the authors show readers how we can work together to address the negative impacts and unintended consequences of technology on our lives and in society.
"We need to change the very operating system of how technology products get developed, distributed and used by millions and even billions of people," said Reich, a professor of political science in the School of Humanities and Sciences and faculty director of the McCoy Family Center for Ethics in Society. "The way we do that is to activate the agency not merely of builders of technology but of users and citizens as well."
How technology amplifies values
Without a doubt, there are many advantages of having technology in our lives. But instead of blindly celebrating or critiquing it, the scholars urge a debate about the unintended consequences and harmful impacts that can unfold from these powerful new tools and platforms.
One way to examine technology's effects is to explore how values become embedded in our devices. Every day, engineers and the tech companies they work for make decisions, often motivated by a desire for optimization and efficiency, about the products they develop. Those decisions often involve trade-offs - prioritizing one objective at the cost of another - and the objectives that get sidelined may be worthy ones.
For instance, users are often drawn to sensational headlines, even if that content, known as "clickbait," is not useful or even truthful. Some platforms have used click-through rates as a metric to prioritize what content their users see. But in doing so, they are making a trade-off that values the click itself over the quality of what is being clicked. As a result, this may lead to a less-informed society, the scholars warn.
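To make that trade-off concrete, here is a minimal, hypothetical sketch in Python - not drawn from the book or from any actual platform - of how the same two items rank differently depending on whether the objective is raw click-through rate or a blend that also weights content quality. The field names, scores and weights are illustrative assumptions.

```python
# Hypothetical sketch of how a ranking metric embeds a value judgment.
# Field names, scores and weights are illustrative assumptions, not taken
# from the book or from any real platform.

from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    predicted_ctr: float    # estimated probability of a click (0 to 1)
    quality_score: float    # e.g., an informativeness or fact-check rating (0 to 1)

def rank_by_clicks(items):
    """Optimize purely for engagement: clickbait tends to rise to the top."""
    return sorted(items, key=lambda i: i.predicted_ctr, reverse=True)

def rank_by_blend(items, quality_weight=0.6):
    """Trade some engagement for informativeness by mixing in a quality score."""
    return sorted(
        items,
        key=lambda i: (1 - quality_weight) * i.predicted_ctr
        + quality_weight * i.quality_score,
        reverse=True,
    )

feed = [
    Item("You won't BELIEVE what happened next", predicted_ctr=0.42, quality_score=0.10),
    Item("City council passes new housing budget", predicted_ctr=0.08, quality_score=0.85),
]

print([i.headline for i in rank_by_clicks(feed)])  # clickbait ranks first
print([i.headline for i in rank_by_blend(feed)])   # the news story ranks first
```

Neither objective is neutral: choosing what to optimize is itself the value judgment the authors want engineers and the public to examine.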
"In recognizing that those are choices, it then opens up for us a sense that those are choices that could be made differently," said Weinstein, a professor of political science in the School of Humanities & Sciences, who previously served as deputy to the U.S. ambassador to the United Nations and on the National Security Council Staff at the White House during the Obama administration.
Another example of embedded values in technology highlighted in the book is user privacy.
Legislation adopted in the 1990s, as the U.S. government sought to speed progress toward the information superhighway, enabled what the scholars call "a Wild West in Silicon Valley" that opened the door for companies to monetize the personal data they collect from users. With little regulation, digital platforms have been able to gather information about their users in a variety of ways, from what people read to whom they interact with to where they go. These are all details about people's lives that they may consider incredibly personal, even confidential.
When data is gathered at scale, the potential loss of privacy gets dramatically amplified; it is no longer just an individual issue, but becomes a larger, social one as well, said Sahami, the James and Ellenor Chesebrough Professor in the School of Engineering and a former research scientist at Google.
"I might want to share some personal information with my friends, but if that information now becomes accessible by a large fraction of the planet who likewise have their information shared, it means that a large fraction of the planet doesn't have privacy anymore," said Sahami. "Thinking through these impacts early on, not when we get to a billion people, is one of the things that engineers need to understand when they build these technologies."
Even though people can change some of their privacy settings to be more restrictive, these features can sometimes be difficult to find on the platforms. In other instances, users may not even be aware of the privacy they are giving away when they agree to a company's terms of service or privacy policy, which often take the form of lengthy agreements filled with legalese.
"When you are going to have privacy settings in an application, it shouldn't be buried five screens down where they are hard to find and hard to understand," Sahami said. "It should be as a high-level, readily available process that says, 'What is the privacy you care about? Let me explain it to you in a way that makes sense.' "
Others may decide to use more private and secure methods for communication, like encrypted messaging platforms such as WhatsApp or Signal. On these channels, only the sender and receiver can see what they share with one another - but issues can surface here as well.
When absolute privacy is guaranteed, the ability of intelligence agencies to scan those messages for planned terrorist attacks, child sex trafficking or other incitements to violence is foreclosed. In this case, Reich said, engineers are prioritizing individual privacy over personal safety and national security, since the use of encryption can not only ensure private communication but can also allow for the undetected organization of criminal or terrorist activity.
"The balance that is struck in the technology company between trying to guarantee privacy while also trying to guarantee personal safety or national security is something that technologists are making on their own but the rest of us also have a stake in," Reich said.
Others may decide to take further control over their privacy and refuse to use some digital platforms altogether. For example, there are increasing calls from tech critics that users should "delete Facebook." But in today's world where technology is so much a part of daily life, avoiding social apps and other digital platforms is not a realistic solution. It would be like responding to the hazards of automobiles by asking people to just stop driving, the scholars said.
"As the pandemic most powerfully reminded us, you can't go off the grid," Weinstein said. "Our society is now hardwired to rely on new technologies, whether it's the phone that you carry around, the computer that you use to produce your work, or the Zoom chats that are your way of interacting with your colleagues. Withdrawal from technology really isn't an option for most people in the 21st century."
Moreover, stepping back is not enough to remove oneself from Big Tech. For example, while a person may not have a presence on social media, they can still be affected by it, Sahami pointed out. "Just because you don't use social media doesn't mean that you are not still getting the downstream impacts of the misinformation that everyone else is getting," he said.
Rebooting through regulatory changes
The scholars also urge a new approach to regulation. Just as there are rules of the road to make driving safer, new policies are needed to mitigate the harmful effects of technology.
While the European Union has passed the comprehensive General Data Protection Regulation (known as the GDPR), which requires organizations to safeguard their users' data, there is no U.S. equivalent. States are trying to cobble together their own legislation - like California's recent Consumer Privacy Act - but it is not enough, the authors contend.
It's up to all of us to make these changes, said Weinstein. Just as companies are complicit in some of the negative outcomes that have arisen, so is our government for permitting companies to behave as they do without a regulatory response.
"In saying that our democracy is complicit, it's not only a critique of the politicians. It's also a critique of all of us as citizens in not recognizing the power that we have as individuals, as voters, as active participants in society," Weinstein said. "All of us have a stake in those outcomes and we have to harness democracy to make those decisions together."
System Error: Where Big Tech Went Wrong and How We Can Reboot is available Sept. 7, 2021.