Even before the release of “the Facebook papers,” three Stanford professors offered their own reform suggestions for big tech.

Cover of 'System Error' book (via Amazon)

In September philosopher Rob Reich, computer scientist Mehran Sahami, and political scientist Jeremy Weinstein released a new book titled “System Error: Where Big Tech Went Wrong and How We Can Reboot.”

A recent New Yorker piece about the book, headlined “Stanford Takes on the Techlash,” argued that the world’s faith in big tech was shaken by a 2016-era realization that “along with the upsides of the digital revolution, such as more efficient burrito delivery, might come downsides, such as the unraveling of Western democracy.”

The book argues that it’s programmers themselves, with their focus on optimization, who combine with society’s larger “aspiration to maximize profit and scale” (and the accompanying tech monopolies) to create a slew of unintended problems.

And then, the authors offered their prescription for fixing things.

Problem Solvers

So who are these ambitious problem solvers?

Sahami worked at Google when it was still a startup, recruited by co-founder Sergey Brin, and built applications “now used by billions of people” (including spam filters for email). He returned to Stanford in 2007 as a CS professor who “wants technologists to understand that the decisions they make in producing code have real social consequences that affect millions of people.”

But the book also needed the perspective of Weinstein — who, before his current role at Stanford, worked in government positions in the White House (during the Obama administration) and with the U.S. ambassador to the United Nations.

“[J]ust as policymakers are ignorant of technology in many ways,” the book claims, “technologists are naive about and perhaps even willfully blind to the importance of public policy and the ways that social science can help us understand, anticipate, and even mitigate the impacts of technology on society.”

And a third perspective was also critical — that of Reich, who leads the university’s Center for Ethics in Society and Institute for Human-Centered AI — and who, the book states, “brings a Socratic orientation, asking probing and uncomfortable questions designed to shake up the perspective of the technologist.”

Stanford professor Rob Reich addresses a group of students

Stanford political science professor Rob Reich asked “probing and uncomfortable questions” of technologists in “System Error.”

The book’s preface argues that Reich ultimately wants to challenge how engineers see their own role. “It’s not enough to be a problem solver without asking deeper questions: is this problem worth solving? Are there particular ways we should solve it given the things we value?”

Referring to themselves as “the technologist, the policymaker, and the philosopher,” the book describes how together the three professors designed a popular course about technological change, covering both its politics and its ethics.

The New Yorker tracked down Stanford’s 2019 announcement of the course (which 300 students took). Instead of “move fast and break things,” the Stanford News headline said, the course “urges students to move responsibly and think about things.”

The book evolved out of the authors’ classroom experience, and opens with “concerning patterns” they’d observed earlier at Stanford. “Innovation and disruption were the buzzwords on campus, and our students broadcasted an almost utopian view that the old ways of doing things were broken and technology was the all-powerful solution.

“Perhaps most disconcerting, the enthusiasm for the digital economy and the moneymaking pipeline from Stanford to Silicon Valley was not tempered by critical reflection on just whose problems were being solved (and whose were ignored), who was benefiting from innovation (and who was losing), and who had a voice (and who remained unheard) in shaping our technological future.”

It’s not just a Stanford problem, they write, citing optimistic tech headlines that betray “a naive optimism we have worked hard to counter in our students.”

“‘Making the world a better place’ has become more a punchline than a real mission statement for major technology companies, underscoring the difficulty many of us face in determining what is truly in the public interest.”

Has Efficiency Run Amok?

Their book’s main argument is that shaping our future begins by directing our attention to “the distinctive mindset” (and growing power) of technologists, and specifically the mindset of optimization. (Part I opens with a quote from Aldous Huxley’s dystopian novel “Brave New World”: “In an age of advanced technology, inefficiency is the sin against the Holy Ghost.”)

Chapter one introduces us to the “efficiency-obsessed tribe of people called coders,” opening the book with an almost audacious attempt to describe the programmer of today. It delves into the field’s early prioritization of optimization, from the work of decision theorist George Dantzig (also a Stanford professor) to canonical problems like the “traveling salesman problem.”
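The traveling salesman problem is a tidy illustration of that optimization mindset: pick a single measurable objective (total distance) and search for the tour that minimizes it. Here’s a minimal, illustrative sketch (not from the book) with made-up city names and distances, using a brute-force search that only works for a handful of cities:

```python
from itertools import permutations

# Hypothetical example: symmetric distances (in miles) between four cities.
DISTANCES = {
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25,
    ("C", "D"): 30,
}

def distance(a: str, b: str) -> int:
    """Look up the distance between two cities, regardless of order."""
    return DISTANCES.get((a, b)) or DISTANCES[(b, a)]

def tour_length(tour: tuple[str, ...]) -> int:
    """Total length of a round trip visiting the cities in order."""
    legs = zip(tour, tour[1:] + tour[:1])  # include the return leg home
    return sum(distance(a, b) for a, b in legs)

def shortest_tour(cities: list[str]) -> tuple[tuple[str, ...], int]:
    """Brute force: try every ordering and keep the shortest.
    Fine for a handful of cities; hopeless at real-world scale."""
    best = min(permutations(cities), key=tour_length)
    return best, tour_length(best)

if __name__ == "__main__":
    tour, length = shortest_tour(["A", "B", "C", "D"])
    print(f"Shortest tour: {' -> '.join(tour)} ({length} miles)")
```

Even at toy scale the framing is seductive: one number to minimize, one clear winner. As the authors go on to point out, the trouble starts when only the things you can measure get optimized.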

Stanford Professor George Dantzig with President Gerald Ford at the National Medal of Science ceremony in 1976.

But then the book’s authors point out that it’s only possible to optimize things you can measure or quantify in some way. They even try to supply some real-world examples of how “what began as a professional mindset for the technologist easily becomes a more general orientation of life.” (For example, the eccentric foodie who founded a multi-million dollar company selling a meal-replacing powder named Soylent.)

The book’s next section is titled “The Deficiency of Efficiency.”

The introduction also tells the story of Joshua Browder, the Stanford student who became a tech-company CEO after creating DoNotPay, an app that helps people challenge parking tickets (with seed funding from Andreessen Horowitz).

The authors cite one profile that found the app eventually helped successfully challenge more than 160,000 parking tickets, keeping $4 million from going into city coffers. What does Browder dream of next? “I would like to hopefully replace lawyers with technology,” he told a reporter, “starting with very simple things like arguing against parking tickets and then moving toward things like pressing a button and suing someone or pressing a button and getting a divorce.”

In this section of the book — recently excerpted at Fast Company — the authors emphasize that Browder “is not a bad person. He just lives in a world where it is normal not to think twice about how new technology companies could create harmful effects.”

They describe him as simply “one recent example of the start-up mindset birthed at Stanford and in Silicon Valley at large,” nudged toward ambition by peers and professors (as well as investors) without considering the larger good of society.

The book’s authors follow Browder’s quote to its logical conclusion, and to the questions it raises.

“Do we really want to live in a society where people can sue at the push of a button? Would divorce be less painful if algorithms and automated systems were making decisions about who should have custody of the kids and how shared property should be divided?”

Whose Goals Matter?

Later in “System Error,” the authors ask readers to consider whose goals are being served — a point which Browder himself recently made on Twitter.

The book is filled with examples and anecdotes. Reich once attended a small dinner hosted by a tech titan who wanted to discuss the pros and cons of founding a new nation to maximize technological/scientific progress. Reich was surprised by the response when he asked if this would be a democracy.

“Democracy? No,” the titan said. “To optimize for science, we need a beneficent technocrat in charge. Democracy is too slow, and it holds science back.”

The book sees this as a core issue: that optimizers, even well-intentioned ones, “fail to measure all that is meaningful.” (And then, later, that they ultimately just impose their own values on the rest of us.)

“A better strategy would replace the blinkered technocratic governance by coders and powerful tech companies with the messy, inefficient, yet empowering process of deciding what values to promote through what we call democracy.”

The authors’ introduction ends by arguing that their book “lays out the dangers of leaving the optimizers in charge, and empowers all of us to make the difficult decisions that will determine how technology transforms our society.”

Beyond Humanity

“System Error” is a lively read. There are pithy subheads like “What is Measurable Is Not Always Meaningful,” “The Wild West of Data Collection,” and “Should Anything Be Beyond Automation?” There’s a whole chapter on free speech, including a subsection titled “The superabundance of speech and its consequences.”

At times, the authors’ rhetoric is bracing. (“If we accept that technology is simply beyond our control, we cede our future to engineers, corporate leaders, and venture capitalists.”) But as a solution, they propose relying on a time-honored system: “the give-and-take of democratic politics,” which “tend to decide things slowly, through deliberation and with the standing possibility of revising any past decision.”

They point to everything from U.S. Senate hearings on social media to the European Union’s General Data Protection Regulation (GDPR) as signs that “a new relationship between government and the tech sector is a real possibility.”

The last chapter offers the authors’ own proposed solutions, calling for “policy changes in how we approach markets so that there are checks on corporate power and monopolistic behavior,” including data portability (between competing social media services) as well as stronger commitments to privacy of data (enforced by government agencies).

They also point out that even just the threat of anti-monopoly enforcement could change the behavior of companies, and argue for restructuring corporate governance. They state that regulation itself should become more responsive and “adaptive,” ultimately suggesting that our lawmakers should face the wrath of voters if they fail to deliver good technological outcomes.

And since the internet is global, the book’s final page notes the challenge of working with an authoritarian China to find “common rules for the digital realm.”

The author’s cat claims his copy of “System Error.”

“System Error” concludes with a remarkable statement: that democracy itself is a kind of technology, “a design for social problem solving whose chief virtues are its defense of individual rights, empowerment of citizens’ voices, and adaptability to ever-changing social conditions …

“It has proven resilient to any number of challenges in the past. The regulation of our technological future will be its next defining challenge.”


WebReduce

  • Math education professor Jo Boaler tells Lex Fridman how she’d reform the teaching of math.
  • Reddit’s AskHistorians forum spawns an online conference exploring how videogames use history.
  • The 13th anniversary of Bitcoin is celebrated with the launch of a decentralized pizzeria named “Bitcoin Pizza.”
  • Archive.org turns 25.
  • The Rolling Stones’ latest tech stunt: recreating a video using Boston Dynamics’ dancing robot dogs.
