To Really 'Disrupt,' Tech Needs to Listen to Actual Researchers

The stereotype of the visionary male founder dominates Silicon Valley. The “move fast and break things” culture rewards those who announce promising new directions with confidence, often neglecting existing resources. It’s how the Valley has disrupted business and society for decades.

Earlier this month, Tristan Harris, cofounder of the Center for Humane Technology, proposed a whole new field of study: "Society & Technology Interaction." The engineers building the technologies we all rely on, he argued, lack social and cultural knowledge. The problem: that well-established field already exists.

We don't need a new discipline. We need to respect, sustain, and strengthen the ones we already have. In fact, college students can learn about these issues in a wide range of existing fields: science and technology studies (STS), communication, sociology, anthropology, political science, cognitive science, and digital humanities, to name a few. Informatics and information science programs dig into the mysteries of databases and information systems and their power in society, too.

Lilly Irani is a professor of Communication and Science Studies at the University of California, San Diego and the author of Chasing Innovation: Making Entrepreneurial Citizens in Modern India. Rumman Chowdhury is the global lead for Responsible AI at Accenture.

The problem is a Silicon Valley culture that celebrates the technical over the social. We disparage things we think are easy by declaring “It’s not rocket science.” We call some knowledge "hard" and some "soft," and we value the former over the latter. Computer science and data science curricula regularly include a single ethics or society class, but the rest of the pedagogy demands that students program computers as if people do not matter.

In particular, the tech industry—organized to hack and disrupt—has been late to appreciate and understand the potential negative implications of algorithms. The story of algorithmic bias is the narrative of the poor and working class, of minorities subject to historic, systemic, and institutionalized racism. These stories are far removed from the daily lives of the residents of Palo Alto, where the median home price is $1.99 million (roughly nine times the national average) and the population is 55 percent white.

Such algorithmic bias is evident in Oakland, where a simulation of the commonly used PredPol police deployment algorithm illustrated a higher propensity to send police officers to neighborhoods with a high proportion of residents from racial minorities, independent of the true crime rate. It's apparent in parts of Los Angeles, where the LAPD scrapped its crime-data programs after an internal audit revealed discriminatory data flaws.
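
The dynamic behind such findings is a feedback loop. The sketch below is a minimal, hypothetical Python simulation—not PredPol's actual model or the auditors' code—assuming two neighborhoods with an identical true crime rate. Because one starts with more recorded incidents, a records-driven deployment rule keeps patrolling it, and only patrolled neighborhoods generate new records.

```python
import random

# Hypothetical illustration of a predictive-policing feedback loop.
# Both neighborhoods have the SAME true crime rate; "A" merely starts
# with more recorded incidents due to historical over-policing.
TRUE_CRIME_RATE = 0.1
recorded = {"A": 20, "B": 10}  # biased historical records

random.seed(0)
for day in range(1000):
    # Deployment rule: patrol wherever the records are thickest.
    patrolled = max(recorded, key=recorded.get)
    # Crime occurs everywhere at the same rate, but only the patrolled
    # neighborhood produces a new record when it does.
    if random.random() < TRUE_CRIME_RATE:
        recorded[patrolled] += 1

print(recorded)  # A's tally keeps growing; B's never changes
```

Nothing about the neighborhoods' actual behavior differs here; the initial bias in the data alone determines where officers are sent, day after day.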

As companies inject algorithms into public life, it can be hard to grasp what is going on. Our newsfeeds circulate misinformation, our job ads are biased, and our algorithms are not socially accountable. Who can possibly be an expert in unpacking the impact technologies have in different cultural settings, on already marginalized people, and on our attention spans? (Hint: It's not app developers or venture capitalists.)

This is an instance of celebrating the new while neglecting to see—and even quietly colonizing—what already exists. Harris' tweet calling for the study of Society & Technology Interaction ignores long-established fields. Often, optimistic and enthusiastic entrepreneurs announce solutions before they have done enough questioning and listening. What if, instead, Harris had asked how to channel more engineers into social studies of computing, or how to amplify socially informed approaches to design?

Students in computing-related fields need cross-training in the social sciences and the humanities. They need the kind of training that equips them to see the social and that values critical thinking. The beautiful messiness of humanity cannot be quantified into data columns, programmed, scaled, and engineered. We need engineering programs that cultivate respect for other kinds of experiential expertise—that of educators, nurses, Uber drivers, and welfare recipients, for example.

Above all, successfully implementing technology requires more than socially wise engineers. Technology workers' recent protests prove that it takes more than ethical judgment to implement ethical technologies. Over the last year, workers across the industry have sounded alarm bells as part of the #techwontbuildit movement, protesting unethical technologies with letters, petitions, work slowdowns, and even walkouts. Ethics guides our sense of right and wrong, but it does not empower democracy in the tech world. For that, students in computing-related fields need a sense of history and society that exceeds design or behavioral knowledge.

What we need is an approach that empowers citizens, civic organizations, and advocacy groups to take collective action. We will make progress when we collaborate with impacted communities to identify the issues that matter most and arrive at inclusive solutions. Community organizations already exist to combat the very real institutional and systemic injustices that plague vulnerable communities. We do not need to clarify the technology; we need to identify how it might exacerbate those challenges.

We should educate all—not just engineers—in how to become active technological citizens. This does not mean learning how to code. It means knowing enough to ask questions about how the code affects work, life, and community. It means understanding how people can hold institutions and firms accountable through literacy, advocacy, and social movements.

Real problem solving means engaging these impacted communities at the design phase and beyond, even if it means putting an end to some projects. Rather than leading with tech principles, developers must work with experienced groups to develop true "human-centric" design—in which the human is more than just the average Silicon Valley resident.

We all have a stake in shaping the future of technology. The last thing we need is a new set of experts. We need to learn from history and the previous contributions of others.

WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here. Submit an op-ed at [email protected]