This blog was authored by UC Berkeley Othering & Belonging Institute experts Emnet Almedom, Nicole Montojo, and Eli Moore. The ideas expressed in this post belong to the authors and are not necessarily those of the Othering & Belonging Institute or UC Berkeley.
What would it take to collectively own our data? How could we regulate the environmental impact of resource-intensive technologies? These are just some of the big questions about technology facing us on a global scale.
In our 2021 landscape scan on the role of artificial intelligence in the COVID-19 era, we examined solutions across the spectrum from industry reform to government regulation to community and worker power-building. This exploratory work makes clear that no one company’s desire to do good can solve a structural problem. The conditions that brought corporations, including technology companies, to such power are shaped by public policy choices – from corporate tax breaks to anti-worker labor law. Thus, the movement to rebuild public power requires making transformational changes in the public sphere.
In this necessary transition towards an economy that centers democratic decision-making, care, and ecological flourishing, we will all play a role: workers at tech company headquarters, consumers and users of technology, warehouse workers, contract and temp workers, and the communities with tech companies as their neighbors. This reflects the reality we see in our day-to-day work alongside a community of partners across California, across the nation, and across the globe. For example, our narrative research in the Inland Empire points to the work ahead of building bridges between warehouse workers and community members to find solidarity around the labor protections needed for equitable economic development.
Much of the work to envision equitable futures for the technology sector precedes us and has especially been driven by Black women, other women of color, and others with the lived experience of being targeted, misidentified, or redlined by today’s technological tools. This includes the work of scholars like Safiya Umoja Noble, Sarah T. Roberts, Ruha Benjamin, I’Nasah Crockett, and Sydette Harry, who theorized, foresaw, and directly experienced these issues long before they were acknowledged by policymakers or industry leaders. We want to lift up some examples of such work as powerful resources for learning about visionary organizing, research, and coalition-building towards a tech sector rooted in belonging values. You’ll find a full list of our recommended readings and explorations at the bottom of this page.
But of course we want to offer a tl;dr (too long; didn’t read) version to encourage you to dig deeper into some of the visions! We lift up these six actions to make belonging real in tech as a starting list of policy, solidarity-building, and democratic power-building solutions, drawing on our expertise as social scientists and community-engaged researchers focused on structural racism.
- Implement equitable regional development for shared prosperity. In the regions where Google and other large tech corporations have developed their operations, the economic benefits have been unevenly distributed. Housing prices have dramatically increased, while wages of many workers not directly employed in tech have stayed low. The immense wealth these tech corporations have accumulated should be put toward affordable housing, toward ensuring that indirectly employed workers have living wages and benefits, and toward other forms of equitable development. A clear community vision for how this would look at Google’s proposed new campus was put forward by local community leaders, and, after four years of organizing, Silicon Valley Rising negotiated community benefits. The work ahead in Silicon Valley and other regions where tech companies expand is for communities to have true governing power over community benefits.
- Invest in community and collective well-being rather than surveillance technologies that maintain structures that “other.” In 2020, tech companies like Microsoft, Amazon, and Google heeded long-standing calls to pause the sale of facial recognition technology to law enforcement. But we can and should go beyond an opt-in model for serious, world-shifting decisions about the proliferation of dangerous surveillance technologies. Companies and governments investing in the production, improvement, and dissemination of surveillance technologies are taking advantage of existing breakages in our society: breakages in who is fully seen as human and deserving of care. The Algorithmic Justice League (AJL) – led by computer scientist Joy Buolamwini, who co-authored a ground-breaking study on the inaccuracy of facial recognition technology – has collected case studies of resistance to surveillance technology and proposals for building public oversight.
- Build transnational and cross-sector bridges to end military and police profiteering. Campaigns such as #NoTechForICE and #NoTechforApartheid make it clear that seemingly administrative choices, like immigration enforcement agencies buying cloud technologies, are tools to expand the historic criminalization of Black, immigrant, and displaced communities such as Palestinians. These campaigns build bridges and power to call tech companies to account. Students are organizing from their place of relative power as recruits of Big Tech companies to demand a halt to practices such as data-sharing between ICE and local police that undermines sanctuary city agreements.
- Create alternative public, not-for-profit, or worker-owned digital infrastructure towards the public good. Instead of today’s model of extracting data for profit, what if we democratized control of our data and the infrastructure for technology creation? Users and technologists worldwide could have access to develop alternatives to today’s dominant platforms. We could have the freedom to shape our own social media feeds. Demos and Data For Black Lives’ report on data capitalism examines several proposals to resist data extraction and reclaim data as a tool for social change.
- Raise federal and state labor standards for temp workers and support tech temp worker organizing. More than half of Google’s workforce around the world consists of temporary, vendor, or contracted workers, a proportion that has steadily climbed since the company was founded. Contractors and employees are subject to a two-tier system of unequal treatment, which takes many forms: substandard pay and benefits, vulnerability to discrimination and sexual harassment, and hyper-precarious employment that makes the risk of retaliation for speaking out acute. Worker-organizers at Temp Worker Justice and labor law scholars at the National Employment Law Project (NELP) joined forces to make the case for building cross-racial solidarity and collective power towards better conditions for all workers.
- Protect whistleblowers in their efforts to speak up on ethical issues within their workplace and beyond. Employees with computational expertise and insider knowledge are uniquely qualified to understand and expose violations of civil and human rights embedded in opaque artificial intelligence (AI) tools. Too often, the employees who expose these violations – frequently people of color or people with disabilities, who are best positioned to identify harms – are then subject to retaliation from their employer or from fellow employees. Former and current tech workers have advocated for necessary protections, such as regulating tech companies’ abuse of non-disclosure agreements (NDAs) through the Silenced No More Act.
Our Reading List
Want to dig deeper? Check out our list of recommended readings.
- Information Technology and Disability Justice in the United States, from Just Tech at the Social Science Research Council – This review discusses access to data and technology for people with disabilities, focusing on agency and digital transinstitutionalization—the extension of institutional frameworks, such as surveillance and control, from state hospitals into community settings via data-driven technologies.
- Astraea Lesbian Foundation for Justice’s Technologies for Liberation: Fund Abolitionist Futures – This report lifts up the power of movement building and the ways in which organizers are innovating to create safe technologies and systems for the people they serve, centering QT2SBIPOC communities.
- The Disinformation Defense League Policy Platform – The Disinformation Defense League’s strategic set of solutions to quell disinformation and build a media ecosystem that serves the public interest by promoting accurate news and information, protecting civil and human rights, and fostering an informed, equitable electorate across all languages.
- Data Capitalism and Algorithmic Racism – Demos and Data For Black Lives unpack how companies use data to preserve the power imbalance that keeps them rich. This economic model is rooted in chattel slavery and relies on the extraction and commodification of data. The proposed solutions start with data transparency as a first-level goal, while acknowledging the limits of transparency and the importance of redistributing power.
- Owning Ethics: Corporate Logics, Silicon Valley, and the Institutionalization of Ethics – Based on interviews with key informants described as “owning ethics” in their tech organizations, this study critiques the effectiveness of ethics, equity, and fairness goals and makes the argument for centering human and civil rights. There are several references to the lack of a regulatory response from government as a major force shaping the logics of ethics work within companies.
- Private Accountability in the Age of Artificial Intelligence – OBI Faculty Scholar Sonia Katyal’s reflection and legal analysis on the paths forward for industry reform, including the role of codes of conduct, impact statements, and whistleblower protections to promote accountability from within private corporations.
- The Refusal Conference: A Reading List – A reading list compiled by participants at the Refusal Conference hosted by UC Berkeley’s Algorithmic Fairness and Opacity Group (AFOG), covering the history of, and models for, refusing certain technologies or applications in pursuit of a more just and equitable society.
- Ours to Hack and to Own: The Rise of Platform Cooperativism, A New Vision for the Future of Work and a Fairer Internet – The activists who have put together Ours to Hack and to Own argue for a new kind of online economy: platform cooperativism, which combines the rich heritage of cooperatives with the promise of 21st-century technologies, free from monopoly, exploitation, and surveillance.