Reset: Reclaiming the Internet for Civil Society
The Unabridged Bibliography

Please see the House of Anansi Press website for more details about the book. Ron Deibert is Professor of Political Science, and Director of the Citizen Lab at the Munk School of Global Affairs & Public Policy, University of Toronto. Full author bio can be found on his website. For more information about the Citizen Lab, please see the Citizen Lab website and Twitter.

RESET was delivered as six lectures for the 2020 CBC Massey Lectures series. Listen to all six lectures here.


Right-wing, neo-fascist populism flourishes online and off, igniting hatred, murder, and even genocide: Venier, S. (2019). The role of Facebook in the persecution of the Rohingya minority in Myanmar: Issues of accountability under international law. Italian Yearbook of International Law Online, 28(1), 231–248; Vaidhyanathan, S. (2018). Antisocial media: How Facebook has disconnected citizens and undermined democracy. Oxford University Press.

Shady data analytics companies like Cambridge Analytica: Cadwalladr, C., & Graham-Harrison, E. (2018). The Cambridge Analytica files. Retrieved from; Isaak, J., & Hanna, M. J. (2018). User data privacy: Facebook, Cambridge Analytica, and privacy protection. Computer, 51(8), 56–59. doi:10.1109/MC.2018.3191268

Indeed, for much of the 2000s, technology enthusiasts applauded: See Diamond, L. (2010). Liberation technology. Journal of Democracy, 21(3), 69–83; Chowdhury, M. (2008, September). The role of the Internet in Burma’s Saffron Revolution. Berkman Center for Internet and Society, 2008-8. doi:10.2139/ssrn.1537703. Retrieved from; Francisco, R. A. (2005). The dictator’s dilemma. In Repression and mobilization (pp. 58–81). Minneapolis, MN: University of Minnesota Press; Ruijgrok, K. (2017). From the web to the streets: Internet and protests under authoritarian regimes. Democratization, 24(3), 498–520. doi:10.1080/13510347.2016.1223630; Ferdinand, P. (2000). The Internet, democracy and democratization. Democratization, 7(1), 1–17. doi:10.1080/13510340008403642; Zheng, Y. (2007). Technological empowerment: The Internet, state, and society in China. Stanford, CA: Stanford University Press; Castells, M. (2015). Networks of outrage and hope: Social movements in the Internet age. John Wiley & Sons.

Others are beginning to notice that we are spending an unhealthy amount of our lives: On “socializing” while remaining alone, the classic treatment is Turkle, S. (2017). Alone together: Why we expect more from technology and less from each other. Hachette UK.

I organize these problems as “painful truths”: Deibert, R. J. (2019). The road to digital unfreedom: Three painful truths about social media. Journal of Democracy, 30(1), 25–39.

Desktop computers were eventually networked together: Abbate, J. (2000). Inventing the internet. MIT Press; Hafner, K., & Lyon, M. (1998). Where wizards stay up late: The origins of the internet. Simon and Schuster; Leiner, B. M., Cerf, V. G., Clark, D. D., Kahn, R. E., Kleinrock, L., Lynch, D. C., … & Wolff, S. (2009). A brief history of the Internet. ACM SIGCOMM Computer Communication Review, 39(5), 22–31; Naughton, J. (2000). A brief history of the future: The origins of the internet. London: Phoenix. See also Zittrain, J. (2008). The future of the internet–and how to stop it. Yale University Press, which effectively predicted the transition from an Internet based on networked desktop clients to gatekeeping platforms. For an examination of how Cold War social and cultural contexts helped shape computer technology and information systems in the United States, see Edwards, P. N. (1997). The closed world: Computers and the politics of discourse in Cold War America. MIT Press; and Rid, T. (2016). Rise of the machines: A cybernetic history. WW Norton & Company.

Before long, the internet was in everything: Waltz, E. (2020, January 20). How do neural implants work? Retrieved from; Strickland, E. (2017). Silicon Valley’s latest craze: Brain tech. IEEE Spectrum, 54(7), 8–9; DeNardis, L. (2018). The internet in everything: Freedom and security in a world with no off switch. Yale University Press.

Security experts have routinely discovered: Jaret, P. (2018, November 12). Exposing vulnerabilities: How hackers could target your medical devices. Retrieved from

Engineers are experimenting on systems: Moore, S. K. (2019, May 14). Wireless network brings dust-sized brain implants a step closer. Retrieved from; Makin, J. G., Moses, D. A., & Chang, E. F. (2020). Machine translation of cortical activity to text with an encoder–decoder framework (pp. 1–8). Nature Publishing Group; Velasquez-Manoff, M. (2020, August 28). The brain implants that could change humanity. New York Times. Retrieved from

We are all now “cyborgs”: Haraway, D. (1991). A cyborg manifesto: Science, technology, and socialist feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature (pp. 149–181). Routledge.

Much of it is rendered invisible through familiarity and habituation: Edwards, P. N. (2017). The mechanics of invisibility: On habit and routine as elements of infrastructure. In I. Ruby & A. Ruby (Eds.), Infrastructure space (pp. 327–336). Ruby Press.

Sometimes gaping vulnerabilities: Anderson, R. (2001, December). Why information security is hard — An economic perspective. Seventeenth Annual Computer Security Applications Conference (pp. 358–365). IEEE; Anderson, R. (2020). Security engineering: A guide to building dependable distributed systems (3rd ed.). Hoboken, NJ: Wiley. Retrieved from

An “accidental megastructure”: Bratton, B. H. (2016). The stack: On software and sovereignty. MIT Press.

A bewildering array of new applications: Lindsay, J. R. (2017). Restrained by design: The political economy of cybersecurity. Digital Policy, Regulation and Governance, 19(6), 493–514.

Merriam-Webster defines social media: Merriam-Webster. (n.d.). Social media. In Merriam-Webster.com dictionary. Retrieved April 21, 2020, from

Designed secret “back doors”: On the legal implications of “remote, surreptitious brain surveillance,” see Kerr, I., Binnie, M., & Aoki, C. (2008). Tessling on my brain: The future of lie detection and brain privacy in the criminal justice system. Canadian Journal of Criminology and Criminal Justice, 50(3), 367–387.

Google’s security team says: Huntley, S. (2020, May 27). Updates about government-backed hacking and disinformation. Retrieved from

North Korea depends on the internet for illicitly acquired revenues: Sanger, D. (2020, February 9). North Korea’s internet use surges, thwarting sanctions and fueling theft. Retrieved from; See also Deibert, R., & Pauly, L. (2019). Cyber Westphalia and beyond: Extraterritoriality and mutual entanglement in cyberspace. In D. Bigo, E. F. Isin, & E. Ruppert (Eds.), Data politics: Worlds, subjects, rights. Routledge.

Offensive action . . . takes place just below the threshold of armed conflict: But not always. For exceptions and discussion, see Zetter, K. (2014). Countdown to zero day: Stuxnet and the launch of the world’s first digital weapon. Broadway Books; Greenberg, A. (2019). Sandworm: A new era of cyberwar and the hunt for the Kremlin’s most dangerous hackers. Doubleday. For a contrary view, see Rid, T. (2013). Cyber war will not take place. Oxford University Press USA; Brantly, A. F. (2017). The violence of hacking: State violence and cyberspace. The Cyber Defense Review, 2(1), 73–92. See also Maschmeyer, L. (2020). Slow burn: Subversion and escalation in cyber conflict and covert action (Unpublished doctoral dissertation). University of Toronto; Gartzke, E., & Lindsay, J. R. (2015). Weaving tangled webs: Offense, defense, and deception in cyberspace. Security Studies, 24(2), 316–348; Lee, M. M. (2020). Crippling Leviathan: How foreign subversion weakens the state. Cornell University Press.

Spreading false information is as old as humanity itself: See Posetti, J., & Matthews, A. (2018, July 23). A short guide to the history of “fake news” and disinformation. Retrieved from

Nonetheless products of history: See Deibert, R. (1999). Harold Innis and the empire of speed. Review of International Studies, 25(2), 273–289; Ruggie, J. G. (1993). Territoriality and beyond: Problematizing modernity in international relations. International Organization, 47(1), 139–174.

In his seminal book . . . Innis explained: Innis, H. A., & Innis, M. Q. (1972). Empire and communications. University of Toronto Press; See also Innis, H. A. (2008). The bias of communication. University of Toronto Press. For more on “media ecology,” and an extended elaboration of the idea of media as “environments,” see the seminal book by Meyrowitz, J. (1986). No sense of place: The impact of electronic media on social behavior. Oxford University Press. See also Poster, M. (1990). The mode of information: Poststructuralism and social context. University of Chicago Press.

The rise of individualism . . . and nationalism: Anderson, B. (1983). Imagined communities: Reflections on the origin and spread of nationalism. Verso; Deibert, R. (1997). Parchment, printing, and hypermedia: Communication and world order transformation. Columbia University Press; McLuhan, M. (1962). The Gutenberg galaxy. University of Toronto Press.

As . . . Naomi Klein has put it: Klein, N. (2019, August 21). Why the Democratic National Committee must change the rules and hold a climate debate. Retrieved from

Chapter One: The Market for Our Minds

Initial coverage of the PRISM program: Greenwald, G., & MacAskill, E. (2013, June 7). NSA Prism program taps in to user data of Apple, Google and others. Retrieved from; Gellman, B. (2020). Dark mirror: Edward Snowden and the American surveillance state. Penguin.

A top secret program called Dishfire: Poitras, L., Rosenbach, M., & Stark, H. (2013, September 15). NSA monitors financial world. Retrieved from

A program code-named HAPPYFOOT: Soltani, A., Peterson, A., & Gellman, B. (2013, December 11). NSA uses Google cookies to pinpoint targets for hacking. Retrieved from

The NSA collected hundreds of thousands of contacts: Gellman, B., & Soltani, A. (2013, October 14). NSA collects millions of e-mail address books globally. Retrieved from

An NSA program to exploit unpatched vulnerabilities: Rosenbach, M., & Stark, H. (2013, September 9). How the NSA accesses smartphone data. Retrieved from

“Can the dot-com party go on forever? Not likely”: Yang, C. (2000, April 3). Earth to dot com accountants. Retrieved from

The stock value of the . . . top 280 internet companies: Kleinbard, D. (2000, November 9). Dot.coms lose $1.755 trillion in market value. Retrieved from; Howcroft, D. (2001). After the goldrush: Deconstructing the myths of the market. Journal of Information Technology, 16(4), 195–204.

Principals at Google adjusted their strategy: Levy, S. (2011). In the plex: How Google thinks, works, and shapes our lives. Simon and Schuster.

“This new form of information capitalism”: Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89, at 75; See also West, S. M. (2019). Data capitalism: Redefining the logics of surveillance and privacy. Business & Society, 58(1), 20–41; Flyverbom, M., Deibert, R., & Matten, D. (2019). The governance of digital technology, big data, and the internet: New roles and responsibilities for business. Business & Society, 58(1), 3–19; Sadowski, J. (2019). When data is capital: Datafication, accumulation, and extraction. Big Data & Society, 6(1), 2053951718820549. For a trenchant critique of the limitations of Zuboff’s functionalist formulation of surveillance capitalism, see especially Morozov, E. (2019). Capitalism’s new clothes. The Baffler, 4. Retrieved from. Engaging Morozov’s important critique fully would require a lot more effort (and space) than this manuscript provides, but suffice it to say here that it seems to me valid to suggest that Zuboff puts more emphasis on the “surveillance” and less on the “capitalism” part of “surveillance capitalism,” thus obscuring some of the profit imperatives and motives driving Big Tech platforms’ business decisions. While Zuboff has done a fabulous job highlighting the intense and seemingly incessant desire among Big Tech to extract endless pools of data from consumers, like Morozov I believe such desire can ultimately be explained by good old-fashioned power and profit motives that define capitalism throughout the ages. See also Luke, T. W. (1989). Screens of power: Ideology, domination, and resistance in informational society. University of Illinois Press.

The early modern bureaucratic state and its mass record keeping: Koopman, C. (2019). How we became our data: A genealogy of the informational person. University of Chicago Press; Bouk, D. (2015). How our days became numbered: Risk and the rise of the statistical individual. University of Chicago Press; Crosby, A. (1996). Taking measure of reality: Quantification and Western society, 1250–1600. Cambridge University Press; Porter, T. (1995). Trust in numbers: The pursuit of objectivity in science and public life. Princeton University Press; Beniger, J. (1989). The control revolution: Technological and economic origins of the information society. Harvard University Press; Lyon, D. (1994). The electronic eye: The rise of surveillance society. University of Minnesota Press; Standage, T. (1998). The Victorian internet: The remarkable story of the telegraph and the nineteenth century’s online pioneers. Phoenix; Gandy Jr., O. H. (1993). The panoptic sort: A political economy of personal information. Critical Studies in Communication and in the Cultural Industries. Westview Press.

You need different sensors and inputs . . . And to the consumer, it needs to feel like one: Matyszczyk, C. (2019, July 23). A Google exec admits the ugly truth about the smart home. Retrieved from. See also Zheng, S., Apthorpe, N., Chetty, M., & Feamster, N. (2018). User perceptions of smart home IoT privacy. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 1–20.

Our ignorance is their bliss: Zuboff, S. (2019, January 24). ‘Surveillance capitalism’ has gone rogue. We must curb its excesses. Retrieved from

Why not equip them with cameras to map the houses: Porter, J. (2019, June 21). Amazon patents ‘surveillance as a service’ tech for its delivery drones. Retrieved from

Why not vacuum up internet data: Kiss, J. (2010, May 15). Google admits collecting Wi-Fi data through Street View cars. Retrieved from

Patent applications that Facebook has made: Chinoy, S. (2018, June 21). What 7 Creepy Patents Reveal About Facebook. Retrieved from

What the academic Jan Padios calls “emotional extraction”: Padios, J. M. (2017). Mining the mind: Emotional extraction, productivity, and predictability in the twenty-first century. Cultural Studies, 31(2–3), 205–231. See also Patulny, R., Lazarevic, N., & Smith, V. (2020). ‘Once more, with feeling,’ said the robot: AI, the end of work and the rise of emotional economies. Emotions and Society, 2(1), 79–97; and Pasquale, F. (2020). More than a feeling. Real Life. Retrieved from

“This information does not pass through a cognitive filter as it is created and stored”: Greene, A. K. (2019, August 12). Data sweat. Retrieved from

They say “the app can even predict how a person will feel next week”: Sheridan, K. (2018, October 4). A startup’s bold plan for a mood-predicting smartphone app is shadowed by questions over evidence. Retrieved from

Facebook has data-sharing partnerships with at least sixty device makers: Dance, G. J. X., Confessore, N., & LaForgia, M. (2018, June 3). Facebook gave device makers deep access to data on users and friends. Retrieved from

Privacy International (PI) published a report: Privacy International. (2018, December). How apps on Android share data with Facebook (even if you don’t have a Facebook account). Retrieved from

PI examined an app called Maya: Privacy International. (2019, September 9). No body’s business but mine: How menstruation apps are sharing your data. Retrieved from

Browsers can tell a lot about a user: Price, D. (2018, October 12). 10 types of data your browser is collecting about you right now. Retrieved from. See also Upathilake, R., Li, Y., & Matrawy, A. (2015, July). A classification of web browser fingerprinting techniques. In 2015 7th International Conference on New Technologies, Mobility and Security (NTMS) (pp. 1–5). IEEE; Fifield, D., & Egelman, S. (2015, January). Fingerprinting web users through font metrics. In International Conference on Financial Cryptography and Data Security (pp. 107–124). Springer, Berlin, Heidelberg.

Researcher Sam Jadali found that Nacho Analytics: Fowler, G.A. (2019, July 18). I found your data. It’s for sale. Retrieved from

Jadali . . . could nonetheless access usernames, passwords: Jadali, S. (2019, July 18). DataSpii: The catastrophic data leak via browser extensions. Retrieved from. See also Beggs, A., & Kapravelos, A. (2019, June). Wild extensions: Discovering and analyzing unlisted Chrome extensions. In International Conference on Detection of Intrusions and Malware, and Vulnerability Assessment (pp. 3–22). Springer, Cham.

As noted in Air Canada’s privacy policy: Air Canada. (2019, October 8). Privacy policy. Retrieved January 2019 from

Tala’s CEO said that “repayment of a loan . . .”: Aglionby, B. (2016, July 5). US fintech pioneer’s start-up in Kenya. Retrieved from

What . . . Keith Breckenridge calls “reputational collateral”: Breckenridge, K. (2018). The failure of the ‘single source of truth about Kenyans’: The NDRS, collateral mysteries and the Safaricom monopoly. African Studies, 78(1), 91–111; Johnson, K., Pasquale, F., & Chapman, J. (2019). Artificial intelligence, machine learning, and bias in finance: Toward responsible innovation. Fordham Law Review, 88(2), 499; Demirguc-Kunt, A., Klapper, L., Singer, D., Ansar, S., & Hess, J. (2018). The Global Findex Database 2017: Measuring financial inclusion and the fintech revolution. The World Bank.

Zynga . . . gives its games permission to access . . . : Levine, S. (2019, May 10). The 5 worst apps for your privacy. Retrieved from

The illuminating example of Pokémon Go: Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs. See also Braghin, C., & Del Vecchio, M. (2017, July). Is Pokémon GO watching you? A survey on the privacy-awareness of location-based apps’ users. In 2017 IEEE 41st Annual Computer Software and Applications Conference (COMPSAC) (Vol. 2, pp. 164–169). IEEE; De Souza e Silva, A. (2017). Pokémon Go as an HRG: Mobility, sociability, and surveillance in hybrid spaces. Mobile Media & Communication, 5(1), 20–23.

Google Maps functions . . . in this manner: Investopedia. (2019, November 14). How does Google Maps make money? Retrieved from

The higher-level function is to use us as free labour: Crawford, K., & Joler, V. (2018, September 7). Anatomy of an AI system: The Amazon Echo as an anatomical map of human labor, data and planetary resources. Retrieved from

In 2014, Pew Internet undertook an analysis: Atkinson, M. (2015, November 10). Apps permissions in the Google Play store. Retrieved from. See also Andrews, L. (2018). A new privacy paradigm in the age of apps. Wake Forest Law Review, 53, 421; Gentithes, M. (2020). App permissions and the third-party doctrine. Washburn Law Journal, 59, 35; Felt, A. P., Ha, E., Egelman, S., Haney, A., Chin, E., & Wagner, D. (2012, July). Android permissions: User attention, comprehension, and behavior. In Proceedings of the Eighth Symposium on Usable Privacy and Security (pp. 1–14).

Android apps harvested location data: Ng, A. (2019, July 8). More than 1,000 Android apps harvest data even after you deny permissions. Retrieved from

CVS . . . sent a user’s GPS coordinates to over forty different third parties: Egelman, S. (2017, August 25). CVS discretely shares your location with 40+ other sites. Retrieved from

Apps targeting children: Egelman, S. (2017, July 27). We tested apps for children. Half failed to protect their data. Retrieved from. The same issues apply to “smart toys” aimed at children. See Yankson, B., Iqbal, F., & Hung, P. C. (2017). Privacy preservation framework for smart connected toys. In Computing in Smart Toys (pp. 149–164). Springer, Cham.

Facebook used contact information . . . for targeted advertising: Gebhart, G. (2018, September 27). You gave Facebook your number for security. They used it for ads. Retrieved from. See also Venkatadri, G., Lucherini, E., Sapiezynski, P., & Mislove, A. (2019). Investigating sources of PII used in Facebook’s targeted advertising. Proceedings on Privacy Enhancing Technologies, 2019(1), 227–244.

The People You May Know functionality: Hill, K. (2018, August 8). “People You May Know:” A controversial Facebook feature’s 10-year history. Retrieved from

As I described in a Globe and Mail editorial: Deibert, R. (2015, May 21). When it comes to cyberspace, should national security trump user security? Retrieved from

Parents unwittingly splash their children’s biometric data: See Eichhorn, K. (2019). The end of forgetting: Growing up with social media. Harvard University Press; Palfrey, J. G., & Gasser, U. (2011). Born digital: Understanding the first generation of digital natives. ReadHowYouWant.com; Gasser, U., Maclay, C. M., & Palfrey, J. G. (2010). Working towards a deeper understanding of digital safety for children and young people in developing nations. Berkman Center Research Publication, (2010-7), 10–36.

23andMe and Airbnb have partnered: Valle, G. D. (2019, May 22). Airbnb is partnering with 23andMe to send people on “heritage” vacations. Retrieved from

GlaxoSmithKline acquired: Brodwin, E. (2018, July 25). DNA-testing company 23andMe has signed a $300 million deal with a drug giant. Here’s how to delete your data if that freaks you out. Retrieved from

Those who share their genetic fingerprints: Resnick, B. (2018, October 15). How your third cousin’s Ancestry DNA test could jeopardize your privacy. Retrieved from; There are also unique risks around the use of DNA data for African Americans because of the vulnerabilities they face around racialized surveillance. See Nelson, A. (2018). The social life of DNA: Racial reconciliation and institutional morality after the genome. British Journal of Sociology, 69(3), 522–537; Khandaker, T. (2018, July 26). Canada is using Ancestry DNA websites to help it deport people. Retrieved from; Molnar, P., & Gill, L. (2018, September). Bots at the gate: A human rights analysis of automated decision-making in Canada’s immigration and refugee system. Citizen Lab Research Report No. 114. Retrieved from

A security researcher discovered that . . . Zoom: Doffman, Z. (2019, July 9). Confirmed: Zoom security flaw exposes webcam hijack risk, change settings now. Retrieved from

Hundreds of millions of its users’ phone numbers: Whittaker, Z. (2019, September 4). A huge database of Facebook users’ phone numbers found online. Retrieved from

Millions of its users’ passwords were stored: Krebs, B. (2019, March 21). Facebook stored hundreds of millions of user passwords in plain text for years. Retrieved from

More than twenty million Ecuadoreans: Meredith, S. (2019, September 17). Almost everyone in Ecuador has had their personal information leaked online. Retrieved from

3,800 publicly disclosed breaches had exposed an astounding 4.1 billion individual records: Winder, D. (2019, August 20). Data breaches expose 4.1 billion records in first six months of 2019. Retrieved from

“Our privacy crisis is a crisis of design”: Warzel, C. (2019, July 9). Your inbox is spying on you. Retrieved from

Facebook did not provide a simply explained “opt-out” for users for its facial recognition scanning technology: Germain, T. (2019, September 3). Facebook updates facial recognition settings after CR investigation. Retrieved from

Shazam . . . was drawing audio from its surroundings: Leyden, J. (2016, November 15). Shhh! Shazam is always listening — even when it’s been switched ‘off.’ Retrieved from

Human contractors . . . listen in on audio recordings to transcribe what’s being said: Oremus, W. (2019, July 27). Amazon is watching. Retrieved from

Google’s audio recordings were activated without the trigger words being uttered: Hee, L. V., Baert, D., Verheyden, T., & Heuvel, R. V. D. (2019, July 10). Google employees are eavesdropping, even in your living room, VRT NWS has discovered. Retrieved from

A group of California citizens launched a class-action lawsuit: Kumandan et al. v. Google LLC et al., 5:19-cv-04286 (N.D. Cal. 2019).

What . . . Cory Doctorow has called “peak indifference”: Doctorow, C. (2016, July 3). Peak indifference. Retrieved from

A recent survey of American attitudes: Ladd, J. M., Tucker, J. A., & Kates, S. (2018, October 24). 2018 American Institutional Confidence Poll: The health of American democracy in an era of hyper polarization. Retrieved from

Devices have allowed tech platforms to appropriate our personal information: Cohen, J. E. (2019). Between truth and power: The legal constructions of informational capitalism. Oxford University Press USA.

What media scholar Tim Wu calls the “attention merchants”: Wu, T. (2016). The attention merchants: The epic scramble to get inside our heads. Knopf.

Chapter Two: Toxic Addiction Machines

Commonplace around any major news event: For a discussion of the meanings of disinformation, misinformation, propaganda, etc., see Jack, C. (2017). Lexicon of lies: Terms for problematic information. Data & Society Research Institute. Retrieved from; Derakhshan, H., & Wardle, C. (2017). Information disorder: Definitions. In Understanding and addressing the disinformation ecosystem (pp. 5–12).

The WHO went so far as to label COVID-19 an “infodemic”: World Health Organization. (2020, February 2). Novel coronavirus (2019-nCoV): Situation report 13. Retrieved from. See also Chen, E., Lerman, K., & Ferrara, E. (2020). COVID-19: The first public coronavirus Twitter dataset. arXiv preprint arXiv:2003.07372; Abrahams, A., & Aljizawi, N. Middle East Twitter bots and the COVID-19 infodemic. Retrieved from; Nightingale, S., Faddoul, M., & Farid, H. (2020). Quantifying the reach and belief in COVID-19 misinformation. arXiv preprint arXiv:2006.08830.

Russian propaganda outlets spread disinformation: Breland, A. (2020, February 3). Russian media outlets are blaming the coronavirus on the United States. Retrieved from

Widely accepted throughout Chinese society: Gilbert, D. (2020, April 6). The Chinese government has convinced its citizens that the U.S. Army brought coronavirus to Wuhan. Retrieved from

One YouTube video . . . falsely claimed that Africa was immune: Shanapinda, S. (2020, April 7). No, 5G radiation doesn’t spread the coronavirus. Here’s how we know. Retrieved from

More than thirty incidents of arson and vandalism: Satariano, A., & Alba, D. (2020, April 10). Burning cell towers, out of baseless fear they spread the virus. Retrieved from

Mob violence and armed clashes with police in Ukraine: Miller, C. (2020, February 20). A viral email about coronavirus had people smashing buses and blocking hospitals. Retrieved from

In Canada, racist tropes: Do, E. M., & Quon, A. (2020, February 2). As coronavirus dominates headlines, xenophobic and insensitive social media posts go viral. Retrieved from

IBM X-Force . . . warned that more was to come: Zorz, Z. (2020, February 3). Wuhan coronavirus exploited to deliver malware, phishing, hoaxes. Retrieved from

Ransomware, digital espionage attacks, and phishing schemes: Satter, R., Stubbs, J., & Bing, C. (2020, March 23). Exclusive: Elite hackers target WHO as coronavirus cyberattacks spike. Retrieved from

Mistakenly removed links: Orr, C. (2020, March 17). Facebook is removing links to coronavirus information on government websites. Retrieved from

Platforms introduced measures to point users: Gadde, V., & Derella, M. (2020, March 16). An update on our continuity strategy during COVID-19. Retrieved from

Arrested for messages sent to an online chat group: Shih, G., & Knowles, H. (2020, February 4). A Chinese doctor was one of the first to warn about coronavirus. He got detained — and infected. Retrieved from

We discovered that YY began to censor keywords: Ruan, L., Knockel, J., & Crete-Nishihata, M. (2020, March). Censored contagion: How information on the coronavirus is managed on Chinese social media. Citizen Lab Research Report No. 125, University of Toronto. Retrieved from

Health workers told reporters they felt overwhelmed: Fisher, M., & Taub, A. (2019, April 11). How YouTube radicalized Brazil. Retrieved from

An opportunity for climate change denialists to propagate disinformation: Ryan, H., & Wilson, C. (2020, January 22). As Australia burned, climate change denialism got a boost on Facebook. Retrieved from

Conspiracy theories circulated across social media: Knaus, C. (2020, January 11). Disinformation and lies are spreading faster than Australia’s bushfires. Retrieved from

At least one prominent politician bought into it: Capstick, S., Dyke, J., Lewandowsky, S., Pancost, R., & Steinberger, J. (2020, January 14). Disinformation on Australian bushfires should not be spread by ministers. Retrieved from

“What Australians . . . need is information and support”: Ryan & Wilson. As Australia burned.

Oxford Bibliographies defines the “public sphere”: Wessler, H., & Freudenthaler, R. (2018). Public sphere. Retrieved from

European coffee houses and salons: Habermas, J. (1991). The structural transformation of the public sphere: An inquiry into a category of bourgeois society. MIT Press; For critiques of Habermas’s notion of the public sphere, see Fraser, N. (1990). Rethinking the public sphere: A contribution to the critique of actually existing democracy. Social Text, (25–26), 56–80; and Squires, C. R. (2002). Rethinking the black public sphere: An alternative vocabulary for multiple public spheres. Communication Theory, 12(4), 446–468. See also Webster, F. (2014). Theories of the information society. Routledge, chapter nine.

“The shares of America’s five biggest technology firms have been on an astonishing bull run”: Economist. (2020, February 21). So much for the techlash? Retrieved from

Facebook’s stock jumped close to 2 percent: Jee, C. (2019, July 15). Facebook is actually worth more thanks to news of the FTC’s $5 billion fine. Retrieved from

Facebook’s $5.7 billion investment in India’s Jio: Pham, S. (2020, May 7). India’s Jio Platforms lands $1.5 billion from Vista Equity, marking 3 big investments in 3 weeks. Retrieved from

“Digital colonialism,” as Global Voices’ Ellery Biddle calls it: Solon, O. (2017, July 27). “It’s digital colonialism”: How Facebook’s free internet service has failed its users. Retrieved from. See also Sen, R., Ahmad, S., Phokeer, A., Farooq, Z. A., Qazi, I. A., Choffnes, D., & Gummadi, K. P. (2017). Inside the walled garden: Deconstructing Facebook’s Free Basics program. ACM SIGCOMM Computer Communication Review, 47(5), 12–24.

“Infrastructural imperialism,” in which companies like Google increasingly structure our choices: Vaidhyanathan, S. (2011). The Googlization of everything: (And why we should worry). University of California Press.

An experiment to prove the point: Ben-Shahar, O., & Schneider, C. E. (2011). The failure of mandated disclosure. University of Pennsylvania Law Review, 159(3), 647–749. Retrieved from

“The negative impact the electronic contracting environment has on our habits and dispositions”: Frischmann, B. M., & Selinger, E. (2016). Engineering humans with contracts. Cardozo Legal Studies Research Paper No. 493.

Enables them to appropriate users’ data: Cohen. Between truth and power.

Creating a kind of “consent fatigue”: Utz, C., Degeling, M., Fahl, S., Schaub, F., & Holz, T. (2019). (Un)informed consent: Studying GDPR consent notices in the field. Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security. Retrieved from; See also Choi, H., Park, J., & Jung, Y. (2018). The role of privacy fatigue in online privacy behavior. Computers in Human Behavior, 81, 42-51; Schermer, B. W., Custers, B., & van der Hof, S. (2014). The crisis of consent: How stronger legal protection may lead to weaker consent in data protection. Ethics and Information Technology, 16(2), 171-182; Custers, B. (2016). Click here to consent forever: Expiry dates for informed consent. Big Data & Society, 3(1), 2053951715624935.

A report published by the International Center for Media & the Public Agenda: University of Maryland. (2010). New study by Merrill prof finds students everywhere addicted to media. Retrieved from

Social media affect brains like falling in love: Penenberg, A. L. (2010, July 1). Social networking affects brains like falling in love. Retrieved from

Your level of oxytocin: Seiter, C. (2016, August 10). The psychology of social media: Why we like, comment, and share online. Retrieved from

People addicted to social media: Griffiths, M. D. (2013). Social networking addiction: Emerging themes and issues. Journal of Addiction Research & Therapy, 4(5); Blackwell, D., Leaman, C., Tramposch, R., Osborne, C., & Liss, M. (2017). Extraversion, neuroticism, attachment style and fear of missing out as predictors of social media use and addiction. Personality and Individual Differences, 116, 69–72; Van Den Eijnden, R. J., Lemmens, J. S., & Valkenburg, P. M. (2016). The social media disorder scale. Computers in Human Behavior, 61, 478–487; Hawi, N. S., & Samaha, M. (2017). The relations among social media addiction, self-esteem, and life satisfaction in university students. Social Science Computer Review, 35(5), 576–586. For alternative views, see Anya Kamenetz, “The Scientific Debate Over Teens, Screens And Mental Health.” NPR. (August 27, 2019). Retrieved from; and Erin McAweeney and Mary Madden, “Beyond Tech Addiction.” Data and Society. (September 30, 2019). Retrieved from

The mere presence of a switched-off smartphone: Berthon, P., Pitt, L., & Campbell, C. (2019). Addictive de-vices: A public policy analysis of sources and solutions to digital addiction. Journal of Public Policy & Marketing, 38(4), 451–468.

Why many of us habitually pull out our phones: Berthon et al. Addictive de-vices.

Through “loot boxes” containing unknown rewards: Wiltshire, A. (2017, September 28). Behind the addictive psychology and seductive art of loot boxes. Retrieved from; See also King, D. L., & Delfabbro, P. H. (2019). Video game monetization (e.g., ‘loot boxes’): A blueprint for practical social responsibility measures. International Journal of Mental Health and Addiction, 17(1), 166-179; Drummond, A., & Sauer, J. D. (2018). Video game loot boxes are psychologically akin to gambling. Nature Human Behaviour, 2(8), 530-532.

Effects on the release of dopamine: Dopamine. (n.d.). In Wikipedia. Retrieved May 6, 2020, from

Techniques and tools to draw you back in: Herrman, J. (2018, February 27). How tiny red dots took over your life. Retrieved from

Typical behaviour extensively studied by neuroscience: Kuss, D. J., & Griffiths, M. D. (2012). Internet and gaming addiction: A systematic literature review of neuroimaging studies. Brain sciences, 2(3), 347–374.

Designed to give users “a little dopamine hit”: Solon, O. (2017, November 9). Ex-Facebook president Sean Parker: Site made to exploit human “vulnerability.” Retrieved from

“We now know how to design cue, activity, and reward systems to more effectively leverage our brain chemistry”: Davidow, B. (2013, June 10). Skinner marketing: We’re the rats, and Facebook likes are the reward. Retrieved from; Davidow, W. (2012). Overconnected: The promise and threat of the internet. Delphinium Books; Lewis, C. (2014). Irresistible Apps: Motivational Design Patterns for Apps, Games, and Web-based Communities. Apress.

One case . . . is Snapchat: Berthon et al. Addictive de-vices.

The app’s promotion of “streaks”: Sattelberg, W. (2020, March 14). Longest Snapchat streak. Retrieved from

Referred to in social psychology as a Zeigarnik effect: Berthon et al. Addictive de-vices; Montag, C., Lachmann, B., Herrlich, M., & Zweig, K. (2019). Addictive features of social media/messenger platforms and freemium games against the background of psychological and economic theories. International Journal of Environmental Research and Public Health, 16(14), 2612; DeJong, S. M. (2014). Problematic internet use: A case of social media addiction. Adolescent Psychiatry, 4(2), 112-115; See Kringelbach, M. L., & Berridge, K. C. (2012). The joyful mind. Scientific American, 307(2), 40-45; On marketing techniques that exploit the brain’s reward system, including a discussion of dopamine, see Roese, N. J., Melo, H., Vrantsidis, T., & Cunningham, W. A. (2017). Reward system. In M. Cerf and M. Garcia-Garcia (Eds.), Consumer Neuroscience (pp. 207-221). Boston, MA: The MIT Press; Langvardt, K. (2019). Regulating habit-forming technology. Fordham Law Review, 88, 129.

Other products where addiction is a factor: Hanson, J., & Kysar, D. (1999). Taking behavioralism seriously: Some evidence of market manipulation. Harvard Law Review, 112(7), 1420–1572; Buettner, R. (2017). Predicting user behavior in electronic markets based on personality-mining in large online social networks. Electronic Markets, 27(3), 247–265; Matz, S. C., Kosinski, M., Nave, G., & Stillwell, D. J. (2017). Psychological targeting as an effective approach to digital mass persuasion. Proceedings of the National Academy of Sciences, 114(48), 12714–12719. Retrieved from

“The findings of social psychology and behavioural economics are being employed to determine the news we read”: Shaw, T. (2017, April 20). Invisible manipulators of your mind. Retrieved from; See also Hirsh, J. B., Kang, S. K., & Bodenhausen, G. V. (2012). Personalized persuasion: Tailoring persuasive appeals to recipients’ personality traits. Psychological Science, 23(6), 578-581

Psychological experiments on consumers: Matz et al. Psychological targeting as an effective approach to digital mass persuasion; Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us hooked. Penguin; Kosinski, M., Stillwell, D. & Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences, 110, 5802-5805; Park, G. et al. (2015). Automatic personality assessment through social media language. Journal of Personality and Social Psychology, 108, 934-952; Youyou, W., Kosinski, M., & Stillwell, D. (2015). Computer-based personality judgments are more accurate than those made by humans. Proceedings of the National Academy of Sciences, 112, 1036-1040.

Large professional conferences (like the Traffic & Conversion Summit): Traffic & Conversion Summit. (n.d.). Traffic & Conversion Summit 2020. Retrieved June 16, 2020, from

“Marketers use information about the customer to actively design more addictive offerings”: Berthon et al. Addictive de-vices.

Game developers . . . employ psychological techniques to make their products as “unquittable” as possible: Jabr, F. (2019, October 22). Can you really be addicted to video games? Retrieved from

Facebook’s admission that it had successfully modified over seven hundred thousand users’ emotions: Arthur, C. (2014, June 30). Facebook emotion study breached ethical guidelines, researchers say. Retrieved from; Flick, C. (2016). Informed consent and the Facebook emotional manipulation study. Research Ethics, 12(1), 14–28; Williams, M.L., Burnap, P., Sloan, L., Jessop, C. & Lepps, H. (2017), “Users’ Views of Ethics in Social Media Research: Informed Consent, Anonymity, and Harm”, Woodfield, K. (Ed.) The Ethics of Online Research (Advances in Research Ethics and Integrity, Vol. 2), pp. 27-52.

An experiment Facebook undertook in 2010: Zhukova, A. (2017, April 27). Facebook’s fascinating (and disturbing) history of secret experiments. Retrieved from

Facebook researchers then checked public voting records: Zhukova. Facebook’s fascinating history.

Facebook announced a breakthrough in its research into machine learning algorithms: BBC News. (2019, July 31). Facebook funds AI mind-reading experiment. Retrieved from

Neuralink . . . is reportedly developing “a high bandwidth brain-machine interface”: Wong, J. C. (2019, July 17). Elon Musk unveils plan to build mind-reading implants: ‘The monkey is out of the bag’. Retrieved from

Ryan Calo has ominously warned where these experiments might lead: Calo, R. (2014). Digital market manipulation. George Washington Law Review, 82, 995; See also Calo, R., & Rosenblat, A. (2017). The taking economy: Uber, information, and power. Columbia Law Review, 117, 1623.

Social media “instill trust by getting 2.2 billion users to forget about the platform”: Selinger, E. (2018, June 4). Facebook fabricates trust through fake intimacy. Retrieved from; Frischmann, B., & Selinger, E. (2018). Re-engineering humanity. Cambridge University Press.

It leverages the trust around information sharing among friends: Waldman, A. E. (2016). Privacy, sharing, and trust: The Facebook study. Case Western Reserve Law Review, 67(1). Retrieved from

Adolescent girls tend to engage in forms of indirect aggression: Debevec, T. M. (2011). A psychoanalytic inquiry into social aggression as a form of bullying among female students. Electronic Theses and Dissertations, 560. Retrieved from

Used repeatedly to shame and belittle, leading to increased depression and other mental health risks: Howard, J. (2019, January 4). Link between social media and depression stronger in teen girls than boys, study says. Retrieved from

Higher prevalence of internet addiction among adolescent males: Fumero, A., Marrero, R. J., Voltes, D., & Peñate, W. (2018). Personal and social factors involved in internet addiction among adolescents: A meta-analysis.

The World Health Organization and the American Psychiatric Association added “internet gaming disorder”: Jabr. Can you really be addicted to video games?

Higher levels of screen time . . . may be linked with increased symptoms of depression: Boers, E., Afzali, M. H., Newton, N., & Conrod, P. (2019). Association of screen time and depression in adolescence. JAMA Pediatrics, 173(9), 853–859; However, see also Coyne, S. M., Rogers, A. A., Zurcher, J. D., Stockdale, L., & Booth, M. (2020). Does time spent using social media impact mental health?: An eight year longitudinal study. Computers in Human Behavior, 104, 106160.

“Psychological, physical, societal, and economic harms”: Berthon et al. Addictive de-vices.

Dating back at least to the time of the printing press: Levine, N. (2017). The nature of the glut: Information overload in postwar America. History of the Human Sciences, 30(1), 32–49; Schick, A. G., Gordon, L. A., & Haka, S. (1990). Information overload: A temporal approach. Accounting, Organizations and Society, 15(3), 199–220; Toffler, A. (1984). Future shock. Bantam.

James Grier Miller . . . proposed dealing with information overload: Heterick, R. C. J. (1998). Educom: A Retrospective. Educom Review, 33(5), 42–47. Retrieved from

“Every second, on average, around 6,000 tweets are tweeted”: Twitter Usage Statistics. (n.d.). Retrieved June 16, 2020, from

On average, 1.47 billion people log onto Facebook daily: Noyes, D. (May 2020). The top 20 valuable Facebook statistics. Retrieved from

Facebook videos are viewed eight billion times per day: Constine, J. (2015, November 4). Facebook hits 8 billion daily video views, doubling from 4 billion in April. Retrieved from

Every minute, more than 3.87 million Google searches are conducted: Domo. (June 18, 2018). Data never sleeps 6.0: How much data is generated every minute? Retrieved from

“Human cognitive architecture”: Lin, H. (2019). The existential threat from cyber-enabled information warfare. Bulletin of the Atomic Scientists, 75, 187–196; See also Matthews, J. (2019, April). A cognitive scientist explains why humans are so susceptible to fake news and misinformation. Retrieved from

Thanks to the pioneering work of Nobel Prize– winning psychologist Daniel Kahneman: Kahneman, D. (2011). Thinking, fast and slow. Macmillan; Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131; Persky, J. (1995). The ethology of homo economicus. Journal of Economic Perspectives, 9(2), 221-231.

Combined controlled laboratory experiments with systematic analysis of hundreds of online ads: Akpinar, E., & Berger, J. (2017). Valuable virality. Journal of Marketing Research, 54(2), 318–330; Bakir, V., & McStay, A. (2017). Fake news and the economy of emotions: Problems, causes, solutions. Digital Journalism, 6(2), 154–175; Einstein, M. (2016). Black ops advertising: Native ads, content marketing and the covert world of the digital sell. OR Books; Matz et al. Psychological targeting; Berman, M. L. (2014). Manipulative marketing and the First Amendment. Georgetown Law Journal, 103, 497; Yeung, K. (2017). ‘Hypernudge’: Big Data as a mode of regulation by design. Information, Communication & Society, 20(1), 118-136.

Social media’s flood of content also amplifies other cognitive biases: Beasley, B. (2019, December 26). How disinformation hacks your brain. Retrieved from

“The availability heuristic” and “the illusory truth effect”: Kuran, T. (2007). Availability cascades and risk regulation. University of Chicago Public Law & Legal Theory Working Paper No. 181, 683–768. Retrieved from; Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865.

Many people still receive the vast majority of their information from traditional news: Nyhan, B. (2016, September 7). Relatively few Americans live in partisan media bubble, but they’re influential. Retrieved from

News organizations increasingly analyze the popularity of their stories over social media: Ferrucci, P. (2018). Networked: Social media’s impact on news production in digital newsrooms. Newspaper Research Journal, 39(1), 6–17; Newman, N., Dutton, W., & Blank, G. (2013). Social media in the changing ecology of news: The fourth and fifth estates in Britain. International Journal of Internet Science, 7(1).

“Tweets were deemed equally newsworthy as headlines”: McGregor, S. C., & Molyneux, L. (2020). Twitter’s influence on news judgment: An experiment among journalists. Journalism, 21(5), 597–613; See also Hermida, A. (2012). Tweets and truth: Journalism as a discipline of collaborative verification. Journalism Practice, 6(5-6), 659-668.

What one group of researchers calls “online firestorms”: Pfeffer, J., Zorbach, T., & Carley, K. (2013). Understanding online firestorms: Negative word-of-mouth dynamics in social media networks. Journal of Marketing Communications, 20(1–2), 117–128.

Women, minorities, and people of colour may be particularly prone to self-censorship: Amnesty International. (2018). “Toxic Twitter — The silencing effect.” Retrieved from; Doxing. (2020, May 8). Wikipedia. Retrieved from

Which increased in the nineteenth century with advances in telecommunications: Scheuerman, W. (2001). Liberal democracy and the empire of speed. Polity, 34(1), 41–67.

“Without abiding attachments associations are too shifting”: Dewey, J. (1927). The public and its problems. Ohio University Press.

An explosion of social media–enabled PR: Nadler, A., Crain, M., & Donovan, J. (2018). Weaponizing the digital influence machine: The political perils of online ad tech. Data & Society Research Institute. Retrieved from

John Hill, founder of . . . Hill & Knowlton: Brandt, A. M. (2012). Inventing conflicts of interest: A history of tobacco industry tactics. American Journal of Public Health, 102(1), 63–71; Critical Frequency (Producer). Drilled [Audio podcast]. (February 27, 2020). Season 3, episode 7. Retrieved from; Bernays, E. L. (Ed.). (1955). The engineering of consent. University of Oklahoma Press.

“Much of the classic, foundational research . . . was funded during the cold war”: Shaw, T. (2018, March 21). The new military-industrial complex of big data psy-ops. Retrieved from; Krishnan, A. (2014, November). From psyops to neurowar: What are the dangers? In ISAC-ISSS 2014 Annual Conference; Giordano, J., & Wurzman, R. (2011). Neurotechnologies as weapons in national intelligence and defense–An overview. Synesis: A Journal of Science, Technology, Ethics, and Policy, 2(1), T55-T71; DeFranco, J., DiEuliis, D., & Giordano, J. (2019). Redefining neuroweapons. PRISM, 8(3), 48-63; Price, D. H. (2007). Buying a piece of anthropology Part 1: Human Ecology and unwitting anthropological research for the CIA. Anthropology Today, 23(3), 8-13; Aftergood, S. (2020). Poisoner in Chief: Sidney Gottlieb and the CIA Search for Mind Control by Stephen Kinzer. Journal of Cold War Studies, 22(1), 243-245; Kinzer, S. (2019). Poisoner in Chief: Sidney Gottlieb and the CIA Search for Mind Control. Henry Holt & Company.

“Cyber-enabled capabilities that Hitler, Stalin, Goebbels, and McCarthy could have only imagined”: Lin. The existential threat.

“Since 2011, at least 27 online information operations have been partially or wholly attributed to PR or marketing firms”: Silverman, C., Lytvynenko, J., & Kung, W. (2020, January 6). Disinformation for hire: How a new breed of PR firms is selling lies online. Retrieved from; For more on “black PR” firms, see Nyst, C., & Monaco, N. (2018). State-sponsored trolling: How governments are deploying disinformation as part of broader digital harassment campaigns. Retrieved from

Actively taking advantage of the already propitious environment that social media present: Gunitsky, S. (2015). Corrupting the cyber-commons: Social media as a tool of autocratic stability. Perspectives on Politics, 13(1), 42–54. Retrieved from; See also Howard, P. N. (2020). Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives. Yale University Press; Fitzgerald, C. W., & Brantly, A. F. (2017). Subverting reality: The role of propaganda in 21st century intelligence. International Journal of Intelligence and Counterintelligence, 30(2), 215-240. Retrieved from

“Censorship through noise”: Bloomfield, S. (2019, August 10). This Is Not Propaganda by Peter Pomerantsev review – Quietly frightening. Retrieved from; Pomerantsev, P. (2019). This is not propaganda: Adventures in the war against reality. Hachette UK.

“Professional, organized lying”: Rid, T. (2020). Active measures: The secret history of disinformation and political warfare. Farrar, Straus and Giroux. Rid combines his detailed historical analysis with some interesting observations about how democracies are particularly susceptible to active measures. “Disinformation operations, in essence, erode the very foundation of open societies—not only for the victim but also for the perpetrator. When vast, secretive bureaucracies engage in systematic deception, at large scale and over a long time, they will optimize their own organizational culture for this purpose, and undermine the legitimacy of public administration at home. A society’s approach to active measures is a litmus test for its republican institutions. For liberal democracies in particular, disinformation represents a double threat: being at the receiving end of active measures will undermine democratic institutions—and giving in to the temptation to design and deploy them will have the same result. It is impossible to excel at disinformation and at democracy at the same time.” (p. 11)

“Third-generation” techniques: Deibert, R., & Rohozinski, R. (2010). Control and subversion in Russian cyberspace. In R. Deibert, J. Palfrey, R. Rohozinski, & J. Zittrain (Eds.), Access controlled: The shaping of power, rights, and rule in cyberspace (pp. 15–34). MIT Press; Blank, S. (2013). Russian information warfare as domestic counterinsurgency. American Foreign Policy Interests, 35(1), 31–44; Hulcoop, A., Scott-Railton, J., Tanchak, P., Brooks, M., & Deibert, R. (2017). Tainted leaks: Disinformation and phishing with a Russian nexus. Citizen Lab Research Report No. 92. Retrieved from

Organized criminal groups acting as proxies: Borogan, I., & Soldatov, A. (2012, April 25). The Kremlin and the hackers: Partners in crime? Retrieved from; Galeotti, M. (2016). Putin’s hydra: Inside Russia’s intelligence services. European Council on Foreign Relations; Galeotti, M. (2016). Hybrid, ambiguous, and non-linear? How new is Russia’s ‘new way of war’?. Small Wars & Insurgencies, 27(2), 282-301.

The St. Petersburg–based Internet Research Agency: For more on the IRA, see Chen, A. (2015, June 2). The agency. Retrieved from; See also Rid. Active measures. See also Malkova, I., and Baev, A. (31 January 2019). “A Private Army for the President: The Tale of Evgeny Prigozhin’s Most Delicate Mission.” The Bell. Retrieved from; and Grozev, C. (27 July 2020). “Russian Spying is Privatized and Competitive. Counterespionage Should Be Too.” Newsweek. Retrieved from

IRA accounts purporting to belong to Black activists: Way, L. A., & Casey, A. (2018). Russia has been meddling in foreign elections for decades. Has it made a difference? Retrieved from; Rid. Active measures; Bail, C. A., Guay, B., Maloney, E., Combs, A., Hillygus, D. S., Merhout, F., . . . & Volfovsky, A. (2020). Assessing the Russian Internet Research Agency’s impact on the political attitudes and behaviors of American Twitter users in late 2017. Proceedings of the National Academy of Sciences, 117(1), 243–250; Freelon, D., Bossetta, M., Wells, C., Lukito, J., Xia, Y., & Adams, K. (2020). Black trolls matter: Racial and ideological asymmetries in social media disinformation. Social Science Computer Review.

Take the Philippines, which is a good case study: Alba, D. (2019, March 19). Facebook removes hundreds of pages engaged in “inauthentic behavior” in the Philippines. Retrieved from; Ong, J. C., & Cabanes, J. (2018). Architects of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines. Newton Tech4Dev Network.

The Philippines was “patient zero in the global information epidemic”: Bengali, S., & Halper, E. (2019, November 19). Troll armies, a growth industry in the Philippines, may soon be coming to an election near you. Retrieved from

“Across the Philippines, it’s a virtual free-for-all”: Mahtani, S., & Cabato, R. (2019, July 26). Why crafty internet trolls in the Philippines may be coming to a website near you. Retrieved from

In Indonesia, low-level military personnel coordinate disinformation campaigns: Allard, T., & Stubbs, J. (2020, January 7). Indonesian army wields internet ‘news’ as a weapon in Papua. Retrieved from

Taiwan is like a petri dish of disinformation: Zhong, R. (2020, January 16). Awash in disinformation before vote, Taiwan points finger at China. Retrieved from

Entire organizations, think tanks, and other front organizations: Lin. The existential threat.

They’ve employed hackers-for-hire to target NGOs: Scott-Railton, J., Hulcoop, A., Abdul Razzak, B., Marczak, B., Anstis, S., and Deibert, R. (June 9, 2020). Dark Basin: Uncovering a Massive Hack-For-Hire Operation. Citizen Lab Research Report No. 128, University of Toronto. Retrieved from

“Manufactured doubt is everywhere”: Michaels, D. (2020, January 28). Science for sale. Retrieved from

Hard-wired cognitive biases and mental shortcuts are primed to push them along: Woolley, S., & Joseff, K. (2020). Demand for deceit: How the way we think drives disinformation. International Forum Working Paper. Retrieved from

Attempts “to quash rumors through direct refutation may facilitate their diffusion”: Berinsky, A. (2017). Rumors and health care reform: Experiments in political misinformation. British Journal of Political Science, 47(2), 241–262; Greenhill, K. M., & Oppenheim, B. (2017). Rumor has it: The adoption of unverified information in conflict zones. International Studies Quarterly, 61(3), 660–676.

Efforts to correct falsehoods can ironically contribute to their further propagation: Phillips, W. (2018). At a certain point you have to realize that you’re promoting them: The ambivalence of journalistic amplification. Data & Society Research Institute. Retrieved from; Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.

Citizens becoming fatigued trying to discern objective truth: Stevenson, A. (2018, October 9). Soldiers in Facebook’s war on fake news are feeling overrun. Retrieved from

Questioning the integrity of all media can in turn lead to fatalism: MacFarquhar, N. (2016, August 29). A powerful Russian weapon: The spread of false stories. Retrieved from

“A plurality of unreality . . . encourages the listener to doubt everything”: Zuckerman, E. (2019). QAnon and the emergence of the unreal. Journal of Design and Science, (6); Farrell, H., & Schneier, B. (2018). Common-knowledge attacks on democracy. Berkman Klein Center Research Publication 2018-7.

Social media remain polluted by misinformation and disinformation: Lewis, P. (2018). “Fiction is outperforming reality”: How YouTube’s algorithm distorts truth. Retrieved from; Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151; Faddoul, M., Chaslot, G., & Farid, H. (2020). A longitudinal analysis of YouTube’s promotion of conspiracy videos. arXiv preprint arXiv:2003.03318.

Researchers posing as Russian trolls were still able to buy political ads: Warzel, C. (2018, September 4). This group posed as Russian trolls and bought political ads on Google. It was easy. Retrieved from

Sheryl Sandberg made a startling admission: Vaidhyanathan, S. (2018, September 5). Why Facebook will never be free of fakes. Retrieved from

Twitter was deleting on the order of a million accounts a day: Spangler, T. (2018, July 9). Twitter stock slides on report that it has been deleting over 1 million fake accounts daily. Retrieved from

Social media’s inability to track inauthentic behaviour: Scott, M. (2018, October 7). Why we’re losing the battle against fake news. Retrieved from

Malicious actors are now using altered images and videos: Burgess, M. (2018, January 27). The law is nowhere near ready for the rise of AI-generated fake porn. Retrieved from; Chesney, B., & Citron, D. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107, 1753. Retrieved from; Farid, H. (2018). Reining in Online Abuses. Technology & Innovation, 19(3), 593-599.

In spite of the deletions, fact-checking, and monitoring systems they produce, social media will remain easy to exploit: Stewart, L. G., Arif, A., & Starbird, K. (2018, February). Examining trolls and polarization with a retweet network. In Proceedings of WSDM workshop on Misinformation and Misbehavior Mining on the Web (MIS2). Retrieved from

“Continued corruption of the information ecosphere . . . has heightened the nuclear and climate threats”: Bulletin of the Atomic Scientists. (2020, January 23). Closer than ever: It is 100 seconds to midnight. Retrieved from

H.G. Wells described an imaginary “World Encyclopedia”: Wells, H. G. (1938). World Brain. Methuen.

An imagined state of affairs where truth and democracy reigned supreme; this never actually existed: Farkas, J., & Schou, J. (2019). Post-truth, fake news and democracy: Mapping the politics of falsehood. Routledge.

Chapter Three: A Great Leap Forward . . . For the Abuse of Power

The UN’s special investigation into his execution: Callamard, A. (2019, June 19). Khashoggi killing: UN human rights expert says Saudi Arabia is responsible for “premeditated execution.” Retrieved from

One of the key figures . . . was Saud al-Qahtani: Hubbard, B. (2020, March 13). The rise and fall of M.B.S.’s digital henchman. Retrieved from; See also Hubbard, B. (2020). MBS: The Rise to Power of Mohammed bin Salman. New York: Tim Duggan Books. We also discovered that New York Times journalist Ben Hubbard was targeted by Saudi Arabia using NSO’s Pegasus. See Marczak, B., Anstis, S., Crete-Nishihata, M., Scott-Railton, J., and Deibert, R. (January 2020). “Stopping the Press: New York Times Journalist Targeted by Saudi-linked Pegasus Spyware Operator,” Citizen Lab Research Report No. 124, University of Toronto. Retrieved from

Smaat used standard social media tactics to grow audiences and maximize its reach: DiResta, R., Grossman, S., K. H., & Miller, C. (2019). Analysis of Twitter takedown of state-backed operation attributed to Saudi Arabian digital marketing firm Smaat. Stanford Internet Observatory.

Twitter suspended eighty-eight thousand accounts connected to Smaat: Twitter Safety. (2019, December 20). New disclosures to our archive of state-backed information operations. Retrieved from

“The more victims he eats, the more he wants”: dos Santos, N., & Kaplan, M. (2018, December 4). Jamal Khashoggi’s private WhatsApp messages may offer new clues to killing. Retrieved from

While the Saudis were using Israeli-made spyware to watch dissidents abroad, we were, in turn, watching them: Marczak, B., Scott-Railton, J., McKune, S., Abdul Razzak, B., and Deibert, R. (September 2018). “Hide and Seek: Tracking NSO Group’s Pegasus Spyware to Operations in 45 Countries,” Citizen Lab Research Report No. 113, University of Toronto. Retrieved from; Marczak, B., Scott-Railton, J., Senft, A., Abdul Razzak, B., and Deibert, R. (October 2018). “The Kingdom Came to Canada: How Saudi-Linked Digital Espionage Reached Canadian Soil,” Citizen Lab Research Report No. 115, University of Toronto. Retrieved from

“God help us,” Khashoggi replied to Omar: dos Santos & Kaplan. Khashoggi’s private WhatsApp messages.

Power hierarchies are a thing of the past: For early works that took a much less sanguine view, see Naughton, J. (2001). Contested space: The Internet and global civil society. In Anheier, H., Glasius, M., and Kaldor, M. (eds.), Global Civil Society 2001, Oxford: Oxford University Press, pp. 147-168; and Deibert, R. (2003). Black code: Censorship, surveillance, and militarization of cyberspace. Millennium: Journal of International Studies, 32(3), 501-530; Deibert, R., & Rohozinski, R. (2010). Liberation vs. control: The future of cyberspace. Journal of Democracy, 21(4), 43-57. See also Tufekci, Z. (2017). Twitter and tear gas: The power and fragility of networked protest. Yale University Press.

What journalist Dana Priest called “Top Secret America”: Priest, D. (2011). Top secret America: The rise of the new American security state. Little, Brown. See also Shorrock, T. (2008). Spies for hire: The secret world of intelligence outsourcing. Simon and Schuster.

“Our upcoming March ISS 2013 World MEA in Dubai”: Arnold, S. E. (2013, January 15). Telestrategies: An interview with Dr. Jerry Lucas. Retrieved from; See also Deibert, R. (2015). Authoritarianism goes global: Cyberspace under siege. Journal of Democracy, 26(3), 64–78; Anderson, C. (2014, July 31). Monitoring the lines: Sanctions and human rights policy considerations of TeleStrategies ISS world seminars. Retrieved from monitoring-the-lines.

Marketed a mass surveillance system, called Evident: BBC News. (2017, June 15). How BAE sold cyber-surveillance tools to Arab states. Retrieved from

More than five hundred companies now “sell a wide range of systems used to identify, track, and monitor individuals”: Privacy International. (2018, February 16). The global surveillance industry. Retrieved from

Although we didn’t know it at the time, a report Citizen Lab published in 2016: Marczak, B., and Scott-Railton, J. (May 2016). “Keep Calm and (Don’t) Enable Macros: A New Threat Actor Targets UAE Dissidents,” Citizen Lab Research Report No. 75, University of Toronto. Retrieved from; Mazzetti, M., Goldman, A., Bergman, R., and Perlroth, N. (March 21, 2019). “A New Age of Warfare: How Internet Mercenaries Do Battle for Authoritarian Governments,” New York Times. Retrieved from

NSO Group first came onto our radar in August 2016: Marczak, B., and Scott-Railton, J. (August 2016). “The Million Dollar Dissident: NSO Group’s iPhone Zero-Days used against a UAE Human Rights Defender,” Citizen Lab Research Report No. 78, University of Toronto. Retrieved from

“Zero days” — or “open doors that the vendor does not know it should lock”: Lindsay, Restrained by design; Greenberg, A. (2012, March 23). Shopping for zero-days: A price list for hackers’ secret software exploits. Forbes; Meakins, J. (2019). A zero-sum game: The zero-day market in 2018. Journal of Cyber Policy, 4(1), 60–71; Zetter. Countdown to Zero Day.

Throughout 2017 and 2018, we partnered with Mexican human rights investigators at organizations: Scott-Railton, J., Marczak, B., Anstis, S., Abdul Razzak, B., Crete-Nishihata, M., and Deibert, R. (March 20, 2019). “Reckless VII: Wife of Journalist Slain in Cartel-Linked Killing Targeted with NSO Group’s Spyware,” Citizen Lab Research Report No. 117, University of Toronto. Retrieved from

“We would read them, and we would wonder — how do they know?”: Srivastava, M., & Wilson, T. (2019, October 30). Inside the WhatsApp hack: How an Israeli technology was used to surveil. Retrieved from

An extensive set of interviews with immigrant and refugee victims of spyware: Chisholm, B., Usiskin, C., & Whittaker-Howe, S. (n.d.). A grounded theory analysis of the psychological effects of covert digital surveillance on civil society actors [Unpublished manuscript].

For despots across time, says Montesquieu, “whatever inspires fear is the fittest spring of government”: Montesquieu, C.-L. S., Nugent, T., & Alembert, J.-B. R. (1899). The spirit of laws. Colonial Press. See also Wilkinson, R. (1972). The Broken Rebel: A Study in Culture, Politics, and Authoritarian Character. Harper & Row; Eigenberger, M. E. (1998). Fear as a correlate of authoritarianism. Psychological Reports, 83(3_suppl), 1395-1409.

“In our online meetings we don’t know if we can speak freely or not”: Michaelsen, M. (2020, February). Silencing across borders: Transnational repression and digital threats against exiled activists from Egypt, Syria, and Iran. Retrieved from; See also Adamson, F. B. (2020). Non‐state authoritarianism and diaspora politics. Global Networks, 20(1), 150-169.

Says Yahya Assiri — a Saudi activist exiled to the U.K.: BBC News. How BAE sold cyber-surveillance tools to Arab states; See also Amnesty International. (2018, August 1). Amnesty International Among Targets of NSO-powered Campaign. Retrieved from; and Marczak, B., Scott-Railton, J., and Deibert, R. (July 2018). “NSO Group Infrastructure Linked to Targeting of Amnesty International and Saudi Dissident,” Citizen Lab Research Report No. 110, University of Toronto. Retrieved from

The backbone is the so-called Great Firewall: See Griffiths, J. (2019). The great firewall of China: How to build and control an alternative version of the internet. Zed Books; Roberts, M. E. (2018). Censored: Distraction and diversion inside China’s great firewall. Princeton University Press; Marczak, B., Weaver, N., Dalek, J., Ensafi, R., Fifield, D., McKune, S., . . . & Paxson, V. (2015). China’s great cannon. Citizen Lab Research Report No. 52. Retrieved from; Ensafi, R., Winter, P., Mueen, A., & Crandall, J. R. (2015). Analyzing the Great Firewall of China over space and time. Proceedings on privacy enhancing technologies, 2015(1), 61-76.

Some of the Western media coverage of the social credit system has been sensationalistic: Ahmed, S. (2019, May 1). The messy truth about social credit. Retrieved from; Ahmed, S., & Weber, S. (2018). China’s long game in techno-nationalism. First Monday, 23(5). Retrieved from; Ahmed, S. “Cashless Society, Cached Data: Security Considerations for a Chinese Social Credit System.” (2017). Citizen Lab Research Report No. 87, University of Toronto. Retrieved from; and Ahmed, S., & Fong, A. (2017). “Cashless Society, Cached Data: Are Mobile Payment Systems Protecting Chinese Citizens’ Data?” Citizen Lab Research Report No. 86, University of Toronto. Retrieved from

TikTok, the massively popular video streaming app: Ahmed. The messy truth.

In China, facial recognition systems have been deployed almost completely in the absence of any privacy protections: Qin, A. (2020, January 21). Chinese city uses facial recognition to shame pajama wearers. Retrieved from

SenseTime’s database had inadvertently exposed the . . . data of more than five hundred million people: Tao, L. (2019, April 12). SenseNets: The facial recognition company that supplies China’s Skynet surveillance system. Retrieved from

Authorities also require locals to install QR barcodes on the doors of their homes: Wang, M. (2018). “Eradicating ideological viruses”: China’s campaign of repression against Xinjiang’s Muslims. Retrieved from; See also Allen-Ebrahimian, B. (2019). Exposed: China’s Operating Manuals for Mass Internment and Arrest by Algorithm. International Consortium of Investigative Journalists.

Xinjiang authorities have started systematically collecting biometric data: Wang. “Eradicating ideological viruses.”; Leibold, J. (2020). Surveillance in China’s Xinjiang region: Ethnic sorting, coercion, and inducement. Journal of Contemporary China, 29(121), 46–60. Retrieved from

Reports of arrests without due process are legion: Allen-Ebrahimian, B. (2019, November 24). Exposed: China’s operating manuals for mass internment and arrest by algorithm. Retrieved from

“China is a major driver of AI surveillance worldwide”: Feldstein, S. (2019, September). The global expansion of AI surveillance. Carnegie Endowment for International Peace. Retrieved from

An archetypal example is Brazil: Ionova, A. (2020, February 11). Brazil takes a page from China, taps facial recognition to solve crime. Retrieved from

Argentina and Ecuador have purchased Chinese surveillance technology systems: Gershgorn, D. (2020, March 4). The U.S. fears live facial recognition. In Buenos Aires, it’s a fact of life. Retrieved from; Mozur, P., Kessel, J. M., & Chan, M. (2019, April 24). Made in China, exported to the world: The surveillance state. Retrieved from

An obscure facial recognition AI start-up called Clearview AI: Hill, K. (2020, January 18). The secretive company that might end privacy as we know it. Retrieved from

People associated with 2,228 law enforcement agencies, companies, and institutions in twenty-seven countries had created accounts: Mac, R., Haskins, C., & McDonald, L. (2020, February 27). Clearview’s facial recognition app has been used by the Justice Department, ICE, Macy’s, Walmart, and the NBA. Retrieved from

Investors . . . even abused the app on dates and at parties: Hill, K. (2020, March 5). Before Clearview became a police tool, it was a secret plaything of the rich. Retrieved from

Ton-That had close ties to several prominent alt-right extremists: O’Brien, L. (2020, April 7). The far-right helped create the world’s most powerful facial recognition technology. Retrieved from

Nine law enforcement agencies . . . were in fact customers or employed individuals who were using the system: Allen, K., Gillis, W., & Boutilier, A. (2020, February 27). Facial recognition app Clearview AI has been used far more widely in Canada than previously known. Retrieved from

“The weaponization possibilities . . . are endless”: Hill. The secretive company.

A “superpower that we haven’t seen before in policing”: Ferguson, A. G. (2017). The rise of big data policing: Surveillance, race, and the future of law enforcement. New York University Press; See also Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin’s Press; Kerr, I., & Earle, J. (2013). Prediction, preemption, presumption: How big data threatens big picture privacy. Stan. L. Rev. Online, 66, 65; Bennett Moses, L., & Chan, J. (2018). Algorithmic prediction in policing: assumptions, evaluation, and accountability. Policing and Society, 28(7), 806-822; Robinson, D., & Koepke, L. (2016). Stuck in a Pattern. Early evidence on “predictive policing” and civil rights. Retrieved from; Ferguson, A. G. (2016). Policing predictive policing. Wash. UL Rev., 94, 1109; See Atlas of Surveillance: Documenting Police Tech in Our Communities. A resource of the Electronic Frontier Foundation. Retrieved from

How fast, easy, and cheap it would be to identify specific individuals: Chinoy, S. (2019, April 16). We built an ‘unbelievable’ (but legal) facial recognition machine. Retrieved from

Equivalent to about eight hundred digital video cameras: Bloomberg. (2016, September 1). The surveillance firm recording crimes from Baltimore’s skies [Video]. Retrieved from

“A department could potentially purchase a fleet of 500 drones in lieu of a single police chopper”: Laperruque, J., & Janovsky, D. (2018, September 25). These police drones are watching you. Retrieved from

One vendor of drones has bragged about a 518 percent growth in use by U.S. agencies: Dronefly. (n.d.). Police drone infographic. Retrieved June 16, 2020, from

There are automatic licence plate readers: Howe, R. J. (October 2009). “Privacy impact assessment: Automatic license plate recognition (ALPR).” Royal Canadian Mounted Police (obtained through access to information request by Rob Wipond). Retrieved from; Parsons, C. (2017, June 13). Who’s watching where you’re driving? Retrieved from

Palantir technology “allows ICE agents to access a vast ‘ecosystem’ of data”: Biddle, S., & Devereaux, R. (2019, May 2). Peter Thiel’s Palantir was used to bust relatives of migrant children, new documents show. Retrieved from; MacMillan, D., & Dwoskin, E. (2019). The war inside Palantir: Data-mining firm’s ties to ICE under attack by employees. Washington Post. Retrieved from; Palantir also made a push for the use of its COVID surveillance services, including in Canada. See Hemmadi, M. (2020, April 30). “Palantir’s MacNaughton says data-mining firm is working with Ottawa, three provinces on COVID-19,” The Logic. Retrieved from

The trends towards data fusion have helped blur military and civilian applications: Schneier, B. (2016). Data and Goliath: The hidden battles to collect your data and control your world. W. W. Norton.

National security regulations . . . shield many of these agencies: Richardson, R., Schultz, J., & Crawford, K. (2019). Dirty data, bad predictions: How civil rights violations impact police data, predictive policing systems, and justice. Social Science Research Network Scholarly Paper No. ID 3333423.

The U.S. government began checking social media feeds for immigration vetting in 2014: Brennan Center for Justice. (2019, June 25). Timeline of social media monitoring for vetting by the Department of Homeland Security and the State Department. Retrieved from

Huge defence contractors . . . began lobbying for legislation that would bolster border security: Lipton, E. (2013, June 6). U.S. military firms eye border security contracts. Retrieved from

“Immigration and Customs Enforcement . . . ordered $2 million worth of . . . phone and laptop hacking technology”: Brewster, T. (2017, April 13). US Immigration splurged $2.2 million on phone hacking tech just after Trump’s travel ban. Retrieved from

The company’s advertising describes . . . cloud-based private data from “over 50 of the most popular social media”: mySociety. (2020, January 29). This is what the Cellebrite software promises police. If that concerns you, why not join @privacyint’s campaign to discover whether your local force are using this surveillance software? [Tweet]. Retrieved from

The acquisition of masses of big data has pushed the agencies to find contractors who specialize in social media analytics: Brewster, T. (2017, September 27). Trump’s immigration cops just spent $3 million on these ex-DARPA social media data miners. Retrieved from

Amazon had pitched its facial recognition technology to ICE: Laperruque, J., & Peterson, A. (2018, October 23). Amazon pushes ICE to buy its face recognition surveillance tech. Retrieved from; Lutz, E. (2019, August 14). Amazon’s creepy surveillance tech can now detect fear. Retrieved from

Ring and Neighbors . . . have reflexively undertaken racial profiling: Haskins, C. (2019, February 7). Amazon’s home security company is turning everyone into cops. Retrieved from

A Fresno police department used a social media monitoring firm: Cagle, M. (2015, December 15). This surveillance software is probably spying on #BlackLivesMatter. Retrieved from; Economist. (2019, February 21). America’s cops take an interest in social media. Retrieved from

Banjo, a small AI startup: Koebler, J., Maiberg, E., & Cox, J. (2020, March 4). This small company is turning Utah into a surveillance panopticon. Retrieved from

At least seventy-five companies receive “anonymous, precise location data from apps”: Valentino-DeVries, J., Singer, V., Keller, M. H., & Krolik, A. (2018, December 10). Your apps know where you were last night, and they’re not keeping it secret. Retrieved from

Cox was able to locate a phone by paying $300 to a bounty hunter: Cox, J. (2019, January 8). I gave a bounty hunter $300. Then he located our phone. Retrieved from

The portal for one location tracking service, called LocationSmart, was improperly secured: Goodin, D. (2018, May 17). Website leaked real-time location of most US cell phones to almost anyone. Retrieved from

A spokesperson for Securus said the company “is neither a judge nor a district attorney”: Valentino-DeVries, J. (2018, May 10). Service meant to monitor inmates’ calls could track you, too. Retrieved from

Cell phone tracking laws vary widely in the U.S.: American Civil Liberties Union. (n.d.). Cell phone location tracking laws by state. Retrieved June 16, 2020, from

Canada’s RCMP use a social media monitoring system: Carney, B. (2019, March 25). “Project Wide Awake”: How the RCMP watches you on social media. Retrieved from; Craig, S. (2016, November 13). RCMP tracked 89 indigenous activists considered ‘threats’ for participating in protests. National Post; On legal voids around the use of drones, see Talai, A. B. (2014). Drones and Jones: The Fourth Amendment and police discretion in the digital age. Calif. L. Rev., 102, 729. On “legal voids” in policing generally, see den Boer, M. (2018). Introduction to comparative policing from a legal perspective. In Comparative Policing from a Legal Perspective. Edward Elgar Publishing; See also Robertson, R., Khoo, C., and Song, Y. “To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada” (September 2020), Citizen Lab and International Human Rights Program, University of Toronto. Retrieved from

A Toronto Star investigation, which analyzed RCMP logs: Allen, K. (2019, April 8). What you should know about the ‘Stingray’ surveillance device used by police. Retrieved from; Israel, T., & Parsons, C. (2016). Gone opaque? An analysis of hypothetical IMSI catcher overuse in Canada. Telecom Transparency Project, Citizen Lab. Retrieved from

Seventy-five federal, state, and municipal agencies in the U.S. that used cell site simulators: American Civil Liberties Union. (2018, November). Stingray tracking devices: Who’s got them? Retrieved from

Local police in Maryland had been using cell site simulators for over a decade before publicly disclosing them: Mabeus, C. (2016, May 3). Battlefield technology gets spotlight in Maryland courts: Secrecy and defense concerns surround cell phone trackers. Retrieved from

Some of the measures many countries adopted or proposed were deeply unsettling: Capatides, C. (2020, April 2). “Shoot them dead”: Philippine president Rodrigo Duterte orders police and military to kill citizens who defy coronavirus lockdown. Retrieved from; Gebrekidan, S. (2020, March 30). For autocrats, and others, coronavirus is a chance to grab even more power. Retrieved from; Gershgorn, D. (2020, April 9). We mapped how the coronavirus is driving new surveillance programs around the world. Retrieved from; See also Kamradt-Scott, A., & McInnes, C. (2012). The securitisation of pandemic influenza: framing, security and public policy. Global Public Health, 7(sup2), S95-S110.

Drones were being offered up and used as part of COVID mitigation efforts: Gaulkin, T. (2020, April 1). Drone pandemic: Will coronavirus invite the world to meet Big Brother? Retrieved from

How easy it is to unmask real identities contained in large personal data sets: Narayanan, A., & Shmatikov, V. (2008). Robust de-anonymization of large sparse datasets. IEEE Symposium on Security and Privacy, 111–125.

“At least eight surveillance and cyber-intelligence companies attempting to sell repurposed spy and law enforcement tools”: Schectman, J., Bing, C., & Stubbs, J. (2020, April 28). Cyber-intel firms pitch governments on spy tools to trace coronavirus. Retrieved from

“The world’s business has slid into a world of personal devices”: Scott-Railton, J. (2020, March 23). Another critical COVID-19 shortage: Digital security. Retrieved from

Zoom had been plagued by security issues for a number of years: Wells, D. (2018, December 3). Remotely hijacking Zoom clients. Retrieved from; Cox, J. (2020, March 26). Zoom iOS app sends data to Facebook even if you don’t have a Facebook account. Retrieved from

Highly disturbing instances of “Zoom-bombing”: Setera, K. (2020, March 30). FBI warns of teleconferencing and online classroom hijacking during COVID-19 pandemic. Retrieved from

Our Citizen Lab team reverse-engineered Zoom: Marczak, B., & Scott-Railton, J. (2020, April 3). Move fast and roll your own crypto: A quick look at the confidentiality of Zoom meetings. Retrieved from

What I have elsewhere referred to as “event-based” information controls: Deibert, R., & Rohozinski, R. (2008). Good for liberty, bad for security? Internet securitization and global civil society. In R. Deibert, J. Palfrey, R. Rohozinski, & J. Zittrain (Eds.). Access denied: The practice and policy of internet filtering (123–165). MIT Press; Bennett, C., & Haggerty, K. (Eds.). (2011). Security games: Surveillance and control at mega-events. Routledge; Whelan, C., & Molnar, A. (2018). Securing Mega-Events: Strategies, Organisation and Legacies. Crime Prevention and Security Management Series, New York: Palgrave-Macmillan; and Molnar, A. (2015). The geo‐historical legacies of urban security governance and the Vancouver 2010 Olympics. The Geographical Journal, 181(3), 235-241.

“Crises are a time-tested means of subverting democracy”: Levitsky, S., & Ziblatt, D. (2019, January 12). Why autocrats love emergencies. Retrieved from; Shammas, M. (2019, December 12). What’s behind rising authoritarianism: Answers from political psychology & the Third Reich. Retrieved from

“If you’ve got nothing to hide, you’ve got nothing to fear”: For a detailed examination of the flaws concerning the arguments around “nothing to hide,” see Solove, D. J. (2011). Nothing to hide: The false tradeoff between privacy and security. Yale University Press.

Abuse-of-power episodes within ostensibly liberal democratic societies: Weiner, T. (2012). Enemies: A history of the FBI. Random House; Ross, C. A. (2007). Ethics of CIA and military contracting by psychiatrists and psychologists. Ethical Human Psychology and Psychiatry, 9(1), 25–34; Hewitt, S. (2018). Cold war counter-terrorism: The evolution of international counter-terrorism in the RCMP Security Service, 1972–1984. Intelligence and National Security, 33(1), 67–83; In Canadian criminal law, there are numerous examples of recognized violations of constitutionally protected rights of individuals by law enforcement agencies (see, e.g., R v. Grant, 2009 SCC 32; R v. Le, 2019 SCC 34; R v. Evans, 1996 1 SCR 8; R v. Nasogaluak, 2010 SCC 6; R v. Cole, 2012 SCC 53); Savage, C., & Risen, J. (2010, March 31). Federal judge finds NSA wiretaps were illegal. New York Times; Greenwald, G. (2014). No place to hide: Edward Snowden, the NSA, and the US surveillance state. Picador.

A large and influential class of kleptocrats: Cooley, A., Heathershaw, J., & Sharman, J. (2018). The rise of kleptocracy: Laundering cash, whitewashing reputations. Journal of Democracy, 29(1), 39–53; See also Cooley, A., & Heathershaw, J. (2017). Dictators Without Borders: Power and Money in Central Asia. New Haven: Yale University Press.

“The 13th consecutive year of decline in global freedom”: Freedom House. (2019). Freedom in the world 2019: Democracy in retreat. Retrieved from

“The political uses of the internet in autocracies and democracies are becoming harder to distinguish”: Gunitsky, S. (2020, February 19). The great online convergence: Digital authoritarianism comes to democracies. Retrieved from; See also Glasius, M. (2018). What authoritarianism is… and is not: a practice perspective. International Affairs, 94(3), 515-533, for a discussion of how illiberal, authoritarian practices transcend territorial boundaries; and Glasius, M. (2018). Extraterritorial authoritarian practices: a framework. Globalizations, 15(2), 179-197.

Chapter Four: Burning Data

Jio, launched in 2016 by India’s richest man: Pham, S. (2020, May 7). India’s Jio Platforms lands $1.5 billion from Vista Equity, marking 3 big investments in 3 weeks. Retrieved from

Under the reign of prime minister Narendra Modi . . . the country has rapidly descended into authoritarianism: Filkins, D. (December 2, 2019). Blood and soil in Narendra Modi’s India. Retrieved from

Our very first report on cyber-espionage: Citizen Lab. (2009, March 28). Tracking GhostNet: Investigating a cyber espionage network. Retrieved from; Deibert, R. (2013). Black code: Surveillance, privacy, and the dark side of the Internet. McClelland & Stewart Limited; See also Marczak, B., Hulcoop, A., Maynier, E., Abdul Razzak, B., Crete-Nishihata, M., Scott-Railton, J., and Deibert, R. (2019). “Missing Link: Tibetan Groups Targeted with 1-Click Mobile Exploits,” Citizen Lab Research Report No. 123, University of Toronto. Retrieved from

Delhi has “turned into a gas chamber”: Arvind Kejriwal. (2019, November 1). Delhi has turned into a gas chamber due to smoke from crop burning in neighbouring states. It is very imp that we protect ourselves from this toxic air. Through pvt & govt schools, we have started distributing 50 lakh masks today. I urge all Delhiites to use them whenever needed. [Tweet]. Retrieved from

Coal-fired power plants accounted for 44 percent of new energy production in India in 2019: Saurabh. (2020, January 20). Coal makes a comeback in India, new capacity up 73% in 2019. Retrieved from

Only one of them is complying with a law requiring the installation of equipment to cut emissions of sulphur oxides: Varadhan, S. (2020, February 5). India’s pollution regulator threatens to shut 14 coal-fired power plants. Retrieved from

Warns Apple, “Unauthorized modification of iOS can cause security vulnerabilities”: Apple. (n.d.). Unauthorized modification of iOS can cause security vulnerabilities, instability, shortened battery life, and other issues. Retrieved December 2019 from

Extraordinary steps to discourage users from getting too curious about what goes on “beneath the hood”: Gordon, W. (2019, April 17). The most common ways manufacturers prevent you from repairing your devices. Retrieved from

The production of each and every device involves hundreds of kilograms of fossil fuels: UN News. (2004, March 8). Computer manufacturing soaks up fossil fuels, UN University study says. Retrieved from; See also Ensmenger, N. (2013). Computation, materiality, and the global environment. IEEE Annals of the History of Computing, 35(3), 80-80.

Data centres . . . consume hundreds of thousands of gallons of fresh water a day: Ensmenger, N. (2018). The environmental history of computing. Technology and Culture, 59(4), S7–S33.

Non-renewable resources, manufacturing, shipping, energy, labour, and non-recyclable waste: Gies, E. (2017, November 29). The real cost of energy. Retrieved from

Around seventy of the eighty-three stable and non-radioactive elements in the entire periodic table: Nield, D. (2015, August 4). Our smartphone addiction is costing the Earth. Retrieved from

China holds the world’s largest reserves of rare earth elements: Hearty, G. (2019, August 20). Rare earths: Next element in the trade war? Retrieved from

China did just that, shutting off exports of the elements to Japan for two months: Funabashi, Y. (2019, August 9). The Mideast has oil, China has rare earths. Retrieved from

The mining and refining activities consume vast amounts of water while generating a large quantity of CO2 emissions: Crawford & Joler. Anatomy of an AI system.

Rare earth elements are mined either by stripping away layers of topsoil or by drilling holes into the ground: Standaert, M. (2019, July 2). China wrestles with the toxic aftermath of rare earth mining. Retrieved from

“Only 0.2 percent of the mined clay contains the valuable rare earth elements”: Abraham, D. S. (2015). The elements of power: Gadgets, guns, and the struggle for a sustainable future in the rare metal age. Yale University Press.

The element cerium (which is used to polish the glass on our device screens): Maughan, T. (2015, April 2). The dystopian lake filled by the world’s tech lust. Retrieved from

For every one tonne of rare earth elements mined and processed: Crawford & Joler. Anatomy of an AI system.

High levels of contaminants in the region’s ground and surface water: Standaert. China wrestles with the toxic aftermath.

Grotesque deformities in local livestock: Kaiman, J. (2014, March 20). Rare earth mining in China: The bleak social and environmental costs. Retrieved from

“Damaged crops, homes and belongings covered in soot, polluted drinking water”: Whoriskey, P. (2016, October 2). China pollution caused by graphite mining for smartphone battery. Retrieved from

Bayan Obo is the closest thing to “hell on earth”: Maughan. The dystopian lake; Liu, H. (2016, June). Rare earths: Shades of grey: Can China continue to fuel our global clean & smart future? Retrieved from

The mine “has put a death-curse on nearby villages,” and the giant waste pond . . . is “a time bomb”: Liu. Rare earths.

Satellite images show dozens of them spread throughout the region’s hills and mountains: Standaert. China wrestles with the toxic aftermath.

There is also a black market for rare earth element mining: Liu. Rare earths.

About forty thousand tonnes of rare earth metals were smuggled out of China each year: Stanway, D. (2015, July 7). Fate of global rare earth miners rests on China smuggling crackdown. Retrieved from

Lynas Corporation exports its rare earth metal processing: Liu. Rare earths.

The company built the largest refining facility in the world in Malaysia: Bradsher, K. (2011, June 30). Engineers fear rare earth refinery in Malaysia is dangerous. Retrieved from

Around 580,000 tonnes of low-level radioactive waste: Lipson, D., & Hemingway, P. (2019, August 21). Australian mining company Lynas gets permission to dispose of radioactive waste in Malaysia, dividing locals. Retrieved from

Take lithium, known as “grey gold”: Crawford & Joler. Anatomy of an AI system.

Lithium production is booming: Shankleman, J., Biesheuvel, T., Ryan, J., & Merrill, D. (2017, September 7). We’re going to need more lithium. Retrieved from

A single Tesla car requires about seven kilograms of lithium for each of its battery packs: Crawford & Joler. Anatomy of an AI system.

“What links the battery in your smartphone with a dead yak floating down a Tibetan river”: Katwala, A. (2018, August 5). The spiralling environmental cost of our lithium battery addiction. Retrieved from

Lithium is found in the brine of salt flats: Zacune, J. (n.d.). Lithium. Retrieved June 16, 2020, from

The lithium carbonate is then extracted through a chemical process that . . . can harm nearby communities: Karlis, N. (2019, June 17). Electric cars are still better for the environment. But lithium mining has some problems. Retrieved from

In Chile’s Atacama and Argentina’s Salar de Hombre Muerto regions: Zacune. Lithium.

More than half of the world’s cobalt supply is sourced from the Democratic Republic of Congo: U.S. Department of the Interior. (n.d.). Cobalt statistics and information. Retrieved June 16, 2020, from; Eichstaedt, P. (2011). Consuming the Congo: War and conflict minerals in the world’s deadliest place. Chicago Review Press.

Cobalt mining operations in the DRC routinely use child labour: Amnesty International. (2016). “This is what we die for”: Human rights abuses in the Democratic Republic of the Congo power the global trade in cobalt. Retrieved from

Health officials have linked breathing problems and birth defects: Frankel, T. C. (2016, September 30). Cobalt mining for lithium ion batteries has a high human cost. Retrieved from

“Urinary concentrations of cobalt that were 43 times as high as that of a control group”: Frankel. Cobalt mining.

Around four thousand children worked at mining sites in the southern DRC city of Kolwezi alone: Frankel. Cobalt mining.

“A child working in a mine in the Congo would need more than 700,000 years of non-stop work”: Crawford & Joler. Anatomy of an AI system.

Indonesia’s tin mining operations are “an orgy of unregulated mining”: Ensmenger. The environmental history of computing.

Most of the tin is sourced by . . . PT Timah: Crawford & Joler. Anatomy of an AI system.

“If you own a mobile, it’s probably held together by tin from the Indonesian island of Bangka”: Hodal, K. (2012, November 23). Death metal: Tin mining in Indonesia. Retrieved from

“A complex structure of supply chains within supply chains”: Crawford & Joler. Anatomy of an AI system.

The company undertook site visits to eighty-five smelters and refiners in twenty-one countries: Intel. (2014, May). Intel’s efforts to achieve a “conflict free” supply chain. Retrieved from

One container ship can produce the same amount of pollution as about fifty million cars: Piesing, M. (2018, January 4). Cargo ships are the world’s worst polluters, so how can they be made to go green? Retrieved from

Foxconn and another manufacturer, Unimicron, were dumping heavy metals: Myslewski, R. (2013, August 5). Chinese Apple suppliers face toxic heavy metal water pollution charges. Retrieved from

Working conditions at the Catcher Technology Company’s factory in China: Bloomberg News. (2018, January 16). Apple supplier workers describe noxious hazards at China factory. Retrieved from

Factory employees reported they were “exposed to toxic chemicals every day”: China Labor Watch. (2019, September 8). iPhone 11 illegally produced in China: Apple allows supplier factory Foxconn to violate labor laws. Retrieved from

Foxconn’s Longhua factory is notorious for its suicide nets: Merchant, B. (2017, June 18). Life and death in Apple’s forbidden city. Retrieved from; See also Merchant, B. (2017). The one device: The secret history of the iPhone. Little, Brown.

In response to reports of worker suicides, Steve Jobs promised to take action: Fullerton, J. (2018, January 7). Suicide at Chinese iPhone factory reignites concern over working conditions. Retrieved from

Eight years later . . . working conditions had not improved: China Labor Watch. iPhone 11 illegally produced in China.

Foxconn and Apple disputed the majority of these allegations: Toh, M. (2019, September 9). Apple says a supplier’s factory in China violated labor rules. Retrieved from

A ten-by-forty-mile strip of land around Santa Clara County, California: Ensmenger. The environmental history of computing; United States Environmental Protection Agency. (n.d.). What is Superfund? Retrieved from

Fluorinated greenhouse gases . . . have “extremely high global warming potentials”: United States Environmental Protection Agency. (2018, April). Center for Corporate Climate Leadership sector spotlight: Electronics. Retrieved from

Together with mining, manufacturing processes account for about 95 percent of waste: Lepawsky, J. (2018, January 19). ‘Wasted’: Why recycling isn’t enough when it comes to e-waste. Retrieved from

“No amount of post-consumer recycling can recoup the waste”: Lepawsky, J. (2018, May 17). Almost everything we know about e-waste is wrong. Retrieved from

The internet appeared to be holding up, in spite of usage surging: Beech, M. (2020, March 25). COVID-19 pushes up internet use 70% and streaming more than 12%, first figures reveal. Retrieved from

Air traffic, automobile, and other forms of fossil-fuelled transportation plummeted: Henriques, M. (2020, March 27). Will Covid-19 have a lasting impact on the environment? Retrieved from

Americans waste up to $19 billion annually in electricity costs: University of Utah. (2016, October 25). A complete waste of energy: Engineers develop process for electronic devices that stops wasteful power leakage. Retrieved from

The world’s communication ecosystem currently consumes approximately 7 percent of global electricity: Jones, N. (2018, September 12). How to stop data centres from gobbling up the world’s electricity. Retrieved from

A smartphone streaming an hour of video on a weekly basis uses more power annually than a new refrigerator: Burrington, I. (2016, December 16). The environmental toll of a Netflix binge. Retrieved from

Sending sixty-five emails is roughly equivalent to driving one kilometre in a car: Villazon, L. (2020, January 3). The thought experiment: What is the carbon footprint of an email? Retrieved from

A major study by a team of researchers at Canada’s McMaster University: Belkhir, L., & Elmeligi, A. (2018). Assessing ICT global emissions footprint: Trends to 2040 & recommendations. Journal of Cleaner Production, 177, 448–463.

“From Bitcoin ‘mines’ to server ‘farms’ to data ‘warehouses’”: Ensmenger. The environmental history of computing.

Central Asian countries . . . advertise for Bitcoin mining operations to be hosted in their jurisdictions: Redman, J. (2020, February 12). 3 cents per kWh — Central Asia’s cheap electricity entices Chinese bitcoin miners. Retrieved from

Estimates put electric energy consumption associated with Bitcoin mining at around 83.67 terawatt-hours per year: Digiconomist. (n.d.). Bitcoin energy consumption index. Retrieved May 27, 2020, from; De Vries, A. (2018). Bitcoin’s growing energy problem. Joule, 2(5), 801–805; Truby, J. (2018). Decarbonizing Bitcoin: Law and policy choices for reducing the energy consumption of Blockchain technologies and digital currencies. Energy Research & Social Science, 44, 399–410.

The electricity consumed by the Bitcoin network in one year could power all the teakettles used to boil water in the entire United Kingdom for nineteen years: Cambridge Centre for Alternative Finance. (n.d.). Cambridge Bitcoin electricity consumption index. Retrieved from

A life-cycle assessment for training several common large AI models: Hao, K. (2019, June 6). Training a single AI model can emit as much carbon as five cars in their lifetimes. Retrieved from; Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Retrieved from

Data centres are “hidden monuments” to our excessive data consumption: Hogan, M. (2015). Facebook data storage centers as the archive’s underbelly. Television & New Media, 16(1), 3–18; See also Hogan, M. (2015). Data flows and water woes: The Utah Data Center. Big Data & Society, 2(2); Hogan, M., & Shepherd, T. (2015). Information ownership and materiality in an age of big data surveillance. Journal of Information Policy, 5, 6–31. For a political economy analysis of cloud computing, as well as a history of the use of the metaphor and what that implies culturally, see the seminal Mosco, V. (2015). To the cloud: Big data in a turbulent world. Routledge.

What one author appropriately called “energy hogs”: Pearce, F. (2018, April 3). Energy hogs: Can world’s huge data centers be made more efficient? Retrieved from

Energy consumption by data centres will treble in the next decade: Bawden, T. (2016, January 23). Global warming: Data centres to consume three times as much energy in next decade, experts warn. Retrieved from

As much as 70 percent of the entire world’s internet traffic passes through data centres housed in a single county in Virginia: Upstack. (n.d.). Why is Ashburn known as Data Center Alley? Retrieved May 29, 2020, from

The electricity demand of data centre alley is estimated to be about 4.5 gigawatts: Craighill, C. (2019, February 13). Greenpeace finds Amazon breaking commitment to power cloud with 100% renewable energy. Retrieved from

As much as 360,000 gallons of clean, chilled water a day: Ensmenger. The environmental history of computing.

Roughly as much water as about one hundred acres of almond trees: FitzGerald, D. (2015, June 24). Data centers and hidden water use. Retrieved from

U.S. server farms will have used 174 billion gallons of water by 2020: Shehabi, A., Smith, S., Sartor, D., Brown, R., Herrlin, M., Koomey, J., . . . & Lintner, W. (2016). United States data center energy usage report. Lawrence Berkeley National Lab. Retrieved from

Mesa, Arizona, made a deal with Google to permit construction of a massive server farm: Mesa Council, Board, and Committee Research Center. (2019, July 1). Resolution 19-0809. Retrieved from

Google . . . considers its water use a proprietary trade secret: Sattiraju, N. (2020, April 1). Google data centers’ secret cost: Billions of gallons of water. Retrieved from

An extensive ranking of social media platforms: Cook, G. (2017). Clicking clean: Who is winning the race to build a green internet? Retrieved from

Among the worst is Amazon: Hern, A. (2019, April 9). Amazon accused of abandoning 100% renewable energy goal. Retrieved from

AWS alone brings in more revenue than McDonald’s: Merchant, B. (2019, April 8). Amazon is aggressively pursuing big oil as it stalls out on clean energy. Retrieved from

Amazon also runs one of the largest warehouse, transportation, distribution, and logistical operations in the world: Ensmenger. The environmental history of computing; McCarthy, L. (2020, January 30). Amazon: accelerating decline in shipping costs are driving future valuation. Retrieved from

Amazon owns 850 facilities in twenty-two countries: Bearth, D. (2019, April 8). Is Amazon a logistics company? All signs point to that. Retrieved from

It is also almost completely non-transparent about its energy footprint: Crawford & Joler. Anatomy of an AI system.

“The world’s largest cloud computing company is . . . still powering its corner of the internet with dirty energy”: Craighill, C. (2019, February 13). Greenpeace finds Amazon breaking commitment to power cloud with 100% renewable energy. Retrieved from

It’s a good bet that both Alibaba and Tencent: Pearce. Energy hogs.

Many cloud computing companies are actually seeking out revenues from fossil fuel industries: Merchant, B. (2019, February 21). How Google, Microsoft, and big tech are automating the climate crisis. Retrieved from

Chevron, ExxonMobil, Total, and Equinor have signed billion-dollar contracts with Google, Microsoft, and others: Matthews, C. M. (2018, July 24). Silicon Valley to big oil: We can manage your data better than you. Retrieved from

Amazon . . . has reportedly “aggressively courted” the fossil fuel sector: Merchant. Amazon is aggressively pursuing big oil.

India is currently the world’s second-largest smartphone market: Bhattacharya, A. (2017, December 21). There’s an e-waste crisis lurking behind India’s cheap-phone boom. Retrieved from

Indian authorities have introduced laws and measures to try to standardize the industry: Lahiry, S. (2019, April 17). Recycling of e-waste in India and its potential. Retrieved from

Most of India’s recycling and processing . . . is still managed by the informal sector: Kumar, R., & Shah, D. J. (2014). Review: Current status of recycling of waste printed circuit boards in India. *Journal of Environmental Protection, 5*(1), 9–16.

An estimated one million people . . . depend for their livelihood on these manual recycling operations: Bhattacharya. There’s an e-waste crisis lurking.

A sizable proportion of e-waste trade is actually highly regionalized: Lepawsky, J. (2015). The changing geography of global trade in electronic discards. *Geographical Journal, 181*(2), 147–159; Lepawsky, J., & McNabb, C. (2010). Mapping international flows of electronic waste. *Canadian Geographer / Le Géographe canadien, 54*(2), 177–195.

“There is definitely a topography to the e-waste trade”: Lepawsky, J. (2016, March 10). Trading on distortion. Retrieved from

While India’s reuse economy is truly remarkable, that doesn’t mean that there are no waste or other issues: Corwin, J. E. (2018). “Nothing is useless in nature”: Delhi’s repair economies and value-creation in an electronics “waste” sector. Environment and Planning A: Economy and Space, 50(1), 14–30; See also Toxics Link. (2019). Informal e-waste recycling in Delhi. Retrieved from

Raw sewage combined with acid wash . . . flows directly into the Yamuna River: Bhaduri, A. (2017, November 30). Why does the world’s e-waste reach India? Retrieved from

Apple churns out a bewildering variety of new components and accessories: Leber, R. (2020, March 3). Your plastic addiction is bankrolling big oil. Retrieved from

Take AirPods: Haskins, C. (2019, May 6). AirPods are a tragedy. Retrieved from

Even highly efficient recycling is an energy-intensive industrial process: Pickering, D. (Producer). (2018, February 7). Restart [Audio podcast]. Episode 9. Tracing global flows of electronic ‘discards’ with Josh Lepawsky. Retrieved from

Adding extra life to computers saves between five and twenty times more energy than recycling them outright: Ives, M. (2014, February 6). In developing world, a push to bring e-waste out of shadows. Retrieved from

Rising sea levels risk flooding critical infrastructure located in coastal areas: Borunda, A. (2018, July 16). The internet is drowning. Retrieved from

The risk of high temperatures overheating data centres: Bogle, A. (2015, January 14). Will climate change burn up the internet? Retrieved from

“For the most part, we experience only the positive benefits of information technology”: Ensmenger. The environmental history of computing. For an extended treatment of our views from the “inside” versus the energy “needs” of the technosphere as a whole, see Haff, P. K. (2014). Technology as a geological phenomenon: Implications for human well-being. Geological Society, London, Special Publications, 395(1), 301-309.

Chapter Five: Retreat, Reform, Restraint

“We were both playing Columbo to each other”: Contenta, S. (2019, December 13). How these Toronto sleuths are exposing the world’s digital spies while risking their own lives. Retrieved from

The ensuing AP story was a bombshell: Satter, R. (2019, February 11). Undercover spy exposed in NYC was 1 of many. Retrieved from

“Our very tools and techniques threaten to wipe us out”: Vaidhyanathan. Antisocial media.

Companies trumpet their powerful machine learning and artificial intelligence systems: Klein, N. (2020, May 13). How big tech plans to profit from the pandemic. Retrieved from

Like all of the existential crises that surround us, it may be part of human nature: Sterman, J. D. (2008). Risk communication on climate: mental models and mass balance. Science, 322(5901), 532-533.

The luxurious Grand Velas Riviera Nayarit: Grand Velas Riviera Maya. (2019, January 4). Velas Resorts introduces the ‘detox concierge’ as part of 2019 digital detox program. Retrieved from

Guidelines for a thirty-day “digital declutter” process: Newport, C. (2019). Digital minimalism: Choosing a focused life in a noisy world. Portfolio; See also Allcott, H., Braghieri, L., Eichmeyer, S., & Gentzkow, M. (2020). The welfare effects of social media. *American Economic Review, 110*(3), 629–676; Hill, K. (2019). I cut the ‘Big Five’ tech giants from my life. It was hell. Retrieved from

These don’t go far enough for virtual reality pioneer Jaron Lanier: Lanier, J. (2018). *Ten arguments for deleting your social media accounts right now*. Henry Holt.

Bill McKibben’s *The Age of Missing Information* . . . or Lewis Mumford’s *The Pentagon of Power*: McKibben, B. (2006). *The age of missing information*. Random House; Mumford, L. (1970). *The pentagon of power: The myth of the machine* (Vol. 2). Harcourt Brace Jovanovich.

We now live in a “global village” (to borrow McLuhan’s phrasing): McLuhan, M. (1962). The Gutenberg galaxy: The making of typographic man. University of Toronto Press.

A single habitat that . . . Buckminster Fuller once aptly called “Spaceship Earth”: Fuller, R. B. (1969). Operating manual for spaceship Earth. Southern Illinois University Press.

Facebook refers cases to the board, whose decisions are binding but not enforced by law: Douek, E. (2020, May 11). “What kind of oversight board have you given us?” Retrieved from; See also Klonick, K. (2019). The Facebook Oversight Board: Creating an independent institution to adjudicate online free expression. Yale Law Journal, 129, 2418.

There are as many as 188 fact-checking entities in more than sixty countries: Woolley & Joseff. *Demand for deceit*.

Advocates of fact-checking also assume that everyone reasons the same way: Woolley & Joseff. *Demand for deceit*.

Fact-checking can actually reinforce the spread of false information: Pennycook, G., & Rand, D. (2020, March 24). The right way to fight fake news. Retrieved from; Pennycook, G., Bear, A., Collins, E. T., & Rand, D. G. (2020). The implied truth effect: Attaching warnings to a subset of fake news headlines increases perceived accuracy of headlines without warnings. Management Science [Forthcoming].

General warnings about the veracity of news can actually reduce confidence in all news sources: Clayton, K., Blair, S., Busam, J. A., Forstner, S., Glance, J., Green, G., … & Sandhu, M. (2019). Real solutions for fake news? Measuring the effectiveness of general warnings and fact-check tags in reducing belief in false stories on social media. Political Behavior, 1–23.

Media literacy is incomplete: See Boyd, D. (2017). “Did media literacy backfire?” Journal of Applied Youth Studies, 1(4), 83–89. Retrieved from; Bulger, M., & Davison, P. (2018). The promises, challenges, and futures of media literacy. Data & Society Research Institute. Retrieved from

Oops! It was a “bug,” said Guy Rosen: Associated Press. (2020, March 17). Facebook bug wrongly deleted authentic coronavirus news. Retrieved from

Human social media content moderators have extremely stressful jobs, given the volume of potentially offensive and harmful posts: Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media. Yale University Press; Kaye, D. A. (2019). Speech police: The global struggle to govern the internet. Columbia Global Reports; Jeong, S. (2015). The internet of garbage. Forbes Media.

There is no shortage of proposals to reform and regulate social media: Owen, T. (2019, November). The case for platform governance. Centre for International Governance Innovation.

Some advocate for giving users a legal right “not to be tracked”: United States of America. (2019). Do Not Track Act, S.1578, 116th Cong. Retrieved from

Calls for more scrutiny of the machine-based algorithms companies use to sort their users: United States of America. (2019). Algorithmic Accountability Act of 2019, H.R.2231, 116th Cong. Retrieved from; Raji, I. D., Gebru, T., Mitchell, M., Buolamwini, J., Lee, J., & Denton, E. (2020). Saving face: Investigating the ethical concerns of facial recognition auditing. Proceedings of the 2020 AAAI/ACM Conference on AI, Ethics, and Society (AIES ’20); Pasquale, F. (2015). The black box society. Harvard University Press; Ananny, M., & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989.

Proposals have been made to legislate greater transparency in the social media advertising space: Friedersdorf, C. (2019, November 1). Doubt anyone who’s confident that Facebook should ban political ads. Retrieved from

Many advocate for social media platforms to be forced (or otherwise encouraged) to incorporate human rights due diligence: Amnesty International. (2019). Surveillance giants: How the business model of Google and Facebook threatens human rights; Kaye. Speech police.

The development of “civic media” as a “social” or “public” (instead of commercial) “good”: Owen. *The case for platform governance*.

Everyone’s efforts are weaving something larger than their own separate struggles: Deudney, D., & Mendenhall, E. (2016). Green Earth: The emergence of planetary civilization. In S. Nicholson & S. Jinnah (Eds.), New Earth politics: Essays from the Anthropocene (43–72). MIT Press.

The social media equivalent of Rachel Carson’s *Silent Spring*, Barry Commoner’s *The Closing Circle*, and Paul Ehrlich’s *The Population Bomb*: Carson, R. (1962). *Silent spring*. Houghton Mifflin; Commoner, B. (1971). *The closing circle: Nature, man, and technology*. Alfred A. Knopf; Ehrlich, P. R. (1971). *The population bomb*. Ballantine.

Republican political theory enjoys a “ghostly afterlife”: Deudney, D. (2007). Bounding power: Republican security theory from the polis to the global village. Princeton University Press.

Republican polities tend to be rare and relatively fragile: Deudney, D. (2000). Geopolitics as theory: Historical security materialism. European Journal of International Relations, 6(1), 77–107.

“Republicanism is an institutionalized system of decentralized power constraint”: Deudney, D. (1995). The Philadelphian system: Sovereignty, arms control, and balance of power in the American states-union, circa 1787–1861. International Organization, 49(2), 191–228. Retrieved from

“Every man invested with power is apt to abuse it”: Montesquieu. The spirit of laws.

“You must first enable the government to control the governed; and next oblige it to control itself”: Madison, J. (1788). Federalist no. 51: The structure of the government must furnish the proper checks and balances between the different departments. The New-York Packet.

These devices are a form of “friction” introduced into political processes: Stein, J. G. (2002). The cult of efficiency. House of Anansi Press.

Some oversight bodies have simply been eliminated altogether: Brookings Institution. (2020, May 22). Tracking deregulation in the Trump era. Retrieved from; Pulido, L., Bruno, T., Faiver-Serna, C., & Galentine, C. (2019). Environmental deregulation, spectacular racism, and white nationalism in the Trump era. Annals of the American Association of Geographers, 109(2), 520-532.

What Levitsky and Ziblatt call “the soft guardrails of democracy”: Levitsky, S., & Ziblatt, D. (2018). How democracies die: What history reveals about our future. Crown.

Numerous proposals worldwide to employ cell location data to assist in the effort to combat the spread of COVID-19: Glanz, J., Carey, B., Holder, J., Watkins, D., Valentino-DeVries, J., Rojas, R., & Leatherby, L. (2020, April 2). Where America didn’t stay home even as the virus spread. Retrieved from; Landau, S. (2020, March 25). Location surveillance to counter COVID-19: Efficacy is what matters. Retrieved from; Anderson, R. (2020, May 12). Contact tracing in the real world. Retrieved from; Sapiezynski, P., Pruessing, J., & Sekara, V. (2020). The Fallibility of Contact-Tracing Apps. arXiv preprint arXiv:2005.11297.

Such apps will be wide open to malfeasance that could distort the utility of the data: Anderson. Contact tracing in the real world.

The safeguards around them must be exceptionally strong: Geist, M. (2020, March 24). How Canada should ensure cellphone tracking to counter the spread of coronavirus does not become the new normal. Retrieved from; The turn to apps to solve contact tracing challenges during the COVID pandemic is a good example of what Evgeny Morozov calls “technological solutionism.” See Morozov, E. (2013). To save everything, click here: The folly of technological solutionism. PublicAffairs.

“It is insufficient to say that a comprehensive system for control and use of targeted surveillance technologies is broken”: United Nations Office of the High Commissioner for Human Rights. (2019, May 28). Surveillance and human rights: Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression. Retrieved from; See also Penney, J., McKune, S., Gill, L., & Deibert, R. J. (2018). Advancing Human-Rights-by-Design in the Dual-Use Technology Industry. Journal of International Affairs, 71(2), 103-110.

States purchasing spyware are at liberty to abuse it: Anstis, S., Deibert, R. J., & Scott-Railton, J. (2019, July 19). A proposed response to the commercial surveillance emergency. Retrieved from

Government military, intelligence, and law enforcement agencies have . . . stockpiled knowledge of software bugs: Deibert, R. (2014, November 25). The cyber security syndrome. Retrieved from; Deibert, R. (2013). Divide and rule: Republican security theory as civil society cyber strategy. Georgetown Journal of International Affairs, 39–50. See also Deibert, R. J., & Rohozinski, R. (2010). Risking security: Policies and paradoxes of cyberspace security. International Political Sociology, 4(1), 15-32.

These could include mandated transparency and reporting, greater accountability to legislators, and independent oversight bodies: Parsons, C. (2019). “Once more, preventing the breach: The rationale, and implications of, adopting human security practices for vulnerabilities equities process policies” [Working paper]; Herpig, S., & Schwartz, A. (2019, January 4). The future of vulnerabilities equities processes around the world. Retrieved from; White House. (2017). Vulnerabilities equities policy and process for the United States government. White House report.

Abuses and built-in discrimination around the use of some of these technologies today: O’Neil, C. (2016). Weapons of math destruction. Crown Random House; Tanovich, D. (2006). The colour of justice. Irwin Law; Ontario Human Rights Commission. (2017, April). Under suspicion: Research and consultation report on racial profiling in Ontario. Retrieved from; Ontario Human Rights Commission. (2018, November). A collective impact: Interim report on the inquiry into racial profiling and racial discrimination of Black persons by the Toronto Police Service. Retrieved from; Howe, M., & Monaghan, J. (2018). Strategic incapacitation of Indigenous dissent: Crowd theories, risk management, and settler colonial policing. Canadian Journal of Sociology, 43(4), 325–348. Retrieved from; Facial recognition software trained predominantly on the faces of white and lighter-skinned people may be less capable of accurately identifying individuals with darker skin tones. See Molnar & Gill. Bots at the gate; Garvie, C., Bedoya, A., & Frankle, J. (2016). The perpetual line-up. Georgetown Law: Center on Privacy and Technology; Klare, B. F., Burge, M. J., Klontz, J. C., Bruegge, R. W. V., & Jain, A. K. (2012). Face recognition performance: Role of demographic information. IEEE Transactions on Information Forensics and Security, 7(6), 1789–1801; Bedoya, A. (2020). Privacy as Civil Right. New Mexico Law Review, 50(3). Available at SSRN

A recurring theme in science fiction but one that seems increasingly less far-fetched: Deudney, D. (2020). Dark skies: Space expansionism, planetary geopolitics, and the ends of humanity. Oxford University Press; Kerr, I. R., Calo, R., & Froomkin, M. (Eds.). (2016) Robot law. Edward Elgar; Harari, Y. N. (2016). Homo Deus: A brief history of tomorrow. Random House; Bostrom, N. (2003). Ethical issues in advanced artificial intelligence. Science fiction and philosophy: from time travel to superintelligence, 277-284.

A wholesale ban on the use of AI and facial recognition until proper accountability mechanisms are in place: Fight for the Future. (n.d.). Ban facial recognition. Retrieved June 16, 2020, from; Schneier, B. (2020, January 20). We’re banning facial recognition. We’re missing the point. Retrieved from

Restraints . . . can begin in one or several jurisdictions and then develop more broadly: Lu, D. (2020, January 23). It’s too late to ban face recognition — Here’s what we need instead. Retrieved from; Access Now. (2018, May 16). Toronto declaration: Protecting the rights to equality and non-discrimination in machine learning systems. Retrieved from; Université de Montréal. Montreal declaration for responsible AI. Retrieved from; Fairness, Accountability, and Transparency in Machine Learning. (n.d.). Principles for accountable algorithms and a social impact statement for algorithms. Retrieved June 16, 2020, from; See also Fidler, M. (2020). Local police surveillance and the administrative Fourth Amendment. Santa Clara Computer and High Technology Law Journal. Retrieved from

Erik Learned-Miller . . . feels that the negative aspects of these technologies are growing and simply too dangerous to leave unregulated: Gershgorn, D. (2019, July 26). An A.I. pioneer wants an FDA for facial recognition. Retrieved from

Since 9/11 . . . the most secretive and martial wings of the state have ballooned in size: Deibert. The cyber security syndrome; It is also very important to note that the mere recognition by citizens that their governments are undertaking mass surveillance can have a chilling effect on free expression. See Penney, J. W. (2017). Internet surveillance, regulation, and chilling effects online: A comparative case study. Internet Policy Review, 6(2), 22.

The platforms’ legal appropriation of users’ data: Cohen. *Between truth and power*. See also the classic Shiva, V. (2001). *Protect or plunder?: Understanding intellectual property rights*. Zed Books.

“There is a theoretical point — call it the Skinnerlarity”: Wu, T. (2020, April 9). Bigger Brother. Retrieved from

The EU’s General Data Protection Regulation . . . and California’s Consumer Privacy Act are by far the most well known: On GDPR, see Bennett, C. J. (2018). The European General Data Protection Regulation: An instrument for the globalization of privacy standards? Information Polity, 23(2), 239–246; Keller, D. (2018). The right tools: Europe’s intermediary liability laws and the EU 2016 General Data Protection Regulation. Berkeley Technology Law Journal, 33, 287; Hartzog, W., & Richards, N. M. (2020). Privacy’s constitutional moment and the limits of data protection. Boston College Law Review, 61(5), 1687; See also Farivar, C. (2018). Habeas data: Privacy vs. the rise of surveillance tech. Melville House. For a detailed analysis of the way Facebook has adapted to German regulations, see Wagner, B., Rozgonyi, K., Sekwenz, M. T., Cobbe, J., & Singh, J. (2020, January). Regulating transparency? Facebook, Twitter and the German Network Enforcement Act. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 261–271).

A true “codified antisurveillance regime”: Wu. Bigger Brother; Balkin, J. M., & Zittrain, J. (2016, October 3). A grand bargain to make tech companies trustworthy. Retrieved from; Hartzog, W. (2018). Privacy’s blueprint: The battle to control the design of new technologies. Harvard University Press.

A “protective counter movement”: Cohen. Between truth and power; Polanyi, K. (2001). The great transformation: The political and economic origins of our time. Boston, MA: Beacon Press. See also Cox, R. W. (1983). Gramsci, hegemony and international relations: an essay in method. Millennium, 12(2), 162-175.

Laws pertaining specifically to the freewheeling data broker industry: Pasquale, F. (2014, October 17). The dark market for personal data. New York Times; Schneier. Data and Goliath.

Hasty decisions are usually bad: Rosen, J. (2018, October). America is living James Madison’s nightmare. Retrieved from

A technical tweak a group of Brazilian researchers explicitly described as introducing friction into the application: Romm, T. (2020, March 2). Fake cures and other coronavirus conspiracy theories are flooding WhatsApp, leaving governments and users with a ‘sense of panic.’ Retrieved from; de Freitas Melo, P., Vieira, C. C., Garimella, K., de Melo, P. O. V., & Benevenuto, F. (2019, December). Can WhatsApp counter misinformation by limiting message forwarding? Retrieved from; McGeveran, W. (2013). The law of friction. In University of Chicago Legal Forum (Vol. 2013, No. 1, p. 3); Cohen, J. E. (2012). Configuring the networked self: Law, code, and the play of everyday practice. Yale University Press; Ohm, P., & Frankle, J. Desirable inefficiency. Florida Law Review, 70(4), 777. Retrieved from; See the excellent overview by Goodman, E. (2020). “Digital information fidelity and friction.” Knight First Amendment Institute at Columbia University. Retrieved from

They even issued a rare joint statement: Statt, N. (2020, March 16). Major tech platforms say they’re ‘jointly combating fraud and misinformation’ about COVID-19. Retrieved from

“Platforms should be forced to earn the kudos they are getting”: Douek, E. (2020, March 25). COVID-19 and social media content moderation. Retrieved from; See also Douek, E. (2020). The free speech blind spot: Foreign election interference on social media. In Combating election interference: When foreign powers target democracies (Oxford University Press, forthcoming 2020).

Mandatory or poorly constructed measures could be perverted as an instrument of authoritarian control: Lim, G., & Donovan, J. (2020, April 3). Republicans want Twitter to ban Chinese Communist Party accounts. That’s a dangerous idea. Retrieved from; Lim, G. (2020). Securitize/counter-securitize: The life and death of Malaysia’s anti-fake news act. Retrieved from; For a lengthy and very useful discussion, see Douek, E. (2020). The rise of content cartels. Knight First Amendment Institute at Columbia University. Retrieved from; See also Klonick, K. (2017). The new governors: The people, rules, and processes governing online speech. Harvard Law Review, 131, 1598.

Some commentators have called for additional exceptions to Section 230’s immunity clauses: Sylvain, O. (2018, April 1). Discriminatory designs on user data. Retrieved from; Citron, D. K., & Penney, J. (2019, January 2). When law frees us to speak. Fordham Law Review, 87(6). Retrieved from; Citron, D. K., & Wittes, B. (2018). The problem isn’t just Backpage: Revising Section 230 immunity. Georgetown Law Technology Review, 453. Retrieved from; The most accessible and comprehensive account of Section 230 of the CDA is Kosseff, J. (2019). *The twenty-six words that created the internet*. Cornell University Press.

The key will be to make sure that social media platforms manage content in ways that are transparent: Kaye. Speech police.

“Shaped by geometry rather than natural morphology”: Deudney. The Philadelphian system.

Brandeis “believed that great economic power results in immense political power”: Wu, T. (2018, November 10). Be afraid of economic ‘bigness.’ Be very afraid. Retrieved from

Amazon . . . “the titan of twenty-first century commerce”: Khan, L. M. (2017). Amazon’s antitrust paradox. Yale Law Journal, 126(3), 710; Moore, M., & Tambini, D. (Eds.). (2018). Digital dominance: the power of Google, Amazon, Facebook, and Apple. Oxford University Press.

Bezos was named . . . the “richest man in modern history”: Au-Yeung, A. (2018, October 3). How Jeff Bezos became the richest person in America and the world. Retrieved from

Amazon, Facebook, and Google spent nearly half a billion dollars on lobbying efforts in Washington: Romm, T. (2020, January 22). Amazon, Facebook spent record sums on lobbying in 2019 as tech industry ramped up Washington presence. Retrieved from

Justice Brandeis lamented the “curse of bigness”: United States v. Columbia Steel Co., 74 F. Supp. 671 (D. Del. 1947).

“Antitrust law . . . was meant fundamentally as a kind of constitutional safeguard”: Wu. Be afraid; See also Hughes, C. (2019, May 9). It’s time to break up Facebook. Retrieved from; Srinivasan, D. (2019). The antitrust case against Facebook: A monopolist’s journey towards pervasive surveillance in spite of consumers’ preference for privacy. Berkeley Business Law Journal, 16(1), 39–101. See also the excellent collection of essays curated by the Knight First Amendment Institute at Columbia University on the topic of antitrust: “The Tech Giants, Monopoly Power, and Public Discourse,” retrieved from

That the large tech platforms be designated as public utilities: Economist. (2020, April 4). Winners from the pandemic — Big tech’s covid-19 opportunity. Retrieved from; Compare with Crawford, S. (2018). Calling Facebook a utility would only make things worse. Wired. Retrieved from. See also Feld, H. (2019). The Case for the Digital Platform Act: Market Structure and Regulation of Digital Platforms. Roosevelt Institute. Retrieved from

“These firms have done things that, say, a litany of mini-Googles could not have done”: Scott, M. (2020, March 25). Coronavirus crisis shows Big Tech for what it is — a 21st century public utility. Retrieved from; Economist. (2020, April 4). Winners from the pandemic — Big tech’s covid-19 opportunity. Retrieved from; Ghosh, D. (2019). Don’t Break Up Facebook — Treat It Like a Utility. Harvard Business Review, 30. Retrieved from

“A well-regulated Militia, being necessary to the security of a free State”: United States of America. U.S. Const. amend. II.

Another recessed power might be “data portability” and “interoperability” requirements: Gasser, U. (2015, July 6). Interoperability in the digital ecosystem; Doctorow, C. (2019, July 11). Interoperability: Fix the internet, not the tech companies. Retrieved from. On “right to review” (or, alternatively, “right to explanation”), see Kaminski, M. E. (2019). The right to explanation, explained. Berkeley Tech. LJ, 34, 189. For a study on how app vendors respond to data access requests, see Kröger, J. L., Lindemann, J., & Herrmann, D. (2020, August). How do App Vendors Respond to Subject Access Requests? A Longitudinal Privacy Study on iOS and Android Apps. In Proceedings of the 15th International Conference on Availability, Reliability and Security (pp. 1-10). For a very detailed overview of interoperability as a tool of competition among tech platforms, see Brown, I. (2020). Interoperability as a tool for competition regulation. Open Forum Academy. Retrieved from

The COVID emergency may have helped kick the right-to-repair movement into high gear: Motherboard Staff. (2020, March 20). The world after this. Retrieved from; On “right to repair” and U.S. legislation, see Moore, D. (2018). You Gotta Fight for Your Right to Repair: The Digital Millennium Copyright Act’s Effect on Right-to-Repair Legislation. Tex. A&M L. Rev., 6, 509; Fowler, G. A. (2015, September 8). We Need the Right to Repair Our Gadgets. Wall Street Journal. Retrieved from https://www.wsj.com/articles/we-need-the-right-to-repair-our-gadgets-1441737868

Technicians scrambled to circumvent the software and other controls: Doctorow, C. (2020, March 19). Right to repair in times of pandemic. Retrieved from

Active forms of resistance . . . may be appropriate: For an argument for the value of “conscience-driven” lawbreaking, see Schneier. Data and Goliath; See also Deibert, R. J. (2003). Black code: Censorship, surveillance, and the militarisation of cyberspace. Millennium, 32(3), 501–530; Menn, J. (2019). Cult of the dead cow: How the original hacking supergroup might just save the world. PublicAffairs; Scheuerman, W. E. (2014). Whistleblowing as civil disobedience: The case of Edward Snowden. Philosophy & Social Criticism, 40(7), 609-628.

What most observers saw as a “war on whistleblowing”: Risen, J. (2014). Pay any price: Greed, power, and endless war. Houghton Mifflin Harcourt. For whistleblowing in general, see Bowers, J., Fodder, M., Lewis, J., & Mitchell, J. (2012). Whistleblowing: Law and practice. OUP Oxford; Delmas, C. (2015). The ethics of government whistleblowing. Social Theory and Practice, 77-105.

Well-intentioned moderation could very easily slide into heavy-handed suppression: For more on how strong content moderation requirements for social media platforms could backfire, see Donahoe, E. (2017, August 21). Protecting democracy from online disinformation requires better algorithms, not censorship. Retrieved from; MacKinnon, R. (2012). Consent of the networked: The worldwide struggle for internet freedom. Basic Books; See also York, J. C. (2010). Policing content in the quasi-public sphere, in Deibert, R., Palfrey, J., Rohozinski, R., & Zittrain, J. (2010). Access controlled: The shaping of power, rights, and rule in cyberspace (p. 634). MIT Press; DeNardis, L., & Hackl, A. M. (2015). Internet governance by social media platforms. Telecommunications Policy, 39(9), 761-770.

According to Deudney: Deudney, The Philadelphian system. See also Woodard, C. (2020). Union: The Struggle to Forge the Story of United States Nationhood. Viking.

To turn this around would require a major rethink: On the importance of public education and civics as a way to create a healthy online public sphere, see Greenspon, E., & Owen, T. (2018). Democracy divided: Countering disinformation and hate in the digital public sphere. Public Policy Forum. Retrieved from; Lucas, E., & Pomerantsev, P. (2016). Winning the information war: Techniques and counter-strategies to Russian propaganda in Central and Eastern Europe. Center for European Policy Analysis; Bjola, C., & Papadakis, K. (2020). Digital propaganda, counterpublics and the disruption of the public sphere: The Finnish approach to building digital resilience. Cambridge Review of International Affairs, 1–29; Cederberg, G. (2018). Catching Swedish phish: How Sweden is protecting its 2018 elections. Belfer Center for Science and International Affairs; Bulger & Davison. The promises, challenges, and futures of media literacy.

Environmentalism’s ideals — getting “back to nature”: Davis, W. (2009). The Wayfinders. House of Anansi Press; Deudney and Mendenhall, Green Earth; See also Tong, Z. (2019). The Reality Bubble: Blind Spots, Hidden Truths and the Dangerous Illusions that Shape Our World. Canongate Books. For a discussion of how “data exhaust” might be reoriented into a planetary project to build new accounting practices to achieve sustainability, see Edwards, P. N. (2017). Knowledge infrastructures for the Anthropocene. The Anthropocene Review, 4(1), 34-43.

“It is this human right . . . with which the University has a duty above all to be concerned”: Office of the Governing Council. (n.d.). University of Toronto Statement of Institutional Purpose. Retrieved June 1, 2020, from

“Researchers who test online platforms for discriminatory and rights-violating data practices”: American Civil Liberties Union. (2020, March 28). Federal court rules ‘big data’ discrimination studies do not violate federal anti-hacking law. Retrieved from; For an examination of different types of opacity around machine learning and algorithms, see Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 2053951715622512.

An alternative “human-centric” approach to cybersecurity: Deibert, R. (2018). Toward a human-centric approach to cybersecurity. Ethics & International Affairs, 32(4), 411–424; See also Maschmeyer, L., Deibert, R. J., & Lindsay, J. R. (2020). A tale of two cybers: How threat reporting by cybersecurity firms systematically underrepresents threats to civil society. Journal of Information Technology & Politics, 1-20.

I witnessed first-hand well-intentioned and hard-working government bureaucrats: Freedom Online Coalition. (2016, October). Freedom Online coalition statement on a human rights based approach to cybersecurity policy making. Retrieved from; See also DeNardis, L. (2014). The global war for internet governance. Yale University Press; Nye, J. S. (2014). The regime complex for managing global cyber activities (Vol. 1). Belfer Center for Science and International Affairs, John F. Kennedy School of Government, Harvard University.