How to poison the data Big Tech uses to monitor you
In a new paper to be presented at the Association for Computing Machinery’s Conference on Fairness, Accountability, and Transparency next week, researchers including PhD students Nicholas Vincent and Hanlin Li propose three ways the public can exploit this to its advantage:
- Data strikes, inspired by the idea of labor strikes, which involve withholding or deleting your data so that a tech company cannot use it – leaving a platform or installing privacy tools, for example.
- Data poisoning, which involves the contribution of meaningless or harmful data. AdNauseam, for example, is a browser extension that clicks on every ad served to you, thus muddling Google’s ad targeting algorithms.
- Conscious data contribution, which involves giving meaningful data to a competitor of the platform you want to protest, for example by uploading your Facebook photos to Tumblr instead.
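The data-poisoning tactic can be illustrated with a toy simulation. The sketch below (an assumption for illustration, not code from AdNauseam or the paper) models an ad-targeting profile as simple click counts per interest category, then floods it with decoy clicks spread uniformly at random, the way an auto-clicking extension would. The genuine signal – the user’s true interest – becomes a much smaller share of the profile:

```python
import random
from collections import Counter

def build_profile(real_clicks, n_decoys, categories, rng):
    """Simulate an ad-targeting profile as click counts per category.
    Hypothetical decoy clicks (AdNauseam-style) are spread uniformly
    at random across all categories, diluting the genuine signal."""
    profile = Counter(real_clicks)
    for _ in range(n_decoys):
        profile[rng.choice(categories)] += 1
    return profile

def signal_share(profile, true_interest):
    """Fraction of all recorded clicks that reflect the true interest."""
    return profile[true_interest] / sum(profile.values())

rng = random.Random(0)  # fixed seed so the simulation is repeatable
categories = ["sports", "travel", "finance", "cooking", "gaming"]
real_clicks = ["travel"] * 50  # the user's genuine interest

clean = build_profile(real_clicks, 0, categories, rng)
poisoned = build_profile(real_clicks, 500, categories, rng)

print(signal_share(clean, "travel"))     # 1.0: profile is pure signal
print(signal_share(poisoned, "travel"))  # far lower: signal is drowned out
```

One person doing this adds a little noise to one profile; the point the researchers make is that the tactic only gains leverage when many people poison their data at once.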
People already use many of these tactics to protect their own privacy. If you’ve ever used an ad blocker or another browser extension that modifies your search results to exclude certain websites, you’ve engaged in a data strike and regained some agency over how your data is used. But as Hill found out, sporadic individual actions like these don’t do much to get tech giants to change their behavior.
But what if millions of people coordinated to poison a tech giant’s data? That might just give them the leverage to press their demands.
There may already have been a few examples of this. In January, millions of users deleted their WhatsApp accounts and switched to competitors like Signal and Telegram after Facebook announced it would start sharing WhatsApp data with the rest of the business. The exodus prompted Facebook to delay its policy change.
Just this week, Google also announced that it would stop tracking individuals across the web and targeting ads at them. While it’s unclear whether this is a real change or just a rebranding, Vincent says, it’s possible that increased use of tools like AdNauseam contributed to the decision by degrading the effectiveness of the company’s algorithms. (That’s ultimately hard to verify, of course. “The only one who really knows how much a data leverage movement has impacted a system is the tech company,” he says.)
Vincent and Li believe these campaigns can complement strategies such as policy advocacy and worker organizing in the movement to resist Big Tech.
“It’s exciting to see this kind of work,” says Ali Alkhatib, a researcher at the Center for Applied Data Ethics at the University of San Francisco, who was not involved in the research. “It was really interesting to see them thinking about the collective or holistic view: we can mess with the well and make demands with that threat, because it is our data, and it all goes into the well together.”