I managed to catch a screening of the new Shalini Kantayya documentary, Coded Bias, through EDRi. It tells the story of Joy Buolamwini’s discovery of systematic discrepancies in the performance of facial-recognition algorithms across races and genders. The tone was lively and accessible, with a good tempo, and the cast of characters did a good job of showcasing a cross-section of female voices in the tech policy space. It was particularly good to see several authors who appear on my syllabus, such as Cathy O’Neil, Zeynep Tufekci, and Virginia Eubanks.
Schools get into the phone-hacking business
A disturbing piece of reporting from Gizmodo (via /.) on the adoption by many US school districts of digital forensic tools to retrieve content from their students’ mobile devices. Of course, such technology was originally developed as a counter-terrorism tool, and then trickled down to regular domestic law enforcement. As we have remarked previously, schools have recently been on the bleeding edge of the social application of intrusive technology, with all the risks and conflicts it engenders; in this instance, however, we see a particularly egregious confluence of technological blowback (from war in the periphery to everyday life in the metropole) and criminal-justice takeover of mass education (of school-to-prison-pipeline fame).
Hiding
Given the recent salience of news on surveillance and surveillance capitalism, it is to be expected that interest in material, technical countermeasures would be rising. Indeed, a cottage industry of surveillance-avoidance gear and gadgetry has sprung up. Reviews of these apparatuses tend to agree that the results they achieve are not great. For one thing, they typically target one surveillance vector at a time, addressing a narrowly tailored threat model rather than offering comprehensive protection. Moreover, they can only really be fine-tuned properly if their makers have access to the source code of the algorithm they are trying to beat, or can at least test its responses under controlled conditions before facing it in the wild. But of course, uncertainty about the outcomes of surveillance, or indeed about whether it is taking place to begin with, is at the heart of the matter.
The creators of these countermeasures, whatever their personal intellectual commitment to privacy and anonymity, hardly follow their own advice: few eschew the visibility that the large internet platforms afford. Whether these systems try to beat machine-learning algorithms through data poisoning or adversarial attacks, they tend to be more political statement and proof of concept than workable solution, especially over the long term. In general, even when effective, these countermeasures are extremely cumbersome and self-penalizing to use: they can be useful in limited situations for operating in ‘stealth mode’, but cannot be lived in permanently.
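None of these tools publish enough detail to reproduce their methods here, but the general idea behind an adversarial (evasion) perturbation can be sketched on a toy linear classifier; everything below — the stand-in "detector", its weights, and the perturbation budget — is an assumption for illustration, not a description of any real anti-surveillance product or face-recognition model.

```python
# Minimal sketch of an adversarial (evasion) perturbation against a toy
# linear classifier -- an illustration of the general technique, not of
# any specific anti-surveillance tool.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "detector": logistic regression with fixed, made-up weights.
w = rng.normal(size=64)   # weight vector of the stand-in model
b = 0.1

def detector_score(x):
    """Probability the stand-in detector assigns to the 'match' class."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# A hypothetical input the wearer wants pushed below the match threshold.
x = rng.normal(size=64)

# FGSM-style step: move each feature against the sign of the score gradient.
# For a logistic model the gradient w.r.t. x is proportional to w, so the
# perturbation is simply -eps * sign(w).
eps = 0.25                # perturbation budget (illustrative value)
x_adv = x - eps * np.sign(w)

print("score before:", detector_score(x))
print("score after: ", detector_score(x_adv))
```

The point the sketch makes is the one in the paragraph above: the perturbation works precisely because the attacker knows the model’s weights; against an opaque, periodically retrained system in the wild, that advantage evaporates.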
If this is the technological state of play, are we destined for a future of much greater personal transparency, or is the notion of hiding undergoing an evolution? Certainly, the momentum behind the diffusion of surveillance techniques such as facial recognition appears massive worldwide. Furthermore, it is no longer merely a question of centralized state agencies: the technology is now mature enough for individual consumers to conduct private micro-surveillance. This sea change is certainly prompting shifts in acceptable social behavior. But as to the wider problem of obscurity in our social lives, the strategic response may well lie in a mixture of compartmentalization and hiding in plain sight. And of course, systems of any kind are easier to beat when one can target the human agent at the other end.
Data security and surveillance legitimacy
To no-one’s surprise, the Department of Homeland Security and Customs and Border Protection have fallen victim to successful hacks of the biometric data they mass-collect at the border. The usual neoliberal dance of private subcontracting of public functions further exacerbated the problem. According to the DHS Office of the Inspector General,
[t]his incident may damage the public’s trust in the Government’s ability to safeguard biometric data and may result in travelers’ reluctance to permit DHS to capture and use their biometrics at U.S. ports of entry.
No kidding. Considering the oft-documented invasiveness of data harvesting by the immigration-control complex, and its serious real-world repercussions for policy and for ordinary people’s lives, data security should be front and center in public policy debates. The trade-off between the expected value to be gained from surveillance and the risk of unauthorized access to the accumulated information (which also implies the potential for corruption of the database) must be considered explicitly: as it is, these leaks and hacks are externalities the public is obliged to absorb, because the agencies have scant incentive to secure their data troves properly.
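To make the structure of that trade-off concrete, here is a back-of-the-envelope sketch. Every figure in it — the program’s benefit, the breach probability, the harm, and the share of harm the agency itself bears — is a hypothetical assumption chosen for illustration, not an estimate of any actual program.

```python
# Back-of-the-envelope sketch of the surveillance/data-security trade-off.
# All numbers are hypothetical; the point is the structure of the calculation.
benefit_per_year = 50_000_000       # assumed annual value derived from the biometric program ($)
breach_probability = 0.05           # assumed annual probability of a significant breach
harm_per_breach = 2_000_000_000     # assumed social cost of a breach, borne mostly by the public ($)
agency_share_of_harm = 0.02         # assumed fraction of that harm the agency itself internalizes

expected_social_cost = breach_probability * harm_per_breach
expected_agency_cost = expected_social_cost * agency_share_of_harm

print("expected annual social cost of breaches:", expected_social_cost)
print("expected annual cost as seen by the agency:", expected_agency_cost)
print("net social value of the program:", benefit_per_year - expected_social_cost)
```

Under these (made-up) assumptions the program destroys value for society as a whole while still looking cheap from inside the agency, which is exactly the externality problem described above.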
Digital contact tracing: from security theater to surveillance enabler
Back in the spring, digital contact tracing was heralded as the high-tech path out of the pandemic. With the benefit of six months of hindsight, the limitations of the approach have become clear [see Schneier for a concise summing-up of its shortcomings].
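For context, the dominant technical design — the decentralized, Bluetooth-based “exposure notification” model along the lines of DP-3T and the Google/Apple framework — can be sketched roughly as follows. This is a simplified illustration of the general structure only, not the actual cryptographic specification; the key derivation, key sizes, and interval counts are assumptions.

```python
# Simplified sketch of decentralized ("exposure notification") contact tracing.
# Illustrates the broad structure, not the real Google/Apple or DP-3T crypto.
import hashlib
import secrets

def rolling_ids(daily_key: bytes, intervals: int = 144):
    """Derive the rotating identifiers a device would broadcast during one day."""
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Each phone keeps its own secret daily key and broadcasts derived IDs over BLE.
alice_key = secrets.token_bytes(16)
bob_observed = set()

# Bob's phone records the identifiers it hears nearby (here, a few of Alice's).
for rid in rolling_ids(alice_key)[40:45]:
    bob_observed.add(rid)

# If Alice later tests positive, only her daily keys are published; every other
# phone re-derives her identifiers locally and checks for overlap on-device.
published_keys = [alice_key]
exposed = any(rid in bob_observed
              for key in published_keys
              for rid in rolling_ids(key))
print("Bob potentially exposed:", exposed)
```

Even in this privacy-preserving form, the scheme depends on dense adoption, reliable proximity estimation, and honest self-reporting — the very points on which the real-world deployments have fallen short.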
While digital contact tracing’s notional benefits seem to belong squarely in the realm of security theater (i.e., showing the public that Something Is Being Done), its potential for justifying intrusive surveillance remains intact. Two recent news items illustrate this dynamic. A small liberal arts college in Michigan is forcing its students to download a contact-tracing app (and apparently a vulnerability-riddled one, at that) as a condition of being allowed on campus. Meanwhile, delegates to the Republican National Convention are reportedly to wear “smart badges” (originally developed for tracking pallets) that record their movements through the convention venue in Charlotte. While higher education has long been a laboratory of choice for experimentation with surveillance technology, I would have expected the libertarian wing of the GOP to kick up more of a fuss over this kind of intrusion.