A disturbing piece of reporting from Gizmodo (via /.) on the adoption by many US school districts of digital forensic tools to retrieve content from their students’ mobile devices. Of course, such technology was originally developed as a counter-terrorism tool, and then trickled down to regular domestic law enforcement. As we have remarked previously, schools have recently been on the bleeding edge of the social application of intrusive technology, with all the risks and conflicts it engenders; in this instance, however, we see a particularly egregious confluence of technological blowback (from war in the periphery to everyday life in the metropole) and criminal-justice takeover of mass education (of school-to-prison-pipeline fame).
Given the recent salience of news on surveillance and surveillance capitalism, it is to be expected that there would be rising interest in material, technical countermeasures. Indeed, a cottage industry of surveillance-avoidance gear and gadgetry has sprung up. Reviews of these apparatuses tend to agree that the results they achieve are underwhelming. For one thing, they are typically targeted at a single surveillance vector at a time, each presupposing a specifically tailored attack model rather than offering a comprehensive solution. Moreover, they can really only be fine-tuned properly if their makers have access to the source code of the algorithm they are trying to beat, or can at least test its response in controlled conditions before facing it in the wild. But of course, uncertainty about the outcomes of surveillance, or indeed about whether it is taking place to begin with, is the heart of the matter.
The creators of these countermeasures themselves, whatever their personal intellectual commitment to privacy and anonymity, hardly follow their own advice in eschewing the visibility the large internet platforms afford. Whether these systems try to beat machine-learning algorithms through data poisoning or adversarial attacks, they tend to be more of a political statement and proof of concept than a workable solution, especially in the long term. In general, even when effective, using these countermeasures is seen as extremely cumbersome and self-penalizing: they can be useful in limited situations for operating in ‘stealth mode’, but cannot be lived in permanently.
If this is the technological state of play, are we destined for a future of much greater personal transparency, or is the notion of hiding undergoing an evolution? Certainly, the momentum behind the diffusion of surveillance techniques such as facial recognition appears massive worldwide. Furthermore, it is no longer merely a question of centralized state agencies: the technology is now mature enough for individual consumers to enact private micro-surveillance. This sea change is certainly prompting shifts in acceptable social behavior. But as to the wider problem of obscurity in our social lives, the strategic response may well lie in a mixture of compartmentalization and hiding in plain sight. And of course systems of any kind are easier to beat when one can target the human agent at the other end.
Technology policy is often characterized as an area in which governments play catch-up, both cognitively and resource-wise, with the private sector. In these two recent cases, otherwise quite far apart both spatially and thematically, law enforcement can be seen to flip this script by attempting to pin responsibility for social externalities on those it can reliably target: the victims and the small fry. Whether it is criminalizing those who pay to be rid of ransomware or rounding up the café owners who failed to participate in the State's mass surveillance initiatives, the authorities signal the seriousness of their intentions with regard to combating social ills by targeting bystanders rather than the actual perpetrators. Politically, this is a myopic strategy, and I would not be surprised if it generated significant pushback.
To no-one’s surprise, the Department of Homeland Security and Customs and Border Protection have become victims of successful hacks of the biometric data they mass-collect at the border. The usual neoliberal dance of private subcontracting of public functions further exacerbated the problem. According to the DHS Office of the Inspector General,
[t]his incident may damage the public’s trust in the Government’s ability to safeguard biometric data and may result in travelers’ reluctance to permit DHS to capture and use their biometrics at U.S. ports of entry.
No kidding. Considering the oft-documented invasiveness of data harvesting practices by the immigration-control complex and the serious real-world repercussions in terms of policies and ordinary people’s lives, the problem of data security should be front-and-center in public policy debates. The trade-off between the expected value to be gained from surveillance and the risk of unauthorized access to the accumulated information (which also implies the potential for the corruption of the database) must be considered explicitly: as it is, these leaks and hacks are externalities the public is obliged to absorb because the agencies have scant incentive to monitor their data troves properly.
Politico.eu recently ran an interview with Ciaran Martin, the outgoing chief of the UK’s National Cyber Security Centre. In it, Martin raises the alarm against Chinese attempts at massive data harvesting in the West (specifically in regard to the development of AI). This issue naturally dovetails with the US debate on the banning of TikTok. Herein lies the problem. Both national security agencies and major social media companies have endeavored to normalize perceptions of industrial data collection and surveillance over the past decade or two: that public opinion might be desensitized to the threat posed by foreign actors with access to similar data troves is therefore not surprising. The real challenge in repurposing a Cold War mentality for competition with China in the cyber domain today, in other words, is not so much a lag in Western (especially European) ICT innovation (Martin is himself slipping into a pantouflage position with a tech venture capital firm): it is a lack of urgency, of political will in the society at large, an apathy bred in part by acquiescence to surveillance capitalism.