In conversation with... David Doat

How do imaginaries shape our relationships to technologies?

as applied to digital tracing

David Doat, Senior Lecturer in Philosophy and holder of the Ethics, Technology and Transhumanism Chair at Université Catholique de Lille, explores the impact of social imaginaries on a technology’s acceptance, and the conditions under which imaginaries can serve as a political means of mobilization in the current crisis. He notes the particular interest of one of this pandemic’s key imaginaries, the imaginary of care, which challenges not only our traditional digital ethics but also our modes of technology governance.

From the outset of the COVID-19 crisis, digitally enabled contact tracing tools have stirred considerable social debate around the technical limitations on the effectiveness of digital tracing and the concerns it raises for the protection of individual rights. “These fears are fuelled by one of the most archetypal techno-social imaginaries in our collective experience of the crisis — the imaginary of the surveillance society and ‘Big Brother’, which runs through many works of science fiction, such as George Orwell’s famous 1984,” notes philosophy senior lecturer David Doat. “That imaginary is also conjured very clearly in people’s minds by the control systems used by totalitarian regimes or the systematic registration of populations or minorities in certain countries. This can lead to our seeing any tracing technology as a potential threat to the fundamental values our ethics and our digital legislation are based on,” comments Doat.

And those fears are not unfounded. There are many real instances where citizens have not been informed of the background use that some companies make of their data, such as sharing personal addresses, phone numbers or email addresses. “But now, across Europe, the General Data Protection Regulation (GDPR) requires companies to be transparent about how their customers’ data is used,” stresses Doat.

That being said, distrust persists, in particular with regard to back-office applications that routinely use our geolocation data to advertise goods and services. “In response to the health crisis, new tracing technologies have been brought to market,” says Doat, “but the imaginary of the surveillance society may be creating systemic distrust around these kinds of applications, even though we could most certainly come up with a compromise solution that would help us fight the spread of the coronavirus.”

Imaginaries must be taken seriously in any attempt to understand societal issues, because while we often oppose the imaginary and the real, we quickly reach the limits of that kind of binary classification. In fact, what we think of as real has invariably already been shaped by the imaginary.

This is certainly so in the world of science: to form any sense of certain dimensions of reality, even our most sophisticated capabilities have to fall back on theoretical models, and on abstract ideas such as probability, statistical curves or mathematical quantities.

And the same holds true for society, institutions and politics. In The Construction of Social Reality (1995), John Searle, a prominent philosopher of language and the mind, pondered, “How can there be an objective world of money, ..., elections, football games, ... in a world that consists ... of physical particles in fields of force, and in which some of these particles are organized into systems that are conscious biological beasts, ....?” Searle maintains that the answer lies in our cognitive capacities to share collective intentions and grant symbolic existence to objects that we recognize as giving structure to our social existence.

What does this line of thought have to do with our relationship to digital technologies in the pandemic crisis? Firstly, it underscores that all technologies are social objects whose invention and use are shaped by imaginaries. “But the imaginaries that motivate those who design technologies are not always in step with the imaginaries of the people they’re intended for,” points out Doat. “Disconnects can occur. I’m thinking of the 5G advertising campaign launched during the first lockdown by Belgian telecom operator Proximus that made 5G look like the almost instant answer to both consumer needs and increased productivity. Far from offering a desirable imaginary, the campaign — picturing a future society where all things are automated, anticipated, measured and algorithmized in a completely urbanized world with no hint of the environmental cost of such a future — looked more like a reflection of the same old dystopian, pseudo-futuristic imaginary… based on the visions of bygone times. We feel this sense of time warp because it’s only in recent years — months, maybe — that we have turned a corner into an era where fundamental generational aspirations are undergoing change, and where the social, environmental and health impacts of new technologies increasingly need to be taken into account in imagining desirable futures.”

For ethics specialists, the hits and misses of that 5G campaign launch in Belgium are an excellent illustration of the issues surrounding the social acceptability of technologies. “Classic technocratic thinking tends to view social imaginaries as fantasy. That perception is flawed... because when doubts emerge around a technology, it can quickly become its own worst enemy. To avoid the lack of clarity around controls that leads a segment of the public to identify those technologies as totalitarian, it’s absolutely essential to define both limits and a framework for them that safeguard democracy,” warns Doat.

Government control procedures for these applications already exist around the world. In response to the pandemic, a number of democratic countries implemented decentralized protocols that maintain anonymity and limit the use of data collected. “Belgium’s Coronalert tool, for instance, stores information on smartphones and only shares anonymized data via Bluetooth, with no location tracking,” Doat explains. “Other measures to secure and safeguard personal data and user privacy have been stringently put in place and the app has had to provide proof of compliance with them. So it’s clear there are ways to make sure that companies are actually collecting information for the good of the community, with no hidden agenda.”
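The decentralized approach Doat describes can be sketched in a few lines of code. This is a hypothetical toy model, loosely inspired by protocols such as DP-3T on which apps like Coronalert are built (real implementations derive rotating identifiers from cryptographic daily keys; here plain random tokens stand in for them): each phone broadcasts short-lived anonymous identifiers over Bluetooth, stores the ones it hears locally, and on a positive test a user may publish only their own identifiers, against which every phone checks its local log.

```python
import secrets

class Phone:
    """Toy model of a decentralized exposure-notification client.

    Hypothetical sketch, not a real protocol: random tokens replace
    the cryptographically derived rotating identifiers used in practice.
    """

    def __init__(self):
        self.my_ids = []        # ephemeral IDs this phone has broadcast
        self.heard_ids = set()  # IDs received over Bluetooth, kept locally

    def broadcast_id(self):
        # A fresh random token reveals nothing about identity or location.
        eph = secrets.token_hex(8)
        self.my_ids.append(eph)
        return eph

    def receive(self, eph_id):
        # Contact records never leave the phone.
        self.heard_ids.add(eph_id)

    def report_positive(self):
        # On a positive test, the user consents to publish only their OWN IDs.
        return list(self.my_ids)

    def check_exposure(self, published_ids):
        # Matching happens locally; no server learns who met whom.
        return not self.heard_ids.isdisjoint(published_ids)

# Two phones meet: each hears the other's ephemeral identifier.
alice, bob = Phone(), Phone()
bob.receive(alice.broadcast_id())
alice.receive(bob.broadcast_id())

# Bob tests positive and publishes his own identifiers.
published = bob.report_positive()
print(alice.check_exposure(published))  # True: Alice was in contact
```

The design choice worth noticing is that the only data ever uploaded is the infected user’s own random tokens: the central server holds no contact graph, no names, and no locations, which is what makes the anonymity guarantees Doat mentions technically plausible.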

But there’s one problem that won’t go away: digital tracing applications have so far been a universal flop — dismal download levels have undermined public buy-in for the IT-based tools available and, with it, their results.

What’s the workaround? Maybe piggybacking digital tracing technologies onto a different imaginary. Within the imaginary of care, the idea of monitoring (surveillance) isn’t seen as a threat from the get-go, but as a way of addressing certain risks. The French “surveiller” originated within the vocabulary of care before making its way into English: “Etymologically, ‘veiller sur’ is to ‘watch over’, to protect, safeguard from harm. On the political level, surveillance as used in the vocabulary of care is a legitimate means of ensuring public order and a duty enshrined from the outset in the rule of law,” says Doat.

“Not all imaginaries are equal in any given situation,” he believes. “If we say that the imaginary of war represents a conservative ideology and tends to view technology as a weapon, the imaginary of care on the other hand presents a vision of utopia that recognizes the pandemic crisis first and foremost as a public health problem that requires ways to organize the community and blueprints for governance, sharing information and the use of technology.”

For Doat, the imaginary of care seems the best suited to the current landmark challenge we face, and he feels the pandemic crisis has shuffled the cards on how the West has been used to identifying and weighing the legal and ethical risks around digital technologies. “If a contact tracing technology can help, within a broader combined set of solutions (social, political, organizational and spread control and distancing measures, etc.), to save a significant number of lives, considerably reduce suffering, and preserve freedom of movement for as many people as possible (e.g., by diminishing the likelihood of further general lockdowns), shouldn’t our priority be the good of the community?” he wonders.

A policy of care demands greater transversality and horizontality in the sharing of information, as well as the development of types of technology governance that include all stakeholders. The examples offered by several Asian countries, such as Taiwan and South Korea, can provide some guidelines for us, albeit within limits, because they originated from democratic regimes we share a degree of political familiarity with. “In response to the pandemic, the digital tools rolled out to combat the spread of COVID-19 were fully integrated into healthcare communities. Healthcare providers and local officials worked together within their jurisdictions. The use of apps was more successful because they were better integrated and networked with a set of other solutions,” Doat states. That being said, part of the success of the Taiwanese and South Korean applications is that they were also authorized to use exact geolocation data to track individuals and offer guidance, which is a particularly sensitive issue in the West and in Europe because it raises flags around privacy and naturally comes with the potential for some form of abuse.

In June 2020, the WHO issued ethical guidelines with suggested principles to achieve the appropriate use and equitable social buy-in of tracking technologies consistent with the fundamental principles and rights of democratic societies. One of the suggestions was to include civil society stakeholders in oversight bodies. Their role would be to assess the proportionality of the proposed IT-based solution against the fundamental principles that citizens cherish, and they should have access across all stages of data processing, up to and including interpretation. “This would allow the experts and organization representatives forming that body to ascertain and, if necessary, certify that an application is fully consistent with citizens’ rights,” notes Doat, who sees a further option to explore — “We might also look at companies permanently opening up a more direct forum for easier dialogue with users of their tracking applications than is presently available, by facilitating user feedback and commentary and providing clear and transparent answers.”

In the end, the holder of the Ethics, Technology and Transhumanism Chair leaves all the options open. “Somewhere between the two extremes of a repressive, state-controlled Chinese-style system of data sharing — which we absolutely do not want — and a pre-pandemic digital ethic, the West has the opportunity to commit to a middle road for digital technology ethics and governance that as yet remains to be fully imagined.”
