Danny Snow’s digital donor card is bright yellow.
It has a spot for a signature, and affirmative statements about the holder’s commitment to donation that mirror those found on the organ donor card sent out by the UK’s National Health Service.
It has the look of a familiar object – even if it is a fiction.
It’s a way to frame a conversation about preferable futures, said Snow, a PhD researcher at University College Dublin, earlier this week at his home in Terenure.
“People examine this, use it, whatever, and then it opens up discussions about, like, sort of what if we had this?” he says.
People these days leave a trail of digital data through life – on social media profiles and posts, stores of photos, and health gadgets.
What do people think should happen to this after death? Snow’s project asks. “I want to explore people's values around data,” he says.
To draw out the values people already have but might not be aware of, because they have had no opportunity to articulate them, he says.
It may also be that they don’t ascribe values yet to different kinds of data, he says. “Because it would be a brand-new situation.”
What is, or would be, data donation?
As Snow sees it, data donation would essentially mean choosing which data, if any, to share and with whom. “Basically, giving consent and offering your data for research purposes.”
Somebody could opt to donate a phone to a technology researcher and let them download a data package to study social media practices, says Snow, “as opposed to accessing it from the outside”.
Health data could be donated the same way, he says.
Snow sees it as one response to the myth of “neutral” data, a way to give researchers better-quality data to analyse and to train AI tools.
Data isn’t neutral, he says. Medical data from a single hospital draws from a particular population, he says.
Datasets from countries with laxer data-sharing laws may be used more by researchers than datasets from other jurisdictions, he says. But if researchers were to draw on specific donations, they could ensure good-quality data, he says.
Recognition among health researchers of the potential of AI systems to diagnose and see patterns has amplified calls for health-related data donation, he says.
That has accelerated since researchers successfully built an AI system for predicting protein structures, called AlphaFold, which won them the 2024 Nobel Prize in chemistry.
There is complex health research that, no matter how many people and how much money you throw at it, would be impossible for a team to do, Snow says. “That’s kind of where I would see a clear use case of AI or machine-learning technologies.”
Thinking on it
Snow’s exploration of attitudes towards data, through the digital donor card and the survey on its sister website, has thrown up some curious findings.
“What might be considered highly personal or intimate data doesn't necessarily align with what we may think it is or what is legally protected in that way,” he says.
“From the responses I got, there's a distinction, quite a clear distinction, between data relating to the physical body and data relating to your sense of personhood or mind,” he says.
Generally, people were happy to share health data, the kind relating to the physical body, after death, he says.
Meanwhile, though, people were really cautious about data such as their search histories being shared or donated after death, he says, with a lot of talk about being misrepresented.
“It's typically framed as, oh, that could come out about me, and I wouldn't be able to see it or experience it, or anything,” he says.
When you step back and look at what we actually share day to day, that seems counter-intuitive, he says. “In the sense that we are offering this information, even though it is considered highly personal.”
Which leads him to the idea that, maybe, certain data types such as medical scans that stay within a system could be shared more freely, he says.
More abstract data, used to find patterns over time and make inferences, should see more protections, he says. “To reflect more what people actually value about their data.”
What happens now?
Under GDPR, personal data is no longer classed as personal data when you die, Snow says. It’s up to each member state to come up with rules for posthumous data, he says.
That doesn’t mean it is a free-for-all for release of data though, he says.
He cites a court battle over the release of adoption records relating to deceased birth parents, records that one side argued could not be released because of the impact on living relatives.
That GDPR explicitly leaves out deceased individuals is problematic, says Carl Öhman, an assistant professor of political science at Uppsala University in Sweden. “Because you can’t fully disentangle the privacy of the dead or the privacy of the living.”
If parents send their DNA data to a company for genealogy research, he says, and the company owns their digital remains, then it can make detailed inferences about future generations.
Especially if combined with other data they may have uploaded about their offspring, from scans to school photos, he says.
Data after you die is treated in the same way as property, says Snow. “People inheriting it and getting access or ownership over it.”
But leaving a car to someone isn’t the same as leaving personal information, he says.
An organ gets used and there's an end point, he says. Data can ripple on, with potential impact for those you’re related to, he says.
Is it yours to donate?
The data donor card also prompts people to think about who actually has ownership of that data today – in many cases, private companies.
At the moment, with Facebook, Instagram and Apple, you can assign a legacy contact who can tend to posts and accounts within certain boundaries.
“But it’s obviously not used that much,” says Snow.
Google has an Inactive Account Manager, which means that if there’s no activity on an account for a certain amount of time, it gets deleted, he says.
Research by Öhman has mapped out that we can expect many billions more dead accounts over time, says Snow. “It’s big numbers.”
Öhman says that data donor cards aren’t a bad idea but they do kind of play into a larger issue he’s concerned about.
The main political challenge for him at the moment is that people are constantly encouraged to approach data privacy as an individual good, he says.
“We’re encouraged to think of ourselves, to reduce ourselves into consumers of platforms, rather than citizens of a society,” he says. This turns us into individual actors with zero political power, he says.
The GDPR regime, and data privacy in general, reduces the issue to individual preferences – and most people don’t care about their individual data privacy, he says.
“They’re like, why would Google care about where I have lunch? Or why would Instagram care about my aging face?” he says. “They don’t give a damn about those things but they do give a damn about where people had breakfast and what it looks like when people age.”
If you reduce the discussion to individual preferences of what’s going to happen to my data when I die, it’s not a solution to the larger question of what’s happening to our data as a generation when we die, he says.
An idea he has put forward, he says, is that of a “digital world heritage label”.
In this future, international bodies would assist in the management of large archives of digital remains, he says. “In exchange for the public getting some access to what is essentially society’s digital past.”
Breaking their monopoly
The solution to the fact that digital history is now concentrated in the hands of a few individuals who can monopolise it isn’t going to be found within that economic system, Öhman says.
“I’m hoping that the situation is about to get so bizarre where we’re basically gonna have to ask Elon Musk for permission to study our own personal and collective past, that we’re going to realise that, like, the entire setup is fraudulent,” he says.
In his book The Afterlife of Data, he theorises how society may end up challenging it all, he says. “I think if we look historically, there have been almost no stronger rationales for access and entitlement than the claim that my ancestors are buried in this land.”
That’s the winning argument in disputes over land, he says.
Take the transfer of Stonehenge’s bluestones from west Wales to Salisbury Plain, which archaeologists have hypothesised was carried out because the stones were the embodiment of the ancestors of those who moved them.
There are some signs that this feeling may surface in debates around the ownership of digital remains, he says.
Take the backlash against then-Twitter, when it tried to remove the profiles of deceased users, he says. “People were like, no, no, no, no, no. You are not erasing my departed father or mother.”
Meanwhile, looking beyond individuals, you can imagine an event that reverberates through a nation’s culture – the #MeToo movement, or the marriage equality referendum.
Imagine X suddenly decided to delete all the data about that, he says. “We would, like, find it unacceptable if someone were to erase part of our collective history.”
Yet we have no legal claim to that data, he says.
“My argument is that we are going to have to erase data, like we cannot preserve everything,” Öhman says. “So we need some sort of principle to identify which data are worth preserving, which are not.”
A plurality of values should underpin those decisions, he says – just as museums appraise collections by bringing in an archaeologist, a historian, people representing descendants and so on.
“They will together have a discussion: what is worth keeping?” he says.
At the moment, the way we manage our collective digital heritage is only according to one principle, he says. “And that is the principle of capital.”
How individual wishes to be forgotten should be balanced with other principles depends on the situation, he says.
If a public figure wants to disappear from the web, it’s fair for society to insist that the data be retained, he says. “Because there is a societal interest in, you know, keeping Donald Trump’s tweets of press conferences or whatever.”
Learning from the card
One of the things Snow has learnt about himself with the data donor card, he says, is how he doesn’t – even with the work he does – really have a sense of jeopardy when it comes to the use of his data, both now and in its afterlife.
“I don’t see the harms of data outside quite specific things,” he says.
An example being the monitoring of social media posts at the US border, or how the UK government used an algorithm during Covid to assign exam grades based on schools’ historical results, he says.
But it’s difficult when it comes to aggregated data, he says. “It’s very difficult to have a sense of things when it’s a random cookie and you’re on Amazon, who already have so much information about you anyway.”
“I’m not saying it doesn’t have negative impacts but for me, anyway, I can’t get over that,” he says.