The Biobank data leak
Privacy, knowledge and the humility of faith over sight
Our discussion today begins with the concerning news that a world-leading database of medical records, the UK’s Biobank, has suffered an embarrassing leak after hundreds of thousands of confidential records were put up for sale on a Chinese website. Biobank is a pioneering collection of data from half a million Brits, who altruistically donated their records, blood, DNA and more for medical research. But how can we balance the competing demands of strict medical confidentiality with the growing need for big data for both cutting-edge research and thrusting commercial development? Is privacy a particularly Christian idea anyway? Should we be celebrating these altruistic souls for their service despite the personal risks of leaks? And what does the Biblical narrative and the limitations God places on his image-bearers have to say about our modern grasping for all-encompassing knowledge?
Listen to the podcast directly below, or click here to find it on your podcast app of choice and subscribe to get new episodes every Wednesday sent straight to your device.
Transcript
[Transcribed with AI, there may be some very minor inaccuracies]
Tim Wyatt: Hello and welcome to another episode of Matters of Life and Death. As always, I’m Tim Wyatt, and I’m joined by my dad, John Wyatt. Hi, Dad.
John Wyatt: Hi, Tim.
Tim Wyatt: Today’s episode was inspired by an interesting news story here in the UK last week. Here’s how the Guardian newspaper reported it: “Private health records of half a million Britons offered for sale on Chinese website.”
The story is that the confidential health records of 500,000 British volunteers, who gave their health data to a charity called UK Biobank, were discovered for sale on the Chinese website Alibaba, which, we should say, is a bit like a Chinese eBay or Amazon. And that shouldn’t have happened.
John Wyatt: Yes, it’s a very interesting and embarrassing phenomenon for the UK Biobank, and there will be a certain amount of schadenfreude amongst critics, because many people had already predicted or expressed concerns about the security of data in these large databases which have been voluntarily given by members of the public for the public good.
It’s deeply embarrassing, I think, because UK Biobank has put a great deal of emphasis on its security and on the measures it’s taken to protect individuals’ very highly sensitive and confidential information.
Tim Wyatt: Yes. So what appears to have happened is that some researchers at Chinese universities, who had legitimate access to the Biobank data, managed somehow to get the data off the Biobank’s supposedly secure website and are now selling it online.
We are told no one actually bought this data online before the UK Government stepped in and worked with the Chinese authorities to get it taken down. But then a week later, just yesterday as we record this, there were further listings of health records put back up on Alibaba, and the science minister here in the UK has admitted that the government is bracing itself for further leaks.
So this is an ongoing problem. It seems that they’re not able to rein in rogue researchers who have got access to the data, smuggling it off the platform and trying to sell it online.
John Wyatt: Yes. I think we did talk about UK Biobank on at least one previous podcast, and in the show notes we’ll perhaps link to some of the previous episodes. But it would be helpful, I think, to come back and review what this is and why it is such an important, scientifically and also commercially important, source of data.
Tim Wyatt: Yes. So the UK Biobank is a charity that was set up in a collaboration between the Department of Health here in the UK Government and a series of medical research charities. It started about 20 years ago, in the 2000s.
They recruited 500,000 volunteers, all aged between 40 and 69. It was completely voluntary, but if you wanted to take part, you went into a GP surgery or some kind of hospital and did an assessment where they took a questionnaire about your lifestyle, your medical history, your nutrition; physical measurements were taken; and then you donated samples of blood, urine, and saliva.
Your genome was then sequenced, and they had access to all your hospital records as well. This was all gathered and stored up into a huge database, effectively de‑identified. We’ll come back to that. They took out names and addresses. So if you had signed up, for instance, Dad, they would see that a 72‑year‑old man—are you 72 or 73?
John Wyatt: Personal! Excuse me, just giving away my personal medical information.
Tim Wyatt: Another leak has been exposed! I wouldn’t know your name or address, but they could see what blood type you are, what height and weight you are, how many times you’ve been to hospital or the GP, but also the details of your genome and your DNA and things like that.
John Wyatt: Yes, and what is so powerful about this is, first of all, the fact that we are matching DNA data to all this other huge amount of medical information, which enables researchers to make connections.
Secondly, this is being collected prospectively. In other words, the idea is that if I had joined—if I had joined UK Biobank when first approached—it would mean that they would send me regular questionnaires: “How are you doing? Are you still alive? And have you got cancer yet?”
Tim Wyatt:
John Wyatt: Exactly. And then if something happened to me—if I had a stroke—they would be able to match that, using the DNA, and compare it with how many other people in the UK Biobank had strokes.
Was there some feature about the DNA which made that more likely? Or was it a combination of genes and different aspects, all of which added up to help us understand why I had that stroke, or why I developed cancer, or why I had dementia, or whatever. So that is incredibly powerful.
Another huge strength is that, although it’s not officially part of the NHS, it’s all linked to the NHS. This is where the UK is almost unique: the NHS collects data in a uniform way across virtually every member of the population.
So it’s, scientifically and medically, a very powerful prospective data collection resource.
Tim Wyatt: Yes, and you can see how, if you’re a researcher in Alzheimer’s, for example, the idea of being able to prospectively track people is very appealing. These are people who were basically in middle age when they gave their data, and you can track them over the second half of their life.
You can then see that this group of people developed Alzheimer’s at a certain age. Let’s wind the clock back and see, in 2006 when they gave their data, what their lifestyle was like, what their blood pressure was like, all these other things, and try and pull out inferences from this vast data pool.
It’s very appealing, isn’t it, if you’re a researcher, rather than saying, “Right, I want to try and find out something about Alzheimer’s, so I’ve got this pool of people who I know died of Alzheimer’s in their 80s and 90s—but how do I know what their life was like 50 years earlier?”
John Wyatt: Exactly. It is arguably the best prospectively collected data anywhere in the world. And it is still ongoing, because the people who signed up—the half a million people—most of them are still alive, and their data is continuing to be collected.
Tim Wyatt: The scheme itself is actually expanding. It was only last year that the government changed it so that GP records could now be attached, de‑identified, to the data. So that’s not just your visits to hospital, but also to family doctors and primary care physicians. That’s an even more valuable resource and a huge expansion of the data available to researchers.
John Wyatt: Absolutely right. I was just going to make the same point. As these things emerge and new developments occur, it becomes more and more powerful. The rest of the world is watching this with great interest, and it has been strongly supported—by the government, by leading research scientists, and so on. It’s a very strong collaboration.
The idea is that both academic researchers and commercial researchers can have access to this. Commercial companies, startup companies, and so on, can have access to de‑identified data. So the idea is that you strip off pieces of information such as the name, the address, the precise date of birth—anything which could allow someone to identify you.
You strip all that off. You give a unique coded number to represent that person. So from that point on, this is just participant number 754X2. There’s a separate database which matches that, so that, if needed, it’s always possible to identify the person again.
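[Editor’s note: for readers curious about the mechanics, the pseudonymization scheme John describes can be sketched in a few lines of Python. Everything here is invented for illustration; the field names, the coded-ID format and the sample record are not Biobank’s real schema.]

```python
import secrets

# Invented example record; the real Biobank fields and formats will differ.
raw_record = {
    "name": "John Wyatt",
    "address": "1 Example Street",
    "date_of_birth": "1952-03-01",
    "blood_type": "O+",
    "height_cm": 180,
}

# Fields that could identify the person directly.
IDENTIFIERS = {"name", "address", "date_of_birth"}

def pseudonymize(record, link_table):
    """Split a record into a coded research row and a separately held link entry."""
    pid = "P" + secrets.token_hex(4)  # a random coded ID, in the spirit of "754X2"
    link_table[pid] = {k: record[k] for k in IDENTIFIERS}
    research_row = {k: v for k, v in record.items() if k not in IDENTIFIERS}
    research_row["participant_id"] = pid
    return research_row

link_table = {}  # kept separately, under much tighter access control
row = pseudonymize(raw_record, link_table)
print(row)         # no name/address/DOB, just the coded ID plus medical fields
print(link_table)  # the re-identification key: this is what must stay secure
```

The design point is the separation: research users only ever see the coded rows, while the link table that can re-identify people is held apart, so that identification remains possible "if needed" but is not available to ordinary data users.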
Of course, this also raises other ethical questions. If it turns out that John Wyatt, having given his DNA, is now flagged as being at increased risk of disease X, what do we do? Is there a duty to warn John Wyatt that he’s at increased risk of disease X?
Well, no—unless there is a possible treatment which he could take in advance. But if there is, and we now know that if he takes drug Y he might prevent getting condition X, then we’ve got an ethical duty to contact him.
But then it also turns out that this is very relevant to John Wyatt’s children. Has John Wyatt got any children? Because if it’s not just relevant for John Wyatt, but if he’s got any children, they really need to know that he’s got this very rare DNA variant that is going to impact them and their children.
So now, should we contact John Wyatt and ask him? Or, actually, we can see on our database who John Wyatt’s children are. There’s someone called Tim Wyatt, who’s a journalist. So should we contact Tim Wyatt and tell him that his dad’s got a rare variant? There are real ethical problems here.
Tim Wyatt: Yes. Once you start pulling on the thread, because DNA in particular is both utterly unique to each individual and also reveals insight into what might happen to your health, and to the health of those related to you, the concept of total anonymity starts to feel elusive, maybe even impossible.
It’s one thing in the era when you’d say, “I’ll just share my blood type.” There are a limited number of blood types; it’s not that revealing. But once you’ve given a blood sample or spat into a tube today and your entire genome has been sequenced, that can never be unknown or undone. There is only one person of the eight billion on the planet that has my exact DNA sequence.
John Wyatt: And the point is that we can use AI to say, “Look, these are the 50 things we know about this person, but we don’t know their name and we can’t identify them.” This is what AI is supremely good at: pattern making, matching and recognition.
An AI system finds it relatively trivial to take those 50 variables and say, “You know what? This is 98% certain—it’s John Wyatt, and this is his address, and this is his date of birth, and these are his contact details.”
Tim Wyatt: Let’s get down to some of those ethical conundrums. What do you make of the idea that, fundamentally, people who have voluntarily chosen to take part in this and given all this data over, even though it was supposed to be de‑identified, did it because they want their data to be used?
They don’t mind when it’s used by official means—when researchers legitimately use it. Why would they mind if it was then sold on the black market? Okay, someone’s made some money off it, but fundamentally it’s going to be used for more research. It’s not really of interest otherwise. Why should they care?
I was very struck that when the BBC reported this, they found one Biobank volunteer who also happens to be famous—Guardian columnist Polly Toynbee—who told the BBC she was not worried. She said, quote:
“Biobank volunteers passionately believe that what they’re doing is incredibly valuable, that having this huge bank of information and data helps cure diseases and helps find the causes of diseases. I don’t think many people will be worried, because that information is anonymized. Maybe they could sell details of particular cases, but it won’t be with names or addresses or anything that leads back to particular people. So I don’t think this will rattle all the magnificent volunteers who got in for this.”
What would you say to Polly? Has she been a bit glib?
John Wyatt: I fear she is. It’s a very honourable sentiment, but it’s a bit glib because, as we’ve just said, it’s very difficult to prevent de-anonymization: to stop AI tools from breaking the anonymization.
This is another area where there’s an arms race: between the data security experts trying to maintain anonymization, and the bad guys who are very keen to obtain data which, if it can be de‑anonymized, will open up all sorts of possibilities for abuse.
Unfortunately, providing that kind of rich information about yourself is risky in this broken, fallen world, where state-sponsored cyber-warfare teams across the globe seem to be working 24/7 to find ways of breaking into Western societies and attacking Western individuals with criminal intent.
Tim Wyatt: Yes. And I think the way the world has changed since 2006, when it started, is that there is already a wealth of information available about people online.
The Guardian actually, a month before this leak was exposed, did their own investigation into Biobank. They found that Biobank data had inadvertently been uploaded to public research repositories dozens of times.
In many ways, it’s because a lot of people who fund medical research require that the underlying data be made publicly available as part of their funding agreements. So a researcher went away, downloaded some data from Biobank, and then when they uploaded their findings, they just accidentally didn’t remove the data. So it was available on these large public research archives that scientists use—not for sale, not with any malicious intent, but just because that is how science often works.
The Guardian basically reverse-engineered it. A volunteer gave them just her month and year of birth and the month and year she had a hysterectomy, and from those two details alone the Guardian identified her exact record among 400,000 records, and then read back to her, “I can tell you you also had these diagnoses and these things.”
The woman said, “Yes, that’s absolutely right. You found me.” Biobank’s response was, “This is a kind of false test, because you couldn’t have done it without having this knowledge of the hysterectomy in the first place.”
But what that doesn’t really take into account is that people already post public information online. You don’t need the willing participation of a volunteer. You could say, “Let’s trawl the internet, find someone who has said they’re going into hospital on Facebook or put it on social media,” and then use that to triangulate and cross‑reference.
As you say, a combination of social media, AI, and various other things means that de‑anonymizing this data is not nearly as difficult as Biobank would like their volunteers to believe.
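[Editor’s note: the Guardian’s hysterectomy test can be roughed out with a toy simulation. All the numbers below are invented, only loosely inspired by the figures mentioned above; the point is simply that in a large dataset, one uncommon event plus two coarse dates is enough to make most affected records unique.]

```python
import random
from collections import Counter

random.seed(0)

POPULATION = 400_000   # roughly the number of records the Guardian searched
EVENT_RATE = 0.05      # invented: fraction of records carrying the rare event

# Invented quasi-identifiers: (year, month) of birth and (year, month) of
# the operation. That is only 31*12 * 24*12 = 107,136 possible combinations.
subset = []
for _ in range(POPULATION):
    if random.random() < EVENT_RATE:
        birth = (random.randint(1936, 1966), random.randint(1, 12))
        op = (random.randint(2000, 2023), random.randint(1, 12))
        subset.append((birth, op))

counts = Counter(subset)
unique = sum(1 for c in counts.values() if c == 1)
print(f"{len(subset)} records carry the event; "
      f"{unique / len(subset):.0%} of them are unique on the two dates alone")
```

Because only a few per cent of records carry the rare event, the matching happens within a much smaller pool, which is why two seemingly vague dates pin down most of the affected individuals.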
John Wyatt: This does go to very fundamental conundrums we’ve talked about before: how central is privacy, including medical privacy, to human life, to a healthy society, and to individual human flourishing?
It’s an age‑old question. I have a particular take on this. To be honest, I’m very conflicted about it. With my medical‑ethical hat on, I know that patient confidentiality has been a massive preoccupation of physicians going back to the pre‑Christian era.
In fact, it was one of the outstanding innovations of the Hippocratic Oath, dating back to around the fourth or fifth century BC. In the Hippocratic Oath, for the first time, physicians took a solemn, binding oath that said, “I will protect my patients’ information and anything I gain in my professional activities as a most holy secret.” I think that’s the wording.
The genius of the Hippocratic doctors was that they recognized that because they were trusted, they were entering people’s homes. They were privy to all kinds of scandalous information about their patients, because they were trusted as intimate companions, confidants, and confessors. That trust could only be maintained if their patients could know absolutely that anything said to their doctor would go to the grave.
For the doctor to discover some scurrilous and incredibly damaging information and then pass it on was to fundamentally damage their own patient. That insight—that patient confidentiality is a bedrock of medical care—has very, very deep roots, and I get that.
As a physician, I can remember times speaking to parents in the confidentiality of the consulting room, and it felt almost like a confessional. I remember sometimes a parent saying, “To be honest, I’ve never told anybody else this, but…” So the idea that protecting those confidences is at the heart of good professional medical practice is still the case today.
The General Medical Council has very strong regulations about patient confidentiality, and you could be sued, but also struck off the medical register, for serious breaches of confidentiality which are deemed unethical.
Tim Wyatt: It’s hard to see how modern medicine could have developed without that confidentiality baked in, in the same way that you can’t really see the justice system working if lawyers aren’t allowed to have frank, no‑holds‑barred conversations with their clients, and the clients know that it must remain sacrosanct within the consulting room.
I think it’s the same with medicine. How can you be a good GP if your patients don’t trust that they can share scandalous, shocking, stigmatizing information with you and know it will remain safe?
John Wyatt: And yet, bit by bit, that confidentiality has been pushed and pushed and pushed. You go to the GP and they have medical information, but now that medical information is going to be shared with the entire NHS, which is—what? I don’t know how many hundred thousand employees.
Tim Wyatt: Millions.
John Wyatt: And how do you restrict information once you enter the great NHS electronic medical records system? In a previous era, I can remember handwritten records in a locked filing cabinet in my office. No longer.
Tim Wyatt: And it’s not that long ago. Even I, as a parent, knew that our daughter’s medical records at one hospital, until literally this year, were a single paper file in a filing cabinet, and they’ve only just now moved onto an online database system. So this is a very recent shift.
I guess there are two goods that I see here. You have, as you say, the good of confidentiality, which underpins openness, frankness, and the trust between physician and patient. But then you have the other good of research, and research is not possible if data is kept as a “most holy secret.”
What Hippocrates couldn’t have known is that once the scientific method developed, and even more so in the computer age, there are things you can only see when you compare the records of hundreds, thousands, or even millions of people. You simply cannot learn them by siloing everyone’s records independently.
When you are able to aggregate them, and particularly when you can do complicated computer‑science statistical analysis of them, you can pull out truth and develop new treatments that would not be possible if we maintained that “most holy secret” in its purest form. And that’s also a good.
Everyone wants more research.
John Wyatt: Absolutely right. Interestingly, the drive to help others through medical research is not just a vague altruistic thing; it can be very intense.
One of the things I saw on a number of occasions as a doctor, dealing with very tragic cases where something completely catastrophic had happened to a loved baby (my main research focus was on babies with brain damage), was that one of the few positive things parents could draw out of an absolute catastrophe was the fact that their baby was part of a research program.
They had gone through a pregnancy, everything had been fine, and then at the moment of delivery some catastrophe—an obstetric disaster—occurred. The baby had been short of oxygen and ended up severely brain damaged. We were the doctors looking after the baby, but we were also running a research program to try to understand the mechanisms of brain injury, to find new treatments to prevent this happening.
I can remember several parents saying to me, “The one thing I’m grateful for is that our baby was part of a research program, and therefore some other parents might benefit from the terrible thing that’s happened to us, and that actually means a lot to us.”
So I think this instinct for medical research is a very, very positive thing, and I certainly support that. But of course, people are putting an immense amount of trust in the researcher: that you are genuinely motivated for good, and that this will genuinely help other people. The thought that that trust might be betrayed is painful.
Tim Wyatt: And I think you have to step back and say it is kind of remarkable that half a million people decided to give up not just hours of their time and repeated follow‑up visits and tests and checks and things like that, but also give up that “most holy secret”—their confidential medical records, and more: to give up their DNA.
They did this without any recompense, and there’s no direct benefit to them whatsoever. But half a million British people said, “Yes, I can see the value in contributing to research. It’s vanishingly unlikely that the research done on my data will actually personally benefit me or anyone I love. I’ll probably be dead before most of the really valuable stuff emerges and these treatments come down the line, but I’m still going to go out of my way to generously donate both my time and, to an extent, my confidentiality.”
I suppose that’s what’s tragic about the Biobank leak story: to an extent, those altruistic half a million people have been let down.
John Wyatt: That’s right. Sadly, I think a fundamental issue here is informed consent. Do I really know, before I sign on the dotted line, that I am well informed about the possible benefits and risks?
What is so challenging about DNA data—again, this is an issue we’ve touched on before—is that for your ordinary person who’s not a genetic specialist, to really get their head round the implications of DNA data and what it might mean is very difficult.
In clinical practice, if someone has to make a clinical decision based on DNA data, they’ll often receive counselling from a specialist genetic counsellor, who may take 30 minutes or an hour or more, patiently talking it all through, showing diagrams, explaining the implications, and so on. And that kind of specialist genetic counselling is in very short supply. There are relatively few people who have that specialist training.
So many people volunteer DNA information without a very clear understanding of what they are doing. We’ve talked about this with commercial services.
Tim Wyatt: Yes, 23andMe and so on.
John Wyatt: Exactly.
Tim Wyatt: Tracking your family tree and then discovering you have another secret family you didn’t know about.
John Wyatt: Yes. So fundamentally I’m very supportive, and I think this is a very positive thing. There is a very strong altruistic desire which is part of our God‑given humanity, I suppose. Let’s try and think about a distinctively Christian perspective on all this.
Tim Wyatt: Well, that’s what I was going to say. Shouldn’t we fundamentally be celebrating this as, maybe, an echo of our Christian heritage? Obviously, the vast majority of the people in the UK Biobank weren’t Christians and didn’t ostensibly do it for Christian reasons, but I would argue that the desire to selflessly help others, with no reward to yourself, is a fundamentally Christian virtue.
I’m not saying it’s only Christians who exhibit it, of course not, but I think it is a fundamentally Christian virtue, and we as Christians, as a church, should be honouring and celebrating the fact that it still persists in our society: that we aren’t this ultra‑atomised, individualistic, hyper‑selfish community that we’re sometimes told we are.
Actually, this kind of project exists. And, going beyond that, look at something like the existence of altruistic kidney donation. That still happens—people still do that. There’s a risk: if you give your kidney to a stranger, there is a risk for you. It’s serious surgery. You didn’t need to do it. You’re not going to get paid—not in the UK anyway. And yet people still do that.
I think we should celebrate that. Maybe it’s the same thing with Biobank. Yes, there’s a risk. There’s a risk your data is hacked or stolen or de‑anonymized, but actually it’s a risk worth taking because of the positive good that altruistic, free gift of data can lead to. We should encourage Christians and non‑Christians alike to continue to press into medical research for the public common good it can lead to.
John Wyatt: Yes, absolutely. But to me it comes back to this principle of informed consent, which I think is a very Christian principle.
This goes all the way back to the terrible history of forced medical research. The most egregious examples were the Nazis, of course, but there were many other examples of forced research continuing right up until the 60s and 70s. Ever since then, the fundamental cornerstone of research ethics has been that the free, voluntary, informed consent of the participant is absolutely central.
Altruistic kidney donation is a wonderful example of a positive thing, but I can guarantee that if you are planning to donate your kidney to your daughter or your spouse or whoever, you will have hours of discussion with surgeons and counsellors and psychologists to make sure you’re fully aware of what you’re doing and the balance of benefits and risks.
If you say, “Yes, I get all that, I know there’s a small risk of this and a small risk of that, and I’m prepared to go ahead,” then absolutely—God bless you.
The worry about DNA collection and so on is that that process really didn’t happen to anything like that degree of detail and scrutiny. So the question is: were people fully aware of the possible repercussions of their altruistic decision to be part of this?
And does it matter? Because again, it comes back to this question of privacy. I’m not sure that the previous understanding of privacy is going to be possible anymore in an information‑surveillance age when basically everything about everybody is known. As soon as you get out your front door, you’re on facial recognition cameras; you’re being stored in databases; your mobile phone is tracking you.
Tim Wyatt: Every tap and swipe on your phone is being hoovered up.
John Wyatt: That previous vision of privacy—has that gone?
Tim Wyatt: Yes, that’s a great question. I think there’s something very different about the idea of giving informed consent to this kind of long‑running, forward‑facing medical research.
If I’m going in to donate my kidney, the surgeon can sit me down and say, “We’ve done these operations thousands of times. We know what the risks are. Here are the chances that something goes wrong. Here are the possible negative consequences if something goes wrong. Here are the chances of you developing kidney disease and discovering, too late, that you really could do with a second kidney.”
You can read the numbers and make an informed choice. But in 2006, we were 15-plus years away from the invention of ChatGPT. We were 10 years away from most of social media existing. Nobody, not even the people setting up Biobank, could realistically have told you in 2006, “In 2026 this is what the world will look like, this is what information technology will be available, and these will be the risks of you taking part in the study.”
Right now, nobody can tell those volunteers what it will look like in 2056. Who has any idea what might be possible with data science and statistics and information technology and computers and AI at that point?
So in some sense, truly informed consent for the lifetime you have left—and the lifetime that you will spend as part of this Biobank research project—is impossible.
John Wyatt: It absolutely is. This is very relevant because there is still recruitment going on for neonatal biobanks: to take a newborn baby and ask parents to agree to donate their DNA so we can follow this child all the way from birth.
The argument goes—and again, there are arguments both ways. The baby can’t give consent; the baby doesn’t know. Should we be doing this? The argument the other way is: actually, this may benefit the baby. If the baby has a genetic disease, we can pick it up. And when the baby’s 18, they’ll be given the option of opting out and saying, “Your parents recruited you to this, but now it’s your choice whether you continue.”
Tim Wyatt: Can they delete all their records retrospectively?
John Wyatt: I don’t know the answer. That’s a good question. I don’t know.
Tim Wyatt: I suppose the other concern when it’s done with children is not just that they can’t consent, but also: what is this data going to be used for?
A lot of it, I guess, will go into commercial research to produce better quality polygenic screening scores. This is the stuff we’ve talked about before, where it’s about trying to figure out: if you have this combination of genes, you are more or less likely to develop these various diseases, or have this intelligence level, or be this good at X, Y, and Z.
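[Editor’s note: mechanically, a polygenic score is just a weighted sum over genetic variants. The variant names and effect weights below are made up and bear no relation to any real clinical model.]

```python
# Toy polygenic score: a weighted sum over variants. "rsA", "rsB", "rsC"
# and all weights are invented; real scores use thousands to millions of
# variants, with weights estimated from large association studies.
effect_weights = {"rsA": 0.30, "rsB": -0.10, "rsC": 0.05}  # made-up effect sizes
genotype = {"rsA": 2, "rsB": 1, "rsC": 0}  # copies of the risk allele (0, 1 or 2)

score = sum(effect_weights[v] * genotype[v] for v in effect_weights)
print(f"polygenic score: {score:.2f}")  # 2*0.30 + 1*(-0.10) + 0*0.05 = 0.50
```

The score itself is only a relative measure: it is interpreted by comparing one person’s value against the distribution in a reference population, which is exactly why companies want large, tracked cohorts of genotyped people.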
This all ties into what we’ve talked about as modern, so‑called liberal eugenics. There are a lot of companies in an arms race to find ways of offering polygenic scores before you implant an IVF embryo, and effectively give parents the chance to pick their own baby. I, for one, want nothing to do with that. I think that is an incredibly unethical direction of travel.
Yet that’s obviously of huge interest to these companies: getting hold of large numbers of children’s DNA and then tracking them through their lives to find out what happens if you have that particular cocktail of DNA.
John Wyatt: Absolutely. Although, to be fair, you can still do that with the Biobank data as it is. Yes, it’s adults, but you’ve still got psychological scores on many of them, so you could still use that data to see the relationship between DNA and psychological traits.
Tim Wyatt: To be fair, Biobank says that the only people allowed to use their resource are those doing health research “in the public good”—that’s the language they use on their website. Whether a private company seeking to do liberal eugenics would qualify, I don’t know.
John Wyatt: Well again, the argument can be made either way: surely it’s beneficial to know that people are at risk of learning difficulties. Science is continually advancing, discovering different genetic variants and their implications. That knowledge can be used for good; it can also be used for evil.
This is the mystery, particularly in a broken world. It goes back to the beginning of Genesis and the Tree of the Knowledge of Good and Evil. Once we have taken the knowledge of good and evil in our fallen world, it can be used either way.
Tim Wyatt: I think there’s something quite profound about that story, and about the fact that the Christian tradition has always had this idea that knowledge is powerful, but not all knowledge is good for us. That is baked into the story in Genesis 3, and I think it recurs at various points across the biblical narrative: that actually, in God’s kindness, he made us to be finite and limited.
That’s the story of the Tower of Babel, isn’t it? Mankind trying to transcend its limits and finitude, and actually God says, “No, I’m not going to allow that to happen,” not just because it’s a threat to him, but because ultimately it’s not how we were made to be. We were made to be creatures, not to have this bird’s‑eye, God‑like perspective that he has.
John Wyatt: Yes, and the promise of the serpent in the garden is, “You can be as gods. You will not surely die.”
Tim Wyatt: “You will be like Him.” Yes.
John Wyatt: “You will be like Him.”
Tim Wyatt: You can imagine a future in which we have this kind of 360‑degree genetic surveillance, where we have such a rich understanding of the genome that we can predict everything, that we know everything. Obviously I don’t really believe in biological determinism in that way anyway, and I think a lot of scientists don’t.
But there are some people who think: wouldn’t it be brilliant if, once we’ve got such a deep understanding of genetics, we could look at your genome and not just say, “You have a 5% increased chance of developing breast cancer,” but, “You will die at this point, and you will get these diseases, and you will be this intelligent, and this good at the violin, and this good at sport.”
Actually, I think: I don’t want to live in a world where we have that kind of self‑knowledge. I don’t think that’s desirable or good for us.
John Wyatt: Yes, it’s a fascinating theme, because the interesting thing is that in the Scriptures, particularly in the Old Testament, that kind of ability to foretell the future is seen as something profoundly occult and evil. This is soothsaying.
Tim Wyatt: Yes.
John Wyatt: And the people were forbidden from that. Don’t go there; don’t go to the mediums who mutter and will tell you what the future is. There’s always a difference between prophecy—godly prophecy—where God is, through his prophets, speaking about the future, but in an overall broad‑brush way: “This is what God is planning to do,” often in very enigmatic form; versus the soothsayer who says, “You will die on the battlefield at the hand of a traitor,” or whatever.
It’s interesting that predictive genetics, arguably, is much closer to soothsaying than it is to biblical prophecy. That means that in a fallen world, there is something that is not good for us to know.
That is totally against the Enlightenment view of humanity.
Tim Wyatt: This is obscurantism; this is the kind of thing scientists would deride us for. They’d say Christians want to keep people enslaved and in darkness, whereas actually knowledge is power.
The whole theory of the Enlightenment—what does the word mean? It was an enlightening. We were blasting away the so‑called Dark Ages with light, with knowledge. It was the superstitious medieval Catholic Church, so the story goes, that had sent Europe into this obscurantist dark age by saying, “No, no, only this special priestly class is allowed to know things.”
That’s why you had things like prohibitions on doing autopsies. For a long time in the medieval period and early modern world, if you were a cutting‑edge researcher in the 1400s, you would do autopsies at night so the Catholic Church wouldn’t find out, because it thought it was wrong to look inside human bodies and figure out how they worked.
John Wyatt: I visited an original dissecting theatre in the University of Padua, I think it was, which was situated on a river and was completely blacked out. It was this huge space. Apparently, when there was a body, they were tipped off that it was coming down the river. All the medical students would be called to this dissecting theatre at night, in secret.
The body would be stopped on the river, brought up into the theatre and displayed. It was designed so that everybody could walk in single file down from the top and then actually inspect, at close quarters, the body, and then file back. So a hundred medical students could all be in this sealed room. That was entirely because it was illicit and illegal.
So yes, there is great truth there: we want there to be light; we want there to be transparency. That’s a Christian desire for truth. But perhaps there are some things that it is better for us not to know.
I don’t want to know when I’m going to die. I don’t want to know how I’m going to die. I don’t think it will help me in this final phase of my life. As we’ve discussed, I’m 73. I’m not sure that knowledge will help me. I’m perfectly happy to leave that, ultimately, in God’s hands.
Tim Wyatt: I think, as so often, we land on a somewhat nuanced position, which is that altruistic medical research is primarily a good thing, and we want to encourage people to be involved, Christians and non‑Christians alike. Yet it should be with very thoughtful, informed consent, and in some spheres that is difficult to achieve in our technological age.
The balance between privacy and knowledge, confidentiality and research, is an increasingly difficult one to strike. The truth will set you free, as Jesus says, and yet God chooses, in his wisdom and his mercy, not to give us absolute knowledge. It is potentially quite a big part of our Christian discipleship to discern wisely, using our God‑given reason, what we should and should not seek to find out.
As we’ve talked about on this podcast, a lot of the development of the Western scientific tradition was led by people who were profound Christians, followers of Jesus, who saw that as inspiring them to better understand God’s creation. Knowledge is a good thing, and yet there might be some scientific discoveries that do not enable us to live healthy, fruitful lives as disciples of Christ.
John Wyatt: Particularly knowledge about the future. Isn’t it interesting that it is a deep theme throughout the Christian narrative that we walk by faith and not by sight, and that by definition you cannot know?
You got married and you had no knowledge of what that marriage would lead to. You made a promise; you had no knowledge of what that promise might lead to. You bring a child into the world and you have no knowledge of what joy or sorrow or tragedy that might lead to. And that’s good, and that’s proper, and that’s right, and that’s the way we are made.
Tim Wyatt: I think of those moments in the Gospels where the disciples say, “Can you tell us, Jesus, who’s going to sit at your right hand and left hand?” Or, “When will you restore the kingdom to Israel? When are you going to come back? When’s the consummation of all things?”
Jesus’ response is that even the Son himself doesn’t know that. Christians have spent two thousand years trying to figure out when the Second Coming will be, when the End Times will come, and yet it’s been clear from day one that God does not want us to know.
There are some things that are not good for us to know. I think Christian discipleship is about learning the humility to trust God, as you say, to walk by faith—not always by saying, “I need to see; I need to personally verify and ascertain full knowledge before I can put my trust in the one who does see all things.”
Recently I was looking into the story of Thomas—Doubting Thomas—and how he refuses to believe because he hasn’t seen Jesus, he hasn’t physically verified, he hasn’t conducted his “experiment” by putting his hand into the scars.
Jesus, in his grace, comes a second time and says, “Look, here I am, Thomas. You can see.” But what does he say? “Blessed are those who have not seen and yet have believed.”
I think that is true about the resurrection—that’s true of all of us; we haven’t seen the resurrected Jesus—but it’s also true in a more macro sense about the Christian life: that there is blessing when we exercise faith, rather than insisting that first we need to personally see and verify and ascertain full knowledge before we can trust.
John Wyatt: Yes, and this is not faith against reason. It’s not faith believing things that are not true—that’s a caricature. It is faith based on trust. It’s the trust of a child who puts their hand into their father’s, not really being able to know what their father’s up to, but just trusting that he will have their best interests at heart.
Tim Wyatt: Exactly. Okay, well, I think we’ve come to an end. Thanks everyone for listening. I’m not sure I expected to land here when we started talking about the UK Biobank data leak, but I think these are some important questions to be thinking about.
We’d love to know your own thoughts about what we’ve been talking about today: medical research, confidentiality, the challenge of privacy in an internet age, altruism, knowledge—all those things. We’d also love your suggestions of things we might want to extend this conversation into: other stories from medical research, or from the rest of culture and society and technology, that we need to be getting our teeth into.
Please get in touch with us by emailing molad@premier.org.uk, and you can, as always, find plenty more material on Dad’s website—that’s johnwyatt.com. There’s lots of Dad’s writing on medical research and ethics and the Bible. You can find some of my own journalism at my website, which is tswyatt.com.
Thanks for listening. We’ll be back next week with another episode, and until then, bye‑bye.