In a knowledge society, the more knowledge there is available, the better. However, Ralph Hertwig, Director at the Max Planck Institute for Human Development, also considers deliberate ignorance to be useful and, sometimes, even necessary. In a digital world of information overload, he also pleads for conscious critical ignoring. Smart ignorance, alongside critical thinking, can even serve as a didactic concept.
Mr Hertwig, there is a picture hanging on the wall behind you. It shows an ostrich with its head in the sand. The so-called ostrich policy has fundamentally negative connotations. But I assume you see it differently.
The ostrich is not the only one burying its head in the sand in the picture. Behind it there are people doing the same thing. I discovered the picture by chance and found it appropriate and amusing because it is connected with the theme of “deliberate ignorance”. And yes, you are right, burying your head in the sand has very negative connotations. But it can also be interpreted differently. It can also mean: I am protecting myself from certain information. This does not have to be wrong, not even in an ethical sense.
You deal with deliberate ignorance scientifically. How did that come about?
It was while reading a novel; unfortunately, I can’t remember which one. In any case, in it I stumbled across the question of whether a person would like to know on which day they will die. “Of course not!” I thought spontaneously. I then asked around. The vast majority of people I asked had the same spontaneous and entirely unambiguous reaction as I did. Now we may have a clear preference for ignorance not only when it comes to the time of our death but also in other areas of life. I found this a very interesting topic.
Which areas of life might these be?
There are, for example, a lot of people who no longer watch the news because it makes them depressed and distressed. Other people do not want to think too much about how our consumer goods are produced. A completely different context is so-called predictive genetic testing. Today we know a lot about diseases that also have a genetic component. But do we want to know whether we carry a predisposition or an increased risk, or would we rather not know in certain cases, such as dementia?
You also studied how people deal with information from Stasi files. What did this research reveal?
Using the Stasi files, we managed to conduct a historically unique field experiment. When many millions of files became accessible, anyone could apply to look at them. In the context of a transformation society like the former GDR, it is interesting to see how people deal with information about the defunct repressive apparatus: do I want to know who was a perpetrator, who was a victim and who might have pulled the strings in my life without my knowledge? Deciding what you want to know and what you don’t want to know is also connected with how you successfully create a social transformation. How can we avoid getting caught in a cycle of resentment or even revenge? On the other hand: don’t we have to face up to the past to make sure we don’t repeat the same mistakes? This does not only apply to the former GDR but also to many other states around the world that have emerged from authoritarian systems.
In the context of news or the production of consumer goods, we could also speak of repression. But what reasons do people have for not looking at their Stasi files?
There are many different motives for not wanting to know something. A very important motive – and this can be seen in many other areas of life – is what we call emotion regulation. This also plays a role with the Stasi files. Very strong negative emotions can be associated with certain knowledge. Close relatives or close confidants may appear in a file as informal collaborators. The question now is: if we assume we might make such a discovery in the file, do we want to deal with these emotions – for example, anger, disappointment, loss of trust or grief – or do we prefer to regulate them by choosing not to look into the file?
Another motive is the supposed claim of truth attributed to the files. From today’s perspective, it is often no longer possible to judge whether a file actually contains true information or what the true background of a behaviour was. For example, it is unclear under what circumstances the Stasi officers created the file – or whether the informal collaborators, who may have been or still are my friends, were forced to provide information by the Stasi. Since we do not know all these circumstances, we cannot come to a fair judgement. Therefore, as some of the respondents in our interviews argued, it is better not to know the contents of the file in the first place.
Okay. But it’s actually true that the more you know, the better you can decide.
Not necessarily, and not in every case. There is also deliberate ignorance in institutions. A wonderful example is the increased proportion of women in the world’s great symphony orchestras. This is partly due to the fact that the orchestras have introduced so-called blind auditions when admitting new musicians. This means that the candidate’s audition takes place behind a screen. In this way, the jury can protect itself from its own potential prejudices. Information about the appearance or gender of the person auditioning has no influence. The judgement then only concerns the person’s musical performance. The jury comes to a fairer assessment.
So there are very different motives when we decide to not want to know something. Sometimes there are also combinations of motives. It is not easy to judge whether this deliberate ignorance is justified, is ethically or morally right or not.
So we protect ourselves from prejudice, from hurting feelings – our own, those of others. This means that knowledge has incredibly strong emotional and psychological potential.
This is true, although, from an economically rational perspective, deliberate ignorance is difficult to understand at first. In the concept of homo economicus, knowledge is basically equivalent to money. If you had too much of it, you could just give it away, donate it, spend it or take it to the bank. Looked at this way, more knowledge is always better than less knowledge. However, this requires the knowledge that I have gained – but want to get rid of again – to then no longer have any influence on my thoughts and emotions. In economics, it is assumed that knowledge only ever has an instrumental value. This is obviously an incorrect simplification. For example, if I learned that I had a high risk of developing Alzheimer’s or breast cancer, it would be difficult to set this knowledge aside and ignore it from then on. Knowledge has an independent psychological quality and reality which, under certain circumstances, also has its psychological costs.
Does deliberate ignorance also exist in science?
Yes. In genetics, specifically with regard to human cloning, or in the development of the atomic or hydrogen bomb, scientists have asked themselves whether it is ethically justified to put certain knowledge into the world. But this is not the only context in which deliberate ignorance has an effect in science. Think, for example, of institutional norms and agreements. Deliberate ignorance appears very often here, for example in the assessment system. When articles or funding applications are assessed, they are often anonymised. The jury then does not know who submitted the application, and those being assessed do not know who is evaluating them. The aim here is always to make the fairest possible judgement.
In some areas, not doing research is connected with ethical issues. This means that the cultural context also plays a role. In China, there are perhaps fewer inhibitions about promoting cloning than in Europe.
I find the question really fascinating. I don’t have any data on this yet. But I could easily imagine that the areas in which we consciously do not want to know something also differ culturally. This probably also concerns how we deal with the topic of death. Depending on which emotions, including culturally and historically influenced ones, are triggered by thinking about our own death, this may also lead to different preferences regarding the desire to know the time of our death. I could also easily imagine that there are significant cultural differences in the area of sexuality, where there is certainly also deliberate ignorance.
Let’s talk about science again. Isn’t deliberate ignorance also a way of reducing complexity?
Every scientific model is a reduction of the complexity of reality. And in the process of modelling an issue, we make decisions not to include things. If we did not do this, the models might no longer be manageable. Highly complex models have the disadvantage that we can possibly explain everything with them. But if we can explain everything, we can basically explain nothing.
Does deliberate ignorance also have a place in education?
This is a question that we – a philosopher, two cognitive scientists and an educationalist – have just been dealing with. We worked with the pair of terms “critical thinking” and “critical ignoring”. Critical thinking is, so to speak, the supreme discipline at school. We want to educate and enable citizens to think critically. And that is absolutely right.
But there is also a modern world that we have constructed ourselves, where critical thinking sometimes also leads into a trap. The digital world means we are confronted with a great deal of information and also misinformation. This world is based on a single business model – to attract our attention and then sell this attention as a commodity. If we followed every lead and tried to critically think through every attention-seeking piece of information we are presented with, we would be walking right into the trap of this attention market. Therefore, in this environment, we also have to learn and apply a different strategy, something like critical ignoring.
What exactly do you mean by that?
In our work we have focused on three strategies. The first we would call “self-nudging”. By this we mean that in the digital world, people themselves must become the architects of their environment. This means, for example, that I have to think carefully about which media apps I put on my home screen or how I design the notification function of my social media or email program. So I myself can regulate the flood of information and the external control by the programs by actively designing the decision architecture, as I call it, in such a way that I have the greatest possible control over what information I take in and what I ignore.
Another strategy describes what is called “lateral reading” in pedagogy. We are constantly confronted with websites whose trustworthiness we cannot judge. When I read laterally, instead of jumping straight into the content of a page, however respectable it might look, I first look for information about the actual origin of the website. Because even websites with misinformation often look very respectable. Lateral reading is a better strategy than critical thinking here when deciding whether or not to pay attention to a page.
A third strategy is what we call “ignoring the troll”. Trolls not only spread disinformation, they can also trigger emotionally painful and very stressful experiences. It is often a better strategy to ignore such messages and block the sites.
There are no doubt more strategies. But this was our first attempt to describe critical ignoring in terms of clear mental strategies.
Can deliberate ignorance also be a kind of didactic tool under certain circumstances?
That is an interesting question. I find deliberate ignorance so fascinating because it gives us a new understanding of what knowledge actually is, what meaning, what psychological and emotional reality and power it can unfold. We also gain better awareness of the role knowledge plays in our society and in our history. Is our history really a history of constantly increased knowledge or are there also moments when we decided we would rather not know certain things? In this respect, I find reflecting on deliberate ignorance an interesting didactic tool to stimulate us intellectually, sometimes provoke us and make us think.
As a rule, however, we are urged to constantly accumulate new knowledge or always learn something new. But there are also limits to this. Is there a connection between deliberate ignorance and forgetting?
There is a phenomenological similarity. With functional forgetting, we remove things from our psychological present that we can no longer use, because knowledge can also lose its value. Or we erase memories that are an emotional burden for us. Post-traumatic disorders often arise precisely because we cannot forget things.
We could say that forgetting leads to unnecessary or no longer current knowledge leaving the cognitive system, or at least no longer competing with current and helpful knowledge. Deliberate ignorance means that certain knowledge does not enter the system in the first place. So we could speak of two sides of the same coin.
In a world which also includes political manipulation, the strategies you describe are likely to have a role in political education, too.
Absolutely. Twenty or thirty years ago, most of us relied on the quality of certain gatekeepers, traditional media for example. Today, it is no longer these media alone that produce information; potentially everyone who uses social media can. In addition, there are all kinds of agents, artificial or human, who may spread information to manipulate us and our opinions.
This means that, as consumers of information, we need to be equipped with new skills to deal with, on the one hand, the sheer flood of news and, on the other hand, with what the World Health Organisation has called an “infodemic” in the context of the coronavirus pandemic, meaning the flood of misinformation and disinformation spread for strategic purposes. So we have to do the quality control ourselves, but we can only do it to a limited extent. That is why we must also learn to design the architecture of our own media and digital world. For example, do we allow our browser to endlessly collect information about us and bombard us with advertisements, or do we activate ad blockers and “private browsing”?
But people who believe in conspiracy theories also create their own media world. They dismiss traditional media as the “lying press” and trust only the sites that reflect their own opinions.
You are right about that. I would also not argue that critical ignoring is a strategy that gets someone out of their conspiracy theory corner. Critical ignoring or deliberate ignorance can be a good strategy for all those who do not want to get caught between the millstones of different platforms and strategic interests. But it is not a solution for people who have opted out of fact-based discourse.
So critical ignoring would have to be applied before someone slides into conspiracy theories.
That is how I would see it. And that is why, in my eyes, critical ignoring should play a greater role as an important cultural skill in school education and adult education as well.