Sci&Tech writer Emily Adams examines the pros and cons of COVID-19 censorship, and argues for an approach focussed on education
At what point does knowledge become fact? There was a time when the greatest minds on our planet agreed that the Earth was flat, or that leeches were a valid medical tool. A precondition for scientific integrity and progress is the open, informed contestation of ideas.
But what happens when evidence-based debate is supplanted by political and social agendas? In other words, can opinions that deviate from scientific consensus become dangerous? This is a key question facing governments and scientists, magnified by the COVID-19 pandemic. Current debate asks whether institutions have the right to suppress dissent from the scientific status quo, particularly when governing bodies believe lives are at risk.
This topic has hit the headlines in recent months as controversial figures voice varying degrees of vaccine scepticism. Alongside this, governments have resorted to extreme measures, with Austria enforcing mandatory vaccination in January of this year. Governments thus face an ethical dilemma: balancing freedom of speech against reducing deaths from COVID-19. However, the power of censorship no longer resides solely in the hands of the state.
Social media platforms allow us to debate and share information, views and emotions. However, this rapidly expanding, lightly regulated space can blur the line between fact and opinion, and it is these digital spaces that now face questions of censorship over vaccine misinformation. Do these private companies have a right, or a responsibility, to shut down potentially harmful content?
The Institute for Strategic Dialogue (ISD), a think tank that monitors extremism, has demonstrated that a few individuals can have a ‘disproportionate influence on the public debate’ using social media. Misinformation with harmful consequences has already spread this way: content linking COVID-19 to the installation of 5G towers has led to violent threats against engineers. This highlights the dangerous impact of pseudo-scientific conspiracy theories, though the question of how people medicalise their own bodies arguably follows a different trend.
The Royal Society has recently urged social media companies not to remove ‘legal but harmful’ content. Instead, it advises stemming the flow of vaccine misinformation by altering algorithms so that such content is harder to access and share, for example by preventing certain posts from appearing automatically in people’s feeds.
While people have the right to express their opinions, some argue they do not have an automatic right to a widespread audience that could be harmed by consuming certain views. Prof Gina Neff, a social scientist at the Oxford Internet Institute, argues that the algorithmic approach ‘[ensures] that people still can speak their mind’ while denying them a wide reach of influence.
The Centre for Countering Digital Hate (CCDH) takes an alternative position, advocating a removal of incorrect content on the basis that it can cause harm.
However, suppression of dissident views not only raises ethical questions but also risks exacerbating the problem. Austrian research last year suggested that mistrust in the COVID-19 vaccine correlates with mistrust in authority. Professor Rasmus Kleis Nielsen, Director of the Reuters Institute for the Study of Journalism at the University of Oxford, concurs: ‘I imagine that there are quite a lot of citizens who would have their worst suspicions confirmed about how society works, if established institutions took a much more hands-on role in limiting people’s access to information.’
Moreover, not all vaccine scepticism lies at the extreme end of the spectrum. Blanket suppression of dissident content risks silencing legitimate concerns from credible academics, and such concerns are necessary to drive forward and improve scientific understanding. If direct censorship is implemented, social media platforms must be careful where they draw the line between dangerous, incorrect content and legitimate scientific debate.
Tackling vaccine misinformation is essential to protect public health, but reasonable scepticism risks being caught in the crossfire. Equipping the public with the tools to identify incorrect content and analyse data for themselves preserves individual liberties and curbs a reactionary backlash. The best way to tackle scientific misinformation is education.