This paper examines how the increasing use of social media and other digital transformations affect and challenge trust relations between science, media and society. In response to these challenges, the paper discusses the development of new tools for fact-checking and quality control of scientific information online. The authors urge political leaders to back and value scientific methods and standards of research integrity, and support digital innovations to overcome threats to public reasoning and scientific discourse.
The paper warns that the widespread use of social media as a source of information might lead to a ‘context collapse’ of information, reinforce people’s confirmation biases and ultimately deepen the polarisation of societal groups through so-called ‘echo chambers’ and ‘filter bubbles’. It also points to a growing corporatisation of communication, a lack of funding for quality science journalism, (geo-)political computational propaganda and disinformation campaigns, and an increasingly polarised political climate.
All these trends have substantial consequences for the communication of science and might threaten the core pillars of trust in both science and the media: integrity, transparency, autonomy and accountability. The paper proposes developing new mechanisms that allow researchers, journalists and other communicators of research to safeguard and reinforce these pillars and counter a loss of trust and trustworthiness.
Researchers “need to convincingly prove that a free and just society means a society in which all people are equal, but not all expressions are equally true. It is a society in which everyone should have unrestricted access to data and information, but also the opportunity and civic duty to acquire the skills needed to evaluate knowledge claims. This is why it is crucial to reflect on how we can effectively organise and defend a democratic digital society in which trust in expertise is anchored in longstanding and well-established standards – but wrapped in new mechanisms.”
The paper argues that researchers “need to become even more transparent, more ‘observable’, and more public than before”, engage in online debates within their field of expertise and “guide non-experts by systematically deconstructing and refuting deceitful stories and outright fabrications”. “Automated tools for fact-checking, flagging, online linking and referencing have to be developed and carefully tested in order to help citizens identify quality information”, the authors argue.
However, as the paper concludes, there are limits to what a well-intentioned and motivated scientific community can do to overcome these obstacles merely by improving its (digital) communication. Without political backing that values scientific methods and standards of research integrity, and effectively protects science and society from the threats identified in the paper, “all well-meaning efforts might come to naught and look like bringing origami flowers to a machine-gun fight.”