News headlines almost always deal in data-free absolutes. Take this recent strapline from an item on an Australian news site: "Drinking two or more colas a day – whether sweetened with sugar or an artificial sweetener – doubles your risk of chronic kidney disease, according to new research." And, at the time of writing, the media is full of the news that modern antidepressants don't work, although the actual research papers on which such headlines are based are not quite so definitive in their conclusions.
The cola story is fairly typical of the many health stories that cross the news wires on a daily basis: if it isn't artificial sweeteners, sugar, fat, and cholesterol, then it's organic pollutants, prescription drugs, and electromagnetic radiation. The headlines inevitably contrast starkly with the output of government and industry, which seeks to quash our fears and to emphasise that doubling a tiny, tiny risk is no big deal.
It is not just health scares that are problematic. We are similarly warned, time and again, of the dangers of this or that behaviour and of the effects on our lives of multifaceted issues such as pollution and the changing climate.
It is difficult to disentangle cause from effect. Is the growing number of scare stories feeding a healthy public scepticism of technology, or does it simply feed on an existing reluctance to trust technical expertise and science? Either way, it is a problem for those in science and technology, who face repeated impediments to their work that are, more often than not, based on unfounded qualms and misrepresented statistics.
There have been industrial accidents, of course; humans are indeed exposed to new chemicals, ecological systems do get harmed, and there are genuine uncertainties about technological advances such as genetically modified organisms and nuclear power.
Writing in the International Journal of Global Environmental Issues (2008, 8, 132-146), social scientist Elisabeth Graffy and civil engineer Nathaniel Booth of the US Geological Survey, in Middleton, Wisconsin, argue that it should be possible to restore public confidence in the validity of risk assessment and, at the same time, to improve its outcomes. Doing so, they suggest, will require a reframing of risk research and communication so that scientific knowledge evolves in parallel with public understanding rather than the two remaining entirely disjointed. While the approach they propose is unlikely to eliminate the see-saw headlines we see every day, it could have an effect on at least some issues.
Graffy and Booth report on a web-based platform that was developed to link experts and public discourse through shared information resources. Such a site "could simultaneously foster greater public awareness of the links between environmental and human health vulnerabilities, advances in scientific evaluation and assessment," they explain, and lead to improved communication between the two. The prototype system, they add, was well received by scientists and the public alike.
However, the researchers concede that there is much room for improvement in their web-based approach if the goal is to reach the general public. In their experiment, it was policy makers at various levels who got the most from it, whereas its impact on general public users was mixed. Research by communications professors Craig Trumbo of Colorado State University, in Fort Collins, Colorado, and Katherine McComas of Cornell University, in Ithaca, New York, reported in the same issue of the journal (2008, 8, 61-76), points out that there are many intrinsic difficulties in engaging the public with sci-tech matters, particularly when health or environmental risks are involved.
They have looked at how public trust in institutions, whether governments, companies, or other organisations, affects the way in which people process information and perceive risk. Their data are based on US state health department investigations into 30 suspected cancer clusters. The researchers assessed trust in three information sources: state health departments, civic groups, and the industries involved in each case.
Perhaps unsurprisingly, Trumbo and McComas found that when people trust the state health department's information, their perception of risk is lower, while trust in civic groups, which are often activist groups, is associated with a perception of greater risk.
But the situation is not quite that clear cut. "The manner by which these associations may form is linked to the way that people process the information," Trumbo told Sciencebase. "Trust associated with civic groups, for example, is aligned with having to use a systematic strategy for information processing. This is a more effortful and involved form of processing. Conversely, trust associated with industry or state sources is aligned with stronger heuristic information processing. This is a less effortful and more 'rule of thumb' manner of thinking."
“Trust is a precious commodity for those communicating risk and perhaps acutely so for industry and governmental risk communicators who are typically considered ‘less trustworthy’ sources,” the researchers say.
It could be that the underlying problem, the disparity between the perceived trustworthiness of civic, state, and industry sources, comes down to public understanding of science: the scientific process, and what counts as scientific evidence and what does not. But it is also recognised that science itself is not a value-free enterprise and that in many contexts, especially cancer clusters, there is a high degree of uncertainty even for the scientists. In such contexts there is little to be gained by assuming that if only we could educate people about science they would arrive at the "correct" opinion about risk. These contexts call for a balanced approach to risk communication, one that recognises the differences between data-free absolutes, pseudoscientific opinion, legitimate value-based non-expert reactions, and the role of a scientific approach in risk assessment, Trumbo explains.
In a follow-up article, I will discuss my experience of asking for feedback on the issue of public trust in science on the business networking site LinkedIn. Watch this space, or better still subscribe to the Sciencebase newsfeed to make sure you don't miss it. My item on Trust is now scheduled to post on March 24, 2008.