How Certain Are Scientists Regarding Climate Change?
Description
from NCAR’s Uncertainty and the Nature of Science
Uncertainty means different things in public and scientific discourse. To many, uncertainty means not knowing. To scientists, uncertainty means how well something is known.
In the video below, NCAR senior scientist Linda Mearns talks about scientific uncertainty and the fact that uncertainty is a given, familiar to us all. What we need to ask, and quantify, in the face of uncertainty is: when do we know enough to act?
https://www.youtube.com/watch?v=IEkle7zh8Ys
Why is uncertainty a part of science? How can we make sense of uncertainty?
From Sense about Science’s Making Sense of Uncertainty
Scientific uncertainty is prominent in research that has big implications for our society: could the Arctic be ice-free in summer by 2080? Will a new cancer drug be worth its side effects? Is this strain of flu going to be a dangerous epidemic?
Uncertainty is normal currency in scientific research. Research goes on because we don't know everything. Researchers then have to estimate how much of the picture is known and how confident we can all be that their findings tell us what's happening or what's going to happen. This is uncertainty.
But in public discussion, scientific uncertainty is often presented as a deficiency of research. We want (even expect) certainty: safety, effective public policies, useful public expenditure. Uncertainty is seen as worrying, and even a reason to be cynical about scientific research, particularly on subjects such as climate science, the threat of disease or the prediction of natural disasters. In some discussions, uncertainty is taken by commentators to mean that anything could be true, including things that are highly unlikely or discredited, or that nothing is known.
Some clearer ideas about what researchers mean by scientific uncertainty, and about where uncertainty can and cannot be measured, would help everyone respond to the uncertainty in evidence.
In the areas of research that are most often in the public eye, uncertainty has become a big point of misunderstanding (even conflict) between scientists and commentators. A researcher presents his or her findings, and the radio interviewer (or the politician, journalist or official) asks: "Can you be certain?" The researcher has to answer truthfully "no" and then defend their findings, for fear they will be interpreted as meaningless. In fact, the researcher has provided important limits on the uncertainty.
Researchers use uncertainty to express how confident they are about results, to indicate what scientists don't yet know, or to characterize information that is by nature never black and white. But saying that something is uncertain has a negative connotation in everyday language. When a researcher says the predictions made on the basis of their research have a margin of uncertainty, they mean they are very confident that the outcome will fall within the predicted range. But a commentator is likely to conclude from this that the piece of research is unreliable.
This is the type of disconnection we see in media reports of global warming.
Read the article Making Sense of Uncertainty (https://senseaboutscience.org/wp-content/uploads/2016/11/Makingsenseofuncertainty.pdf) to get a better understanding of uncertainty in the sciences (also found in Read).
Uncertainty and Climate Science
From the IPCC’s “Guidance Note for Lead Authors of the IPCC Fifth Assessment Report on Consistent Treatment of Uncertainties” (Find in Read)
The evolving nature of climate science, the long time scales involved, and the difficulties of predicting human impacts on and responses to climate change mean that many of the results presented in IPCC assessment reports have inherently uncertain components. To inform policy decisions properly, it is important for uncertainties to be characterized and communicated clearly and coherently. Since its second assessment, the IPCC has issued formal guidance for characterizing and communicating uncertainty in its reports. The guidance is intended to provide a common language for expressing confidence in the conclusions and in the likelihood that a particular event will occur.
Level of Confidence
Figure 1 explains the basis of confidence in terms of level of evidence and degree of agreement. The IPCC authors used the following dimensions to evaluate the validity of a finding: the type, amount, quality, and consistency of evidence (summary terms: limited, medium, or robust), and the degree of agreement (summary terms: low, medium, or high). Generally, evidence is most robust when there are multiple, consistent independent lines of high-quality evidence.
See the attached document for Figure 1.
Figure 1. A depiction of evidence and agreement statements and their relationship to confidence. The nine possible combinations of summary terms for evidence and agreement are shown, along with their flexible relationship to confidence. In most cases, evidence is most robust when there are multiple, consistent independent lines of high-quality evidence. Confidence generally increases towards the top-right corner as suggested by the increasing strength of shading.
A level of confidence provides a qualitative synthesis of an author team's judgment about the validity of a finding; it integrates the evaluation of evidence and agreement in one metric. As the second step in determining the degree of certainty in a key finding, the author team decides whether there is sufficient evidence and agreement to evaluate confidence. This task is relatively simple when evidence is robust and/or agreement is high. For other combinations of evidence and agreement, the author team should evaluate confidence whenever possible. For example, even if evidence is limited, it may be possible to evaluate confidence if agreement is high. Evidence and agreement may not be sufficient to evaluate confidence in all cases, particularly when evidence is limited and agreement is low. In such cases, the author team instead presents the assigned summary terms as part of the key finding.
The qualifiers used to express a level of confidence are very low, low, medium, high, and very high.
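To make Figure 1's evidence-agreement matrix concrete, here is a small illustrative sketch. Note the hedge: the IPCC guidance explicitly says the relationship between evidence, agreement, and confidence is flexible, not mechanical, so the index-summing heuristic below (the function `suggested_confidence` is our own invention) only illustrates the general tendency that confidence increases toward robust evidence and high agreement.

```python
# Summary terms from the IPCC AR5 uncertainty guidance note.
EVIDENCE = ["limited", "medium", "robust"]    # evidence summary terms
AGREEMENT = ["low", "medium", "high"]         # agreement summary terms
CONFIDENCE = ["very low", "low", "medium", "high", "very high"]

def suggested_confidence(evidence: str, agreement: str) -> str:
    """Toy heuristic: sum the positions of the two summary terms
    (0..4) and read off a confidence qualifier. A real IPCC author
    team weighs evidence and agreement qualitatively instead of
    applying any fixed formula like this one."""
    score = EVIDENCE.index(evidence) + AGREEMENT.index(agreement)
    return CONFIDENCE[score]
```

For instance, robust evidence with high agreement lands in the top-right corner of the matrix ("very high" confidence), while limited evidence with low agreement lands in the bottom-left ("very low"), matching the shading described in the Figure 1 caption.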
Likelihood of an Outcome
Likelihood, as defined in Table 1, provides calibrated language for describing quantified uncertainty. It can be used to express a probabilistic estimate of the occurrence of a single event or of an outcome (e.g., a climate parameter, observed trend, or projected change lying in a given range).
The IPCC Authors characterized key findings regarding a variable using calibrated uncertainty language that conveys the most information to the reader, based on the criteria below (See Table 1). These criteria provided guidance for selecting among different alternatives for presenting uncertainty, while recognizing that in all cases it is important to include a traceable account of relevant evidence and agreement.
Table 1. Likelihood Scale (see attached document)
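The calibrated likelihood language can be sketched in code. The probability ranges below are taken from the published AR5 likelihood scale; because the ranges overlap (e.g., any "very likely" probability is also "likely"), a given probability can match several terms, and authors choose the most informative one. The helper `likelihood_terms` is an illustrative name, not an IPCC tool.

```python
# The IPCC AR5 likelihood scale: (term, lower bound, upper bound)
# of the probability of an outcome, per the guidance note.
LIKELIHOOD_SCALE = [
    ("virtually certain",      0.99, 1.00),
    ("extremely likely",       0.95, 1.00),
    ("very likely",            0.90, 1.00),
    ("likely",                 0.66, 1.00),
    ("more likely than not",   0.50, 1.00),
    ("about as likely as not", 0.33, 0.66),
    ("unlikely",               0.00, 0.33),
    ("very unlikely",          0.00, 0.10),
    ("extremely unlikely",     0.00, 0.05),
    ("exceptionally unlikely", 0.00, 0.01),
]

def likelihood_terms(p: float) -> list[str]:
    """Return every calibrated term whose range contains probability p,
    in order from most to least specific within each half of the scale."""
    return [term for term, lo, hi in LIKELIHOOD_SCALE if lo <= p <= hi]
```

For example, a 97% probability qualifies as "extremely likely" (and, less specifically, "very likely", "likely", and "more likely than not"), which is why IPCC authors pick the term that conveys the most information to the reader.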
Read the guidance note (https://www.ipcc.ch/site/assets/uploads/2018/05/uncertainty-guidance-note.pdf) published by the IPCC regarding the treatment of uncertainties.