Don’t take zero for an answer
February 25, 2016 by Bo Weidema
I often encounter the idea that some numbers are just too uncertain to use, and that it would be better not to quantify the issue but to describe it qualitatively instead. Earlier this month I was presented with another example: At the first meeting of the ISO working group on monetary valuation, one of the environmental economists present used the effects of electromagnetic fields near high-voltage transmission lines as an example of an impact that should not be quantified because the knowledge was just too uncertain.
I claimed that uncertainties cannot and should not be an argument for not quantifying. Even when describing something qualitatively instead – placing a flag, as some people say – the most likely consequence is that decision makers will leave the issue out of consideration altogether. Or maybe even worse: the missing quantification may be used by advocates for special interests to argue that the issue is important, even though a simple quantification of the worst case would have shown that it cannot be.
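As a minimal sketch of what such a worst-case quantification might look like, consider the following back-of-the-envelope check. All numbers here are invented purely for illustration – they are not estimates of any real impact, and the monetary unit and totals are hypothetical.

```python
# Hypothetical worst-case check (all numbers are invented for illustration;
# they do not represent any real assessment of EMF impacts).

# A deliberately pessimistic upper bound for the contested impact,
# expressed in the same monetary unit as the rest of the assessment:
worst_case_impact = 0.5          # e.g. EUR per functional unit, upper bound

# Total quantified impact of the product system in the same unit:
total_quantified_impact = 800.0  # EUR per functional unit

share = worst_case_impact / total_quantified_impact
print(f"Worst-case share of total impact: {share:.2%}")
# -> Worst-case share of total impact: 0.06%
```

If even the most pessimistic assumption yields a negligible share, the issue can be set aside with a clear conscience; leaving it unquantified, by contrast, leaves the door open to overstatement.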
Another often-stated argument against using uncertain numbers is that decision makers cannot handle uncertainties. But this is simply not true. Uncertainties are a certain part of life – we live with them every day. As Lise Laurin, CEO of EarthShift, put it in a recent discussion on PRé’s LCA discussion list:
‘We all want to know if it is going to rain on the day of our picnic. But what we get is a probability. If the forecaster says there is a 30% chance of rain, we may go ahead and schedule the picnic but have contingency plans just in case. The only thing we need to know about the process of forecasting the weather is that it’s really complicated’.
And, yes, quantifying environmental effects involves a lot of complicated issues – and some of these issues will certainly have high uncertainties. Sometimes our data are just not that accurate, either because the methodology is in its infancy or because no one has bothered to do a thorough study yet. I am a firm believer in presenting data with their inherent uncertainties, and in not excluding any part of the inventory just because it is complicated.
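As a minimal sketch of what presenting data with their inherent uncertainties can look like in practice, the snippet below propagates two hypothetical contributions through a simple Monte Carlo simulation and reports the result as a median with a 95% interval rather than a single point value. The medians and geometric standard deviations are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical inventory: two contributions with lognormal uncertainty,
# described by a median and a geometric standard deviation (a common way
# of expressing uncertainty in LCA databases). All values are invented.
contributions = [
    {"median": 120.0, "gsd": 1.2},  # well-characterised process
    {"median": 15.0,  "gsd": 3.0},  # poorly known process, high uncertainty
]

n = 10_000
total = np.zeros(n)
for c in contributions:
    mu = np.log(c["median"])   # lognormal location parameter
    sigma = np.log(c["gsd"])   # lognormal scale parameter
    total += rng.lognormal(mean=mu, sigma=sigma, size=n)

low, mid, high = np.percentile(total, [2.5, 50, 97.5])
print(f"Result: {mid:.0f} (95% interval: {low:.0f} to {high:.0f})")
```

The poorly known contribution widens the interval, but the result still tells the decision maker far more than an omission would – just as a 30% chance of rain tells us more than no forecast at all.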
Our role and duty towards policy- and decision-makers is not to decide that some impact or other has to be excluded because it has a high uncertainty. Rather, we may need to put more emphasis on educating our customers and stakeholders to ask for completeness and uncertainty in the information they are given – and not to take zero for an answer.