The Precautionary Principle is essentially the idea that the risks and opportunities of scientific developments should be understood before such developments are given the go-ahead (see my comments on Computer Based Trading). The first article distinguishes "risks", probabilities that can be accurately quantified, from "uncertainties", probabilities that cannot be quantified. This is Frank Knight's distinction from economics, which develops de Morgan's definition of a "risk" as an error in a scientific measurement. Since the conventional definition of a "risk" is the possibility of a loss, mathematicians often prefer "chance" to represent a quantifiable uncertainty.
The article goes on to make the comment
Under uncertainty, then, it is not merely difficult in practice to calculate some single definitive "sound scientific", "evidence based" solution. The point is, it is irrational even to try, let alone claim, this.

Let me try to elaborate on how I interpret this statement from the perspective of finance. Ole Peters, a physicist, has recently produced a series of papers arguing that the standard approach to probability employed in physics, taking ensemble averages, fails when we are confronted with financial problems, where we need to take time averages instead. Peters' arguments are rooted in the famous Petersburg Paradox, which was central to eighteenth-century discussions of probability in the context of an investment decision. By the end of the eighteenth century, consideration of the game had led to the realisation that the conventional approach to probability (based on counting relative frequencies) was only applicable if experiments were repeatable. If a player, when presented with "Game Over", does not have the option to "Play again", much of statistical theory becomes irrelevant. This point had appeared earlier in Pascal's Wager, which is still relevant today: the player is confronted with an unrepeatable choice which offers an infinite loss, as does fiddling with nature. When presented with such a choice the player must play safe.
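Peters' standard illustration of the ensemble/time distinction is a multiplicative coin toss: heads multiplies your wealth by 1.5, tails by 0.6. The ensemble (expected) wealth grows by 5% per round, yet almost every individual player's wealth decays over time. A minimal simulation sketch (the round and player counts are my own choices):

```python
import random

random.seed(1)

def play(rounds):
    """One player's wealth through repeated 50/50 coin tosses:
    heads multiplies wealth by 1.5, tails by 0.6."""
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

# Ensemble average: expected wealth grows each round, since
# 0.5 * 1.5 + 0.5 * 0.6 = 1.05, a 5% gain "on average".
# Time average: a single player's long-run growth factor per round is
# sqrt(1.5 * 0.6) = sqrt(0.9) ~ 0.949, so a typical trajectory decays.
players = [play(20) for _ in range(100_000)]
ensemble_mean = sum(players) / len(players)
median = sorted(players)[len(players) // 2]

print(f"ensemble mean after 20 rounds:   {ensemble_mean:.2f}")
print(f"median player after 20 rounds:   {median:.3f}")
```

The gap between the two numbers is the point: the ensemble mean is propped up by a handful of astronomically lucky players, while the median player, the one whose single life is not repeatable, loses money.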
The point from this first article is that while physics often sees uncertainty as a lack of information in a complex, but deterministic, system, economists have, for at least a hundred years, been thinking that there is something much more significant in the problem of uncertainty. In particular, post-Keynesian economists place the issue of "non-ergodicity", the possibility that the basic parameters governing the deterministic universe could change over time (i.e. that the constant G in physics is not, in fact, constant), at the heart of the problem.
The second article argues that the precautionary principle is a blunt instrument that will prevent anything good happening and is "a 90s throwback out of place in an era of 'smart solutions' and big data". This implies that "back" in the 90s science was not capable of handling uncertainty, but that now, with "big data", it is. The second article reads as if there is no problem with Determinism, just with imperfect humans who need science to unlock the hidden mysteries of the universe, and as if, by being cautious, we delay humanity's progression to enlightenment.
The problems of science and technology are embedded in Western culture, older than science itself, appearing in Hesiod's Works and Days, the first European text on economics, predating Thales by a couple of centuries. Works and Days includes the story of Pandora, which starts with the theft from Zeus of the secret of fire, and from Hephaestus and Athena the crafts, by the Titan Prometheus (‘forethought’). Prometheus had created mankind and passed these secrets on to humans. Zeus, not satisfied with punishing Prometheus for the theft by chaining him to a rock and having his liver eaten each day, decided to punish the mortals as well. He asked Hephaestus to create the first woman, Pandora (‘all gifts’), who was given such characteristics as beauty by Aphrodite, cunning by Hermes and the skill to spin thread by Athena. Zeus ensured she was also lazy and foolish.
Pandora was sent to Earth to seduce Prometheus’s dim-witted brother, Epimetheus (‘afterthought’), with a single possession: a jar which had been given to her under the strict instruction never to open it. However, Pandora had been given the gift of curiosity by Zeus’s long-suffering wife, Hera, and so one day she opened the jar, which contained all the evils that afflict mankind: disease, strife, war and the need to work. These all escaped into the world. Realising her mistake, Pandora put the lid on the jar, trapping the last thing left in there – Hope.
My concerns with this article are numerous. Firstly, it seems to suggest that concerns with scientific developments are a modern aberration, which ignores the significance of the Pandora myth. Secondly, data, including "big data", is barren: it only has meaning in relation to a model, and models are invented by fallible humans. With the best intentions in the world it is possible to apply data to a model, erroneously believed to be correct, and come to the wrong conclusion. A notable example is the recent financial crisis, but science and technology are littered with engineering failures; since the Financial Crisis of 2007 alone we have had Fukushima and Deepwater Horizon.
These ideas were touched upon in the third article, which looks to replace the precautionary principle with a less onerous "proactionary principle":
The proactionary principle valorises calculated risk-taking as essential to human progress, where the capacity for progress is taken to define us as a species.

What strikes me is the implicit assumption that the uncertainty of the future is an epistemological problem (relating to the bounds of knowledge) rather than an ontological problem (whether an answer actually exists). This distinction is fundamental to the difference between the economists Knight and Keynes.
While I was working in the oil exploration business, confronting the practicalities of balancing risks and opportunities resulting from an uncertain world, Alex Kacelnik presented the following game. Consider a small bird in a British field which must find 9 worms in a six-hour day to survive a cold winter's night. If the bird stays in its current location it has a 50:50 chance of finding 1 or 2 worms in an hour; alternatively the bird could fly to another field, using the energy gained by eating a worm, where there is a 1 in 6 chance of finding 10 worms, but a 5 in 6 chance of finding only a single worm. The first strategy is less "variable" and is conventionally regarded as less "risky"; the second strategy is more "risky", and orthodox investment theory (as followed by the firm I was working for) would argue against it as a strategy.
Consider a bird which has another hour to search for worms but has found only 6 in the preceding 5 hours. There is no way the bird can survive by "playing safe"; its only rational choice is to hope to get lucky with the risky strategy. Similarly, if a bird takes the risk in the first hour and gets the 9 worms it needs, it can carry on taking risks all day, potentially building up its breeding reserves in the sure knowledge that it will survive the night. It turns out that, in general, it is beneficial for both the rich and the poor (in resources) to take risks; it is the middle classes, muddling through precariously, who should, rationally, play safe.
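The two cases above can be checked with a small backward-induction sketch of Kacelnik's game. The formalisation is my own: I treat the quoted worm counts as net of the flight cost, and assume the bird chooses, each hour, the option that maximises its probability of surviving the night.

```python
from functools import lru_cache

# State: hours of daylight remaining, worms still needed to survive.
# "safe":  1 or 2 worms, each with probability 1/2 (stay in this field).
# "risky": 10 worms with probability 1/6, 1 worm with probability 5/6
#          (fly to another field).

@lru_cache(maxsize=None)
def survive(hours, needed):
    """Probability of surviving the night under the best choice each hour."""
    if needed <= 0:
        return 1.0          # quota already met
    if hours == 0:
        return 0.0          # night falls with worms still missing
    safe = 0.5 * survive(hours - 1, needed - 1) + 0.5 * survive(hours - 1, needed - 2)
    risky = (1/6) * survive(hours - 1, needed - 10) + (5/6) * survive(hours - 1, needed - 1)
    return max(safe, risky)

def best_choice(hours, needed):
    """Which strategy is optimal in the current state."""
    safe = 0.5 * survive(hours - 1, needed - 1) + 0.5 * survive(hours - 1, needed - 2)
    risky = (1/6) * survive(hours - 1, needed - 10) + (5/6) * survive(hours - 1, needed - 1)
    return "risky" if risky > safe else "safe"

# The bird with one hour left and only 6 of its 9 worms (3 still needed)
# cannot reach 3 worms playing safe, so only the risky field offers hope.
print(best_choice(1, 3), survive(1, 3))   # risky, probability 1/6
# Needing only 2 worms in the last hour, playing safe is optimal.
print(best_choice(1, 2), survive(1, 2))   # safe, probability 1/2
```

The same recursion, started from `survive(6, 9)`, gives the bird's overall survival chance at dawn; running the policy forward reproduces the qualitative pattern in the text, with risk-taking optimal at the desperate and the comfortable ends of the resource scale.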
The essential point is that risk preferences are elusive, and it is impossible to rationally identify the optimal choice when the future is unknown and there is the risk of death or extinction. For someone with "no future", taking a risky choice is rational; for someone who might become only slightly better off by taking the gamble, it is absurd. The problem for those advocating the proactionary principle is that it is the risk-averse middle classes who, through their taxes, fund the majority of research, and while scientists might be able to produce a number telling them that "all is safe", people are quite capable of arriving at their own intuitive assessment of risks.
The proactionary principle, with its belief in the ability of well-trained scientists to come to the right conclusion, was precisely the principle that led to the financial crisis. One of my favourite characters of the crisis was "X", who was rolled out by Goldman Sachs in August 2007 to explain some losses the bank had suffered as the Credit Crisis emerged (the explanation was reported by the FT and commented on here - open access comment). "X" had invested around $500,000 in an Ivy League degree, followed by a Masters in Statistics from Columbia (not something to be sniffed at) and then a PhD in Finance from Chicago. One might imagine that the sophisticated Goldman Sachs had gone through the ten-step programme of the Proactionary Principle (or similar); nevertheless they got it spectacularly wrong.
If you accept that there exists absolute Truth (either in a Platonic sense, that Reality exists independent of human thought, or a Formal sense, that some statements are tautologies) then you will not worry about Precaution: with a little effort Truth can be identified and progress can proceed from there. The final article in the series observes that, as with a Teddy Bear's picnic, "you're sure of a big surprise" when dealing with technology: we cannot be sure of Truth. This final piece summarises the earlier articles and mentions democracy, but without explaining the relevance of democracy to the scientific process. My view is that mathematical probability is a rhetorical device developed in order to help in consensus setting in the presence of uncertainty.
When faced with uncertainty, society needs to come to some form of consensus on how to proceed. Preventing the development of disruptive technologies is potentially harmful to the rich and the poor, while disruptive technologies will potentially harm the middle classes. There is no rationally defensible solution to this dilemma, and so society needs to agree a consensus that the majority "buy into". Mauss argued that if uncertainty was addressed in the open, science and religion emerged; if it was done in secret, magic evolved. Scientists should be careful not to slip into magic, and possibly end up burnt.