The Precautionary Principle is, in essence, the idea that the risks and opportunities of scientific developments should be understood before such developments are given the go-ahead (see my comments on Computer Based Trading). The first article distinguishes "risks", probabilities that can be accurately quantified, from "uncertainties", probabilities that cannot be quantified. This is Frank Knight's distinction from economics, which develops de Morgan's definition of a "risk" as an error in a scientific measurement. Since the conventional definition of a "risk" is the possibility of a loss, mathematicians often prefer "chance" to represent a quantifiable uncertainty.
The article goes on to make the comment:
Under uncertainty, then, it is not merely difficult in practice to calculate some single definitive "sound scientific", "evidence based" solution. The point is, it is irrational even to try, let alone claim, this.

Let me try to elaborate on how I interpret this statement from the perspective of finance. Ole Peters, a physicist, has recently produced a series of papers arguing that the standard approach to probability employed in physics, taking ensemble averages, fails when we are confronted with financial problems, and that we need to take time averages instead. Peters's arguments are rooted in the famous Petersburg Paradox, which was central to eighteenth-century discussions of probability in the context of an investment decision. By the end of the eighteenth century, consideration of the game had led to the realisation that the conventional approach to probability (based on counting relative frequencies) was only applicable if experiments were repeatable. If a player, when presented with "Game Over", does not have the option to "Play again", much of statistical theory becomes irrelevant. This point had appeared in Pascal's Wager, which is still relevant today in that the player is confronted with an unrepeatable choice which offers an infinite loss, as does fiddling with nature. When presented with such a choice the player must play safe.
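To make the ensemble/time distinction concrete, here is a minimal sketch (my own illustration, not taken from Peters's papers) of a multiplicative coin-toss gamble of the kind he discusses: wealth is multiplied by 1.5 on heads and 0.6 on tails. Averaged across an ensemble of players the gamble looks favourable, yet a single player who must live with their own history almost surely goes broke.

```python
import math
import random

# Hypothetical multiplicative gamble: wealth is multiplied by 1.5 on heads
# and by 0.6 on tails, each with probability one half.
random.seed(1)

def ensemble_average(players=100_000):
    """Average wealth across many independent players after one round."""
    total = sum(1.5 if random.random() < 0.5 else 0.6 for _ in range(players))
    return total / players

def time_average(rounds=10_000):
    """Wealth of a single player who repeats the gamble round after round."""
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

print("ensemble average, one round   :", ensemble_average())      # about 1.05
print("single player, 10,000 rounds  :", time_average())          # effectively 0
print("growth rate per round (theory):",
      0.5 * math.log(1.5) + 0.5 * math.log(0.6))                  # about -0.053
```

The ensemble grows at about 5% a round while the time-average growth rate is negative: whether the "expected value" is the relevant quantity depends entirely on whether you get to play again.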
The point from this first article is that while physics often sees uncertainty as a lack of information in a complex but deterministic system, economists have, for at least a hundred years, been thinking that there is something much more significant in the problem of uncertainty. In particular, post-Keynesian economists place the issue of "non-ergodicity", the possibility that the basic parameters governing a deterministic universe could change over time (i.e. that the parameter G in physics is not a constant), at the heart of the problem.
The second article argues that the precautionary principle is a blunt instrument that will prevent anything good happening and is "a 90s throwback out of place in an era of "smart solutions" and big data". This implies that "back" in the 90s science was not capable of handling uncertainty, but that now, with "big data", it is. This second article reads as if there is no problem with Determinism, just with imperfect humans who need science to unlock the hidden mysteries of the universe, and as if, by being cautious, humanity's progression to enlightenment is delayed.
The problems of science and technology are embedded in Western culture, older than science itself, appearing in Hesiod's Works and Days, the first European text on economics, predating Thales by a couple of centuries. Works and Days includes the story of Pandora, which starts with the theft from Zeus of the secret of fire, and from Hephaestus and Athena the crafts, by the Titan Prometheus (‘forethought’). Prometheus had created mankind and passed these secrets on to humans. Zeus, not satisfied with punishing Prometheus for the theft by chaining him to a rock and having his liver eaten each day, decided to punish the mortals as well. He asked Hephaestus to create the first woman, Pandora (‘all gifts’), who was given such characteristics as beauty by Aphrodite, cunning by Hermes and the skill to spin thread by Athena. Zeus ensured she was also lazy and foolish.
Pandora was sent to Earth to seduce Prometheus’s dim-witted brother, Epimetheus (‘afterthought’), with a single possession: a jar which had been given to her under the strict instruction never to open it. However, Pandora had been given the gift of curiosity by Zeus’s long-suffering wife Hera, and so one day she opened the jar, which contained all the evils that afflict mankind - disease, strife, war and the need to work - and these all escaped into the world. Realising her mistake, Pandora put the lid back on the jar, trapping the last thing left in there – Hope.
My concerns with this article are numerous. Firstly, it seems to suggest that concerns with scientific developments are a modern aberration, which ignores the significance of the Pandora myth. Secondly, data, including "big data", is barren; it only has meaning in relation to a model, and models are invented by fallible humans. With the best intentions in the world it is possible to apply data to a model, erroneously believed to be correct, and come to the wrong conclusion. A notable example is the recent financial crisis, but science and technology are littered with engineering failures since the Financial Crisis of 2007: for example, we have had Fukushima and Deepwater Horizon.
These ideas were touched upon in the third article, which looks to replace the precautionary principle with a less onerous "proactionary principle":
The proactionary principle valorises calculated risk-taking as essential to human progress, where the capacity for progress is taken to define us as a species.

What strikes me is the implicit assumption that the uncertainties of the future are an epistemological problem (relating to the bounds of knowledge) rather than an ontological problem (whether an answer actually exists). This distinction is fundamental to the difference between the economists Knight and Keynes.
While I was working in the oil exploration business, confronting the practicalities of balancing risks and opportunities resulting from an uncertain world, Alex Kacelnik presented the following game. Consider a small bird in Britain who must find 9 worms in a six-hour day to survive a cold winter's night. If the bird stays in its current location it has a 50:50 chance of finding 1 or 2 worms in an hour; alternatively the bird could fly to another field, using the energy gained by eating a worm, and there is a 1 in 6 chance of finding 10 worms in the new field, but a 5 in 6 chance of only finding a single worm there. The first strategy is less "variable" and is conventionally regarded as less "risky"; the second strategy is more "risky", and orthodox investment theory (as followed by the firm I was working for) would argue against it as a strategy.
Consider a bird who has another hour to search for worms but has only found 6 in the preceding 5 hours. There is no way that the bird can survive by "playing safe"; its only rational choice is to hope to get lucky with the risky strategy. Similarly, if a bird takes the risk in the first hour and gets the 9 worms it needs, it can carry on taking risks all day, potentially building up its breeding reserves with the sure knowledge that it will survive the night. It turns out that, in general, it is beneficial for both the rich and poor (in resources) to take risks, it is the middle classes who are muddling through precariously that should, rationally, play safe.
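For what it is worth, the bird's problem can be checked with a few lines of dynamic programming. This is my own sketch, not Kacelnik's calculation, and it reads "using the energy gained by eating a worm" as the flight costing one worm, so that both strategies yield 1.5 worms an hour on average: the safe option nets 1 or 2 worms, the risky option nets 0 or 9.

```python
from functools import lru_cache

TARGET, HOURS = 9, 6
# Hourly outcomes as (worms gained, probability). The risky option is read as
# costing one worm of flight energy, so both strategies average 1.5 worms/hour.
SAFE = ((1, 0.5), (2, 0.5))
RISKY = ((0, 5 / 6), (9, 1 / 6))

@lru_cache(maxsize=None)
def survive(worms, hours_left, policy):
    """Probability of ending the day with at least TARGET worms."""
    if hours_left == 0:
        return 1.0 if worms >= TARGET else 0.0
    choices = {"safe": [SAFE], "risky": [RISKY], "optimal": [SAFE, RISKY]}[policy]
    return max(sum(p * survive(worms + gain, hours_left - 1, policy)
                   for gain, p in choice)
               for choice in choices)

for policy in ("safe", "risky", "optimal"):
    print(f"{policy:7s} from the start:", round(survive(0, HOURS, policy), 3))

# The bird with 6 worms and one hour left: playing safe cannot work,
# taking the gamble gives a one-in-six chance.
print("6 worms, 1 hour, safe :", survive(6, 1, "safe"))
print("6 worms, 1 hour, risky:", survive(6, 1, "risky"))
```

Under this reading either fixed policy survives roughly two times in three from the start, but the bird that is behind with an hour to go has no rational option other than the gamble.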
The essential point is that risk preferences are elusive and it is impossible to rationally identify the optimal choice when the future is unknown and there is the risk of death or extinction. For someone with "no future", taking a risky choice is rational; for someone who might become only slightly better off by taking the gamble, it is absurd. The problem for those advocating the proactionary principle is that it is the risk-averse middle classes who, through their taxes, fund the majority of research, and while scientists might be able to produce a number telling them that "all is safe", people are quite capable of arriving at their own intuitive assessment of risks.
The proactionary principle, with its belief in the ability of well-trained scientists to come to the right conclusion, was precisely the principle that led to the financial crisis. One of my favourite characters of the crisis was "X", who was rolled out by Goldman Sachs in August 2007 to explain some losses the bank had suffered as the Credit Crisis emerged (the explanation was reported by the FT and commented on here - open access comment). "X" had invested around $500,000 in an Ivy League degree, followed by a Masters in Statistics from Columbia (not something to be sniffed at) and then a PhD in Finance from Chicago. One might imagine that the sophisticated Goldman Sachs had gone through the ten-step programme of the Proactionary Principle (or something similar); nevertheless they got it spectacularly wrong.
If you accept that there exists absolute Truth (either in a Platonic sense, that Reality exists independently of human thought, or a Formal sense, that some statements are tautologies) then you will not worry about Precaution: with a little effort Truth can be identified and progress can proceed from there. The final article in the series observes that, as with the Teddy Bears' Picnic, "you're sure of a big surprise" when dealing with technology: we cannot be sure of Truth. This final piece summarises the earlier articles and mentions democracy, but without explaining the relevance of democracy to the scientific process. My view is that mathematical probability is a rhetorical device developed to help in consensus-setting in the presence of uncertainty.
When faced with uncertainty, society needs to come to some form of consensus as to how to proceed. Preventing the development of disruptive technologies is potentially harmful to the rich and the poor, while the disruptive technologies themselves will potentially harm the middle classes. There is no rationally defensible solution to this dilemma, and so society needs to agree a consensus that the majority "buy into". Mauss argued that when uncertainty was addressed in the open, science and religion emerged; when it was addressed in secret, magic evolved. Scientists should be careful not to slip into magic, and possibly end up burnt.
I'm pretty sure that isn't about Hesiod's "works and days", but "Theogony".
The "works and days" is indeed a wonderful description of small village agricultural society, economy, debt, insurance through friendship and relations with your neighbours, morality, politics. And how it clashed with the law courts that were run in the market square of the larger towns. I found that book reminded me a lot of a small African town, with a village head man and some kind of village council, and a form of still developing land ownership, where in the past much of the land was for common use.
Theogony is about the creation myths and the pantheon of the gods.
I have the Pandora myth starting at line 47 of Works and Days; I am using the Oxford World's Classics edition (2008), pp 38-40.
"It turns out that, in general, it is beneficial for both the rich and poor (in resources) to take risks, it is the middle classes who are muddling through precariously that should, rationally, play safe."
That's interesting. But in my experience (not a very good basis for proof, I know) it appears to be the other way round. Middle class people take risks; the poor and rich are either too scared of the consequences of failure or content with the status quo. Look at the revolutionaries - middle class in most cases.
I really enjoyed the article. Thanks.
High modernism argues that uncertainty yields to progress. It says the more we can understand and manage risk, the more uncertainty recedes into the background. Thus, the wealthy ask the middle class to abandon hedging heuristics and outsource risk management to the financial sector. "Borrow more, save less" is a substitute for wage growth, and the rich get richer while the middle class gets less resilient. The result is a tightly-coupled system that actually creates more uncertainty, as we saw in 2008 and will likely see again. This is truly a bad bargain for the middle class.
Actually, it is the rich who take the fewest risks. They have the least to gain and the most to lose. That's why they're always fighting new ideas and new ways of doing business. The current low costs of food and manufactured goods could increase all of our living standards, but the wealthy have done what they can to maintain artificial scarcity. Look at the idiocy with health care. We could afford excellent health care for everyone paying half of what we do, except that this progress would harm too many wealthy, well connected people and corporations.
All good stuff, apart from the last 2 paragraphs. Like Keynes, I suspect that there is some external reality, but we are limited in what we can know about it. In particular, we can never honestly arrive at unconditional probabilities. The proactionary principle requires people to put forward numeric values, and in some cases one doesn't need much education to realise that such numbers must be begging some huge questions.
The financial sector defines risk in terms of variability, which presupposes that the future is - statistically at least - pretty much like the past. Which sometimes it isn't. More currently, the UK JIC assessment of CW use in Syria gives pseudo-probabilities that seem to me credible - as long as one accepts the assumptions behind their calculations, which they have not articulated.
The proactionary approach would be to go ahead on the JIC's assessment. The precautionary approach would be to try to uncover all the hidden assumptions, and to test them. If we take your link between attitude to risk and wealth, presumably a nation 'in a corner' or a dominant nation would attack, whereas we should be more careful. I am not so sure about the economic application. It seems to me that our economy relies on people being psychologically desperate to have a better car than their neighbours even when they are 'objectively' well off. You might also want to distinguish between types of disruptive technologies. Some are scarier than others!
Jorland, in his analysis of the Petersburg "paradox", makes the point that Laplace distinguished "mathematical expectation" from "moral expectation": "mathematical expectation" was applicable to repeatable events, "moral expectation" was a more elusive concept relevant to singular events. I think this distinction has been lost, but returning to it could help us appreciate Keynes and many policy decisions.
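For readers who want the arithmetic, the standard textbook illustration (not Jorland's or Laplace's own working) runs roughly as follows: the Petersburg prize is 2^k ducats with probability 2^-k, so the mathematical expectation diverges while the "moral" (logarithmic) expectation settles at 2 log 2, a certainty equivalent of just 4 ducats.

```python
import math

# Petersburg game: prize 2**k ducats with probability 2**-k, for k = 1, 2, 3, ...
# "Mathematical expectation": sum of prize * probability -> adds 1 per term, diverges.
# "Moral expectation" (log utility): sum of log(prize) * probability -> 2*log(2).

def mathematical_expectation(terms):
    return sum(0.5 ** k * 2 ** k for k in range(1, terms + 1))

def moral_expectation(terms):
    return sum(0.5 ** k * k * math.log(2) for k in range(1, terms + 1))

for n in (10, 100, 1000):
    print(n, mathematical_expectation(n), round(moral_expectation(n), 4))

# The moral expectation corresponds to a certain payment of about 4 ducats.
print("certainty equivalent:", math.exp(moral_expectation(1000)))
```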