
Thursday, 13 September 2012

Parade's End

The BBC's transmission of Ford Madox Ford's Parade's End (a co-production with HBO, adapted by Tom Stoppard, who has a good appreciation of mathematics - Rosencrantz and Guildenstern Are Dead) reminded me that the central character, Christopher Tietjens, was an actuary, probably the most famous actuary in English literature.

I came to Ford through his collaborator Joseph Conrad: a teenage interest in Coppola's Apocalypse Now led me to Conrad, and a love of sailing inspired me to read all his novels while an undergraduate. I read the first two books of Parade's End after graduating.

I always thought Ford made Tietjens an actuary to highlight his fidelity, his trustworthiness. Statistics provides the foundation for our belief, our faith, in science, which is why Bertrand Russell (following Poincaré) observed
It is important to realise the fundamental position of probability in science. ... As to what is meant by probability, opinions differ (p 301, An Outline of Philosophy)
around the same time Ford was writing Parade's End. So, in making Tietjens an actuary, and the Second Wrangler of his year, Ford is emphasising Tietjens' faithfulness, which is most obvious in his relations with his adulterous wife, Sylvia, and the more compatible Valentine Wannop. Tietjens' character is magnified by placing him alongside the less virtuous, but more successful, Macmaster.

The trajectory of Tietjens' career, in which the trauma of serving at the front impaired his work as a mathematician, echoes the real-life experience of Émile Borel. Borel was the star of his generation of French mathematicians. His 1894 thesis laid the foundations for modern probability theory, and within 16 years he had been appointed Deputy-Director of the most prestigious of the French 'grandes écoles', the École normale supérieure. Borel served in the war, but more significantly, his adopted son was killed at the front. After the war, Borel anticipated von Neumann's work in Game Theory and established the Institut de Statistiques de l'Université de Paris, but after 1924 he abandoned mathematics for politics, serving as a minister and then, in his seventies, being active in the Resistance. In his lifetime he was awarded the Croix de Guerre, the Médaille de la Résistance with rosette, and the Grand Croix of the Légion d'Honneur.

When thinking about this, I remembered that Tietjens, as a statistician, had a series of battles with government over the manipulation of figures. Not much has changed: the government is still accused of doctoring the numbers, and modern statisticians still worry about public faith in their figures.

However, in one respect the situation for mathematics is now worse than it was when Parade's End was written. Would a contemporary author of Ford's prestige choose to make a character a mathematician to emphasise their virtue? I doubt it, which suggests that the reputation of mathematics has been in decline since the 1920s, when Russell and Ford were writing. Could this be related to G. H. Hardy's 1940 statement
I have never done anything ‘useful’. No discovery of mine has made, or is likely to make, directly or indirectly, for good or ill, the least difference to the amenity of the world. (p 49, A Mathematician's Apology)
Hardy's account of his life is significantly different from Borel's. Since Hardy's time, British (pure) mathematics, which encompasses probability, has abandoned the world that Tietjens and Borel lived in and isolated itself in academic cloisters, to everyone's detriment.

As a footnote, Hardy, who is credited with introducing continental 'rigour' (rigour mortis?) into British mathematics, opposed the Cambridge Mathematical Tripos, by which the Wranglers were ranked, because he felt it had ossified British mathematics. That may have been the case, but the Tripos produced more than just pure mathematicians: it delivered leaders in professions as diverse as the law, medicine, the church and politics, as well as actuaries.

Thursday, 26 July 2012

Teaching kids to gamble

Each summer I do a workshop of a couple of hours, labelled "Deal or no Deal", with kids who have finished their Standard Grades (GCSEs in England, i.e. they are 16) and are moving on to their Highers (A-levels). My aim is to encourage students to think a bit differently about maths, in particular to appreciate the relevance of maths to areas other than the physical / natural sciences.

The workshop is popular; I had thought this was because the TV show was popular, but it turns out only a couple of kids watch the show regularly. The students had been told
The world is uncertain and mathematics is exact, so how can maths help in making decisions about the future?

The workshop will look at a number of simple games that involve making decisions when you don't know what will happen, and investigate how maths can help you make the winning choice. The workshop will show how maths is as important in understanding economics, biology and psychology as in understanding physics or engineering.
I started the workshop by talking about the role of gambling in mythology/religion, referring to Hellenic, Hindu, Chinese and Biblical examples. I then suggested that gambling (or casting lots) is something common to all cultures, as language, mathematics, music and art are, but physics and agriculture are not. To explore this point further I split the group into groups of five and asked them to gamble for smarties. One of the five is given more smarties than the rest, and the idea is that by playing fair games this imbalance is corrected. There is a simple Java simulation (don't run it for too long, as one player usually wipes out the others); a sketch of the idea is given below. I went on to talk about how anthropologists think that gambling prevented the establishment of hierarchies in neolithic societies (e.g. Mitchell or Altman).
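The Java simulation itself is not reproduced here, but the game is easy to sketch. A minimal Python version (the staking rule, one smartie per round on a fair coin flip, is my assumption) shows both the levelling effect and the eventual wipe-out:

import random

def smarties(wealth, rounds=10000):
    """Each round, two players still in the game stake one smartie
    each on a fair coin flip."""
    for _ in range(rounds):
        alive = [i for i, w in enumerate(wealth) if w > 0]
        if len(alive) < 2:
            break   # one player has wiped the others out
        a, b = random.sample(alive, 2)
        winner, loser = (a, b) if random.random() < 0.5 else (b, a)
        wealth[winner] += 1
        wealth[loser] -= 1
    return wealth

# one of the five starts with more smarties than the rest
print(smarties([20, 10, 10, 10, 10]))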

The group numbered 25, so I got them to order themselves by date of birth - there were two pairs sharing the same birthday (less surprising than it seems: with 25 people, the chance of at least one shared birthday is better than even). The purpose of this was to split the group into pairs who did not know each other (well). I then got the pairs to play the Ultimatum Game, and we talked about how "fairness" is a learnt human concept (Murnighan and Saxon, Henrich et al., Jensen et al.).

I then talked about how Cardano undertook the first mathematical analysis of gambling while trying to address the ethics of gambling. I gave one of the key quotes and then explained how the idea of mathematical expectation comes out of this. When the group were happy that this was all OK, I presented the St Petersburg game and asked for offers to play it. The best offer was 2; I then calculated the game's expectation, which is infinite.
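For readers who have not met it: the St Petersburg game doubles a pot with every tail and pays out at the first head, so each term of the expectation contributes 1 and the sum diverges. A quick Python check (assuming the common convention that the pot starts at 2):

import random

def petersburg_payoff():
    """Toss a fair coin until the first head; the pot starts at 2
    and doubles with every tail."""
    pot = 2
    while random.random() < 0.5:
        pot *= 2
    return pot

# Each term of the expectation contributes 1, so the sum diverges:
# E = (1/2)*2 + (1/4)*4 + (1/8)*8 + ... = 1 + 1 + 1 + ...
print(sum((0.5 ** k) * (2 ** k) for k in range(1, 31)))   # 30.0, and climbing

# Yet the average payoff of a large batch of games grows only very
# slowly with the batch size, which is why sensible offers are so low.
games = [petersburg_payoff() for _ in range(100000)]
print(sum(games) / len(games))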

Some of the students identified risk as the key issue, and after a brief discussion I offered some more games based on dice and asked which one each participant preferred. Unremarkably, almost half preferred B, the lowest-variance game, and I explained that variance is a measure of uncertainty and is associated with risk; games of the sort involved are sketched below.
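The actual games used in the workshop are not given in the post, so the payoffs below are purely illustrative, but they show the kind of comparison involved: three dice games with the same expectation and different variances.

# Three hypothetical games with equal expectation but different variance,
# each paying out on a single roll of a fair die.
games = {
    "A": lambda d: 6 if d == 6 else 0,        # long shot
    "B": lambda d: 2 if d % 2 == 0 else 0,    # near coin-flip
    "C": lambda d: 3 if d >= 5 else 0,        # in between
}

for name, payoff in games.items():
    outcomes = [payoff(d) for d in range(1, 7)]
    mean = sum(outcomes) / 6
    var = sum((x - mean) ** 2 for x in outcomes) / 6
    print(f"Game {name}: expectation {mean:.2f}, variance {var:.2f}")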

I then asked whether taking a risky decision is always a bad idea. I presented the case of a bird in winter who needs to find food to survive the night, and outlined this as a game. The students worked out that there is a risky and a safe strategy (based on variance), and I asked them to try to work out whether the good strategy is playing safe or taking risks. After the students played about and we discussed their suggestions, I ran through a Java simulation. The simulation (after a few runs) enabled me to demonstrate that there are two regions in the time-berry state space where taking risks is better than playing safe.
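The simulation is not reproduced here, but the underlying game can be solved exactly by backward induction. A Python sketch, with the numbers (ten foraging periods, twelve berries needed, safe foraging yields one berry, risky yields two or none with equal chance) chosen purely for illustration:

from functools import lru_cache

T, NEED = 10, 12    # foraging periods and berries needed to survive the night

@lru_cache(maxsize=None)
def survival(t, berries):
    """Best achievable probability of surviving the night from state
    (t, berries), choosing safe or risky foraging at each step."""
    if t == T:
        return 1.0 if berries >= NEED else 0.0
    safe = survival(t + 1, berries + 1)                     # one berry, surely
    risky = 0.5 * survival(t + 1, berries + 2) + 0.5 * survival(t + 1, berries)
    return max(safe, risky)

# map the time-berry state space: R = gambling strictly beats playing safe
for b in range(NEED, -1, -1):
    row = ""
    for t in range(T):
        safe = survival(t + 1, b + 1)
        risky = 0.5 * survival(t + 1, b + 2) + 0.5 * survival(t + 1, b)
        row += "R" if risky > safe else "."
    print(f"berries={b:2d}  {row}")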

I then discussed the Deal or No Deal game and, with the results of 22 games, discussed how a mathematician might analyse it. I finished off with Cardano's 350-year-old advice: maths is of little practical value but does help in understanding, and the only way to be ethical in gambling is through science.

Monday, 16 July 2012

Is "Risk Intelligence" a dangerous concept?

At a time when there is an emphasis on interdisciplinary research, it is not that useful to divide science into separate categories. However, one distinction that, as a mathematician, I find useful is between Romantic and Enlightenment science. Enlightenment science is concerned with the universal and is developed by collaborative analysis, and, to facilitate both the collaboration and the analysis, mathematics plays an important role. Romanticism emerged in the second half of the eighteenth century and dominated British and German culture in the following half-century. Romantic science emerges out of the senses trying to understand the whole rather than the parts, sentiment over reason, and is often the result of an individual genius's struggle, frequently against dark forces.

The archetypal Romantic scientists are Alexander von Humboldt, Lewis and Clark and Darwin, who developed science through exploration. The archetypal Romantic mathematician is Galois, whose career was hindered by his reliance on intuition rather than the clear proof favoured by the French Academy, and around whom a myth has grown up that he did his most significant work the night before he was killed in a duel over a love affair.

I have been thinking a lot about the transformation from Enlightenment to Romanticism, because there seems to have been a massive change in the relationship between science and finance in the first half of the nineteenth century, at the time that Romanticism dominated culture.  

I thought about this distinction after listening to Dylan Evans promoting his book Risk Intelligence on the radio. Evans defines Risk Intelligence as the "ability to estimate probabilities accurately" (0:24 on the audio), such as the chance a horse wins a race (0:52). Evans goes on to make the point that our educational system does not train people in making decisions under uncertainty (1:29), what he calls a "twilight zone" (1:40). Further on he explains that finance (2:57) came to rely on mathematical models, so the "intuitive gambler types were edged out and as a result Wall Street haemorrhaged risk intelligence" (3:35), and that this transformation was partly responsible for the failures of finance.

One point I would agree with Dr Evans on is that there needs to be an effort to get concepts around uncertainty onto school curricula. In fact, at the recent "Credit Crisis Five Years On" meeting I asked Andy Haldane if he felt the Bank of England had a role to play in helping shape curricula in this way. I think there are significant issues with Evans' view that the chance of a horse winning a race can be estimated accurately: Knightian uncertainty and Keynes' 'irreducible uncertainty' spring to mind. However, my main concern is with the claim that it was the use of mathematics that was partially responsible for the failures of finance.

It struck me that in the interview Evans seems to be endorsing Romantic science: there is an emphasis on "intuition" over mathematics in what he says, and reference to "twilight", the darkness that is a feature of Romanticism. But just because Risk Intelligence might be a Romantic idea doesn't mean it is dangerous. The danger with a concept like Risk Intelligence is the emphasis on the idea that some people are "good" at risk - there is a link to that other idea that emerged out of Romanticism, the idea of genius - and that mathematics is a hindrance, not a help, in finance.

Why is mathematics important in finance? It goes back to why Fibonacci's Liber Abaci was a publishing phenomenon at a time when books were hand-copied: the mathematics it described enabled merchants to write down, disseminate, discuss and improve their financial models. This is the point of mathematics, whether in physics or finance, and why mathematics is such an important part of Enlightenment science but missing from the Romantic: Enlightenment science is collaborative and needs a common language, and its arguments are written in that language, mathematics.

The problem with modern finance is not the mathematical models, but that the models became an end in themselves rather than a means of developing consensus, understanding and knowledge about finance. Banks employed geniuses to develop models in-house that they kept secret, or they bought black boxes that had been created by geniuses elsewhere. When mathematicians, such as Philippe Artzner and Freddy Delbaen or Michael Gordy, shone a light on some of the leading industry models, their illumination was blocked by the towering geniuses, the "masters of the universe", working in banking.

When you read the book, rather than listen to the interview, the issue of mathematics in finance is not as prominent. In fact, the whole book rests on a mathematical model of RQ (compare with IQ, or Samuelson's "Portfolio IQ", "PQ"), which is implemented on-line for you to test your own RQ.

Evans claims the motivation behind RQ is a 1986 paper by Ceci and Liker in which RQ is identified as a type of intelligence uncorrelated with IQ. Unfortunately, Ceci and Liker's science was not up to much, and within two years this key result was overturned by better analysis; in fact "RQ" seems to be highly correlated with "IQ". Evans argues that banking regulation should involve measuring the RQ of bankers. The problem is that anyone can "game" RQ (when I did the test I got a very high score of 80 because I realised immediately what was going on).

Having laid out the basis of RQ, Evans attempts to describe how you can improve your RQ, which boils down to understanding the limitations of what you know. This does not strike me as particularly novel, and might be labelled "science", which has been described as "organised scepticism".

In fact, Evans' criticism of maths in finance turns out to be more a criticism of calculation in finance, and this has been addressed more tangentially, but more eloquently, by the philosopher Richard Sennett in The Craftsman. Sennett makes the point that we can all be craftsmen in the modern age, if we learn a craft. Developing RQ seems no different from becoming skilled in understanding risk. What is important in becoming a craftsman is hard work in a social context (crafts were traditionally regulated by a guild), and this is why I feel RQ is not simply a banal idea but a dangerous one.

The Romantic Evans seems to believe in an intuitive genius that can be developed outside of the workshop. I think this is dangerous because I believe the solution to banking's problems lies in open discussion and debate about the risks and rewards of finance (i.e. science), and that good bankers are skilled craftsmen who have learnt their trade by spending hours developing their skill in the workplace. If something has happened in finance over the last 25 years, it is that banks have stopped recruiting schoolboy apprentices (I knew three people who left school to go into finance, one to a Bank of England apprenticeship) to train up over five years, and have instead been recruiting staff "off the shelf" out of universities, assuming that these academically trained men and women have the right skills for banking. Or they have recruited ready-made business experts such as Andy Hornby and Fred Goodwin.

Dylan Evans's central argument, that people who are gifted in making risky decisions can be identified and hired to run finance, displacing the need for good ethics, is dangerous.

Thursday, 16 February 2012

The mathematical equation that caused the banks to crash (?!?)

Claiming that the Black-Scholes equation had anything to do with the Credit Crisis of 2007-2009 is a bit like claiming Daimler and Maybach were responsible for the bombing of Hiroshima. Sure, the planes used the internal combustion engine, but the causal relationship is being stretched beyond reason.

The claim that Black-Scholes was involved in the Credit Crisis was made by Ian Stewart in The Observer, in a piece apparently promoting his new book, Seventeen Equations that Changed the World.

Prof Stewart is the most important promoter of mathematics in the United Kingdom, having written his first book, Concepts of Modern Mathematics, in the early 1970s as an exposition of Bourbaki mathematics. By the 1990s Bourbaki and 'New Math' had faded/failed, and Prof Stewart turned his attention to the more pressing question of introducing mathematics into industry, as explained in the Preface to the Dover Publications edition.

Given this track record I am at something of a loss to understand what Prof Stewart was trying to achieve in his Observer article, beyond generating publicity for his book. It is unfortunate that he has chosen to approach a critically important issue, the use of mathematics in finance, in such a thoughtless way.

His article is little more than a string of factually incorrect statements; my guess is they are culled from the BBC's Midas Formula, which in turn is based on Lowenstein's When Genius Failed. For example, the piece begins with "It was the holy grail of investors." At its heart is the rather depressing statement, from an investor's perspective, that profits should equal the riskless rate; that is more a poisoned chalice than a holy grail - as one student asked in lectures last week, "what's the point of that then?"

What is more, if the formula really was a holy grail, why was the paper initially rejected by the Journal of Political Economy on the basis that there was not enough economics in it? Peter Bernstein, in Against the Gods, reports that Black felt it was because he was a mathematician with no qualifications in economics. The paper then went to the Review of Economics and Statistics, again without success. Bernstein reports that the paper was only published by the Journal of Political Economy after the intervention of influential Chicago academics.

A more thoughtful assessment is that Black-Scholes enabled the CME and CBOT to justify the introduction of financial options trading, and this is touched upon. But there is no accompanying explanation of the collapse of Bretton Woods, which destroyed fixed exchange rates and broadly flat bond yields, meaning that volatility suddenly became an issue for the markets.

The article then links Black-Scholes with the sub-prime crisis, a gross misrepresentation, since Black-Scholes played pretty much no (pricing) role in the collapse of LTCM, let alone in the credit crisis. What Stewart fails miserably to understand is that, in typical practice since the 1987 crash, Black-Scholes has not taken volatility as an input and produced prices, but has taken prices and implied volatility. The failure is miserable because this is the genius of the equation: a consistent tool for measuring the markets' assessment of future risk.
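To make the point concrete, inverting the formula is a short numerical exercise. A minimal sketch (textbook Black-Scholes with no dividends; the quoted price and parameters are made up):

from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call (no dividends)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-6, hi=5.0, tol=1e-8):
    """Invert Black-Scholes by bisection: the call price is increasing
    in sigma, so find the sigma that reproduces the quoted price."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# a quoted market price of 10 for an at-the-money one-year call
print(implied_vol(10.0, S=100.0, K=100.0, T=1.0, r=0.02))   # about 0.23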

The piece goes on to claim "The idea behind many financial models goes back to Louis Bachelier in 1900". Well, Prof Stewart should know better. It would be better to write that "The idea behind many mathematical models goes back to the financial mathematician Fibonacci in 1202", or that "The origins of the Black-Scholes formula lie in the Pascal-Fermat solution to the Problem of Points in 1654, the generally accepted origin of mathematical probability". These are far more interesting points.

The article continues
The Black-Scholes equation was based on arbitrage pricing theory, in which both drift and volatility are constant. This assumption is common in financial theory, but it is often false for real markets.
This is nonsense: arbitrage pricing is a concept; constant drift and volatility are features of particular implementations. Stewart is happy to talk about the economic-physics of the Black-Scholes equation but seems ignorant of the mathematical Fundamental Theorem of Asset Pricing:
A market is free from arbitrage if and only if a martingale measure exists.
A market is complete and free from arbitrage if and only if a unique martingale measure exists.
Why oh why didn’t Prof Stewart talk about this, admittedly not an equation, but profound mathematics that tells us something significant about markets. 

Mathematics, in my humble opinion, is not equations; it is concerned with ideas and understanding. Understanding the Fundamental Theorem requires an appreciation that probability is not relative frequency; of epistemology (Black Swans), through the idea of completeness; and of ethics: being arbitrage-free is about fairness, and a martingale measure is about equality. This is all in Aristotle and Aquinas, as described by the mathematician James Franklin, and all in financial mathematics, where contemporary papers include words like 'belief' and 'greed' in their titles.

Stewart finishes his article by suggesting the solution lies in the mathematics of dynamical systems and complexity, and in adopting analogues from ecology. The Bank of England has followed this line of thinking, building on the work of Lord May, the biologist and past President of the Royal Society. The problem is that Lord May's analysis is based on a model of 'contagion': the banking system is an ecology through which default spreads, in the same way that mad-cow disease spreads through farms. The response is to build firewalls, quarantine zones, around banks. The Bank of England now seems reluctant to develop this line of thinking further, possibly because they realise that the model is just plain wrong. A more appropriate model for banking is the internet: the Credit Crisis was the result of linkages in a network collapsing, not of some invisible infectious agent spreading through the network. The preferred model now seems to be electricity grids (the article was apparently prompted by a discussion with Andy Haldane). So the Bank is still using a physical analogue, but a better one.

Stewart would see the solution in dynamical systems and complexity because that is an area of mathematics in which the British are strong. But dynamical systems are typically ergodic (in the 2003 Review, "dynamical systems and complexity" was labelled "dynamical systems and ergodic theory"), and, unfortunately, the economy is not ergodic.

Mathematicians must start taking the economy seriously, and not try to shoe-horn economic problems into the framework of the mathematics they understand. Ian Stewart is extraordinarily influential in defining what mathematics is and how it can be applied. Academics should not publicly discuss topics on which they are not expert; that is the realm of journalism (or blogs). In this article, Prof Stewart has misrepresented mathematics, damaging its reputation. For this he should be ashamed.

Tuesday, 17 January 2012

Why don't more mathematicians see the potential of economics?

The question is, how did economics change its attitude to mathematics in the forty years between Håvelmo's The Probability Approach in Econometrics and his Nobel Prize in 1989, by which time he was pessimistic about the impact the development of econometrics had had on the practice of economics? Coinciding with Håvelmo's pessimism, many economists were reacting strongly against the 'mathematisation' of economics - a mathematisation evidenced by the fact that before 1925 only around 5% of economics research papers were based on mathematics, but by 1944, the year of Håvelmo's and von Neumann and Morgenstern's contributions, this had quintupled to 25%1. While the proportion of economics papers based on maths has not continued this trajectory, the influence of mathematical economics has, and the person most closely associated with this change in economic practice was Paul Samuelson.

Samuelson is widely regarded as the most influential economist to come out of the United States, and is possibly the most influential post-war economist in the world. He was the first U.S. citizen to be awarded the Nobel Prize in Economics, in 1970, because "more than any other contemporary economist, he has contributed to raising the general analytical and methodological level in economic science"2. He studied at the University of Chicago and then Harvard, where he obtained his doctorate in 1941. In 1940 he was appointed to the economics department of M.I.T., where he would remain for the rest of his life; in the final years of the war he worked in Wiener's group on (anti-aircraft) fire-control problems3. Samuelson would comment that "I was vaccinated early to understand that economics and physics could share the same formal mathematical theorems".

In 1947 Samuelson published Foundations of Economic Analysis, which laid out the mathematics Samuelson felt was needed to understand economics. It is said that von Neumann, invited to review Foundations in 1947, declined because "one would think the book about contemporary with Newton". Von Neumann, like many mathematicians who looked at economics, believed economics needed better maths than it was being offered4. In 1948 Samuelson published the first edition of his most famous work, Economics: An Introductory Analysis, one of the most influential textbooks on economics ever published; it has run to nineteen editions and sold over four million copies.

There appears to be a contradiction: Håvelmo seems to think his introduction of mathematics into economics was a failure, while Samuelson's status seems to suggest mathematics came to dominate economics. In the face of contradiction, science should look for a distinction.

I think the clue is in Samuelson’s attachment to “formal mathematical theorems”, and that his conception of mathematics was very different from that of the earlier generation of mathematicians that included everyone from Newton and Poincaré to von Neumann, Wiener and Kolmogorov.

A potted history of the philosophy of mathematics is that the numerologist Plato came up with the Theory of Forms, and then Euclid produced The Elements, which was supposed to capture the indubitability, the certainty, and the immutability, the permanence, of mathematics, on the basis that mathematical objects were Real representations of Forms. This was used by St Augustine of Hippo as evidence for the indubitability and immutability of God, embedding into western European culture the indubitability and immutability of mathematics. The identification of non-Euclidean geometries in the nineteenth century destroyed this edifice, and the reaction was the attempt to lay the Foundations of Mathematics not on the basis of geometry but on the logic of the natural numbers. Frege's logicist attempt collapsed with Russell's paradox, and attention turned to Hilbert's Formalism to provide a non-Platonic foundation for mathematics. The key idea behind Formalism is that, unlike Platonic Realism, mathematical objects have no meaning outside mathematics: the discipline is a game played with symbols that have no relevance to human experience.

The Platonist Kurt Gödel had, according to von Neumann, "shown that Hilbert's program is essentially hopeless" and
The very concept of “absolute” mathematical rigour is not immutable. The variability of the concept of rigour shows that something else besides mathematical abstraction must enter into the makeup of mathematics5

Mathematics split into two broad streams. Applied mathematics, practised by the likes of von Neumann and Turing, responded by focussing on real-world ‘special cases’, such as modelling the brain6. Pure mathematics took the opposite approach, emphasising the generalisation of special cases, as practised by Bourbaki and Hilbert’s heirs.

Formalism began to dominate mathematics in the 1940s and 1950s. Mathematics was about 'rigorous' (whatever that means) deduction from axioms and definitions to theorems. Explanatory natural language and, possibly worse, pictures were to be removed from mathematics. The "new math" programme of the 1960s was a consequence of this Formalist-Bourbaki dominance of mathematics.

It is difficult to give a definitive explanation for why Formalism became dominant, but it is often associated with the emergence of logical positivism, a somewhat incoherent synthesis of Mach's desire to base science only on phenomena (which rejected the atom), mathematical deduction, and Comte's views on the unity of the physical and social sciences. Logical positivism dominated western science after the Second World War, spreading out from its heart in central European physics, carried by refugees from Nazism.

The consequences of Formalism were felt most keenly in physics. Richard Feynman, the physicists' favourite physicist, hated its abandonment of relevance. Murray Gell-Mann, another Nobel Laureate physicist, commented in 1992 that the Formalist-Bourbaki era seemed to be over:

abstract mathematics reached out in so many directions and became so seemingly abstruse that it appeared to have left physics far behind, so that among all the new structures being explored by mathematicians, the fraction that would even be of any interest to science would be so small as not to make it worth the time of a scientist to study them.

But all that has changed in the last decade or two. It has turned out that the apparent divergence of pure mathematics from science was partly an illusion produced by obscurantist, ultra-rigorous language used by mathematicians, especially those of a Bourbaki persuasion, and their reluctance to write up non–trivial examples in explicit detail. When demystified, large chunks of modern mathematics turn out to be connected with physics and other sciences, and these chunks are mostly in or near the most prestigious parts of mathematics, such as differential topology, where geometry, algebra and analysis come together. Pure mathematics and science are finally being reunited and mercifully, the Bourbaki plague is dying out.7

Economics has always doubted its credentials. Laplace saw the physical sciences resting on calculus, while the social sciences would rest on probability8, but classical economists, like Walras, Jevons and Menger, wanted their emerging discipline to have the same status as Newton's physics, and so mimicked physics. Samuelson was looking to do essentially the same thing: economics would be indubitable and immutable if it looked like Formalist mathematics, and in this respect he has been successful; the status of economics has grown faster than the use of maths within it. However, while the general status of economics has exploded, its usefulness to most users of economics, such as those in the financial markets, has collapsed. Trading floors recruit engineers and physicists, who have always looked for the relevance of mathematics, in preference to economists (or post-graduate mathematicians).

My answer to the question "why don't more economists see the potential of mathematics" is both simple and complex. Economists have, in the main, been looking at a peculiar manifestation of mathematics - Formalist-Bourbaki mathematics - a type of mathematics that emerged in the 1920s in response to an intellectual crisis in the Foundations of Mathematics. Economists have either embraced it, as Samuelson did, or been repulsed by it, as Friedman was.

Why this type of mathematics, a type of maths that would have been alien to the great mathematicians of the twentieth century like Wiener, von Neumann, Kolmogorov and Turing, became dominant and was adopted by economics is more complex and possibly inexplicable. The question is, can academic mathematics return to its roots in relevance, or will it wither in its ivory towers?

Notes

1 Mirowski (1991, pp 150–151)
3 MacKenzie (2008, pp 63–64)
4 Mirowski (1992, p 134)
5 Mirowski (1992, p 122, quoting von Neumann)
6 Mirowski (1992, pp 122–124)
7 Gell-Mann (1992, p 7)
8 Katz (1993, p 685)

References

    Gell-Mann, M. (1992). Nature conformable to herself. Bulletin of the Santa Fe Institute, 7(1):7–8.
    Katz, V. J. (1993). A History of Mathematics: An Introduction. Harper Collins.
    MacKenzie, D. (2008). An Engine, Not a Camera: How Financial Models Shape Markets. The MIT Press.
    Mirowski, P. (1991). The when, the how and the why of mathematical expression in the history of economic analysis. Journal of Economic Perspectives, 5(1):145–157.
    Mirowski, P. (1992). What were von Neumann and Morgenstern trying to accomplish? In Weintraub, E. R., editor, Toward a History of Game Theory, pages 113–150. Duke University Press.

Friday, 6 January 2012

Why don't more economists see the potential of mathematics?

A research student working in econometrics has e-mailed me with the comment
I am a little confused why many economists do not see the potential of mathematics.
The discipline of econometrics was introduced in the 1940s, the key monograph being Trygve Håvelmo's The Probability Approach in Econometrics. Håvelmo's motivation for writing the paper is eloquently stated in the preface:
The method of econometric research aims, essentially, at a conjunction of economic theory and actual measurements, using the theory and technique of statistical inference as a bridge pier. But the bridge itself was never completely built. So far, the common procedure has been, first to construct an economic theory involving exact functional relationships, then to compare this theory with some actual measurements, and, finally, “to judge” whether the correspondence is “good” or “bad”. Tools of statistical inference have been introduced, in some degree, to support such judgements, e.g., the calculation of a few standard errors and multiple-correlation coefficients. The application of such simple “statistics” has been considered legitimate, while, at the same time, the adoption of definite probability models has been deemed a crime in economic research, a violation of the very nature of economic data. That is to say, it has been considered legitimate to use some of the tools developed in statistical theory without accepting the very foundation upon which statistical theory is built. For no tool developed in the theory of statistics has any meaning– except, perhaps, for descriptive purposes –without being referred to some stochastic scheme.

The reluctance among economists to accept probability models as a basis for economic research has, it seems, been founded upon a very narrow concept of probability and random variables. Probability schemes, it is held, apply only to such phenomena as lottery drawings, or, at best, to those series of observations where each observation may be considered as an independent drawing from one and the same “population”. From this point of view it has been argued, e.g., that most economic time series do not conform well to any probability model, “because the successive observations are not independent”. But it is not necessary that the observations should be independent and that they should all follow the same one–dimensional probability law. It is sufficient to assume that the whole set of, say n, observations may be considered as one observation of n variables (or a “sample point”) following an n-dimensional joint probability law, the “existence” of which may be purely hypothetical. Then, one can test hypotheses regarding this joint probability law, and draw inference as to its possible form, by means of one sample point (in n dimensions). Modern statistical theory has made considerable progress in solving such problems of statistical inference.

In fact, if we consider actual economic research–even that carried on by people who oppose the use of probability schemes–we find that it rests, ultimately, upon some, perhaps very vague, notion of probability and random variables. For whenever we apply a theory to facts we do not–and we do not expect to–obtain exact agreement. Certain discrepancies are classified as “admissible”, others as “practically impossible” under the assumptions of the theory. And the principle of such classification is itself a theoretical scheme, namely one in which the vague expressions “practically impossible” or “almost certain” are replaced by “the probability is near to zero”, or “the probability is near to one”.
This is nothing but a convenient way of expressing opinions about real phenomena. But the probability concept has the advantage that it is “analytic”, we can derive new statements from it by the rules of logic.
Håvelmo’s argument can be split into four key points. If economics is to be regarded as ‘scientific’, it needs to take probability theory seriously. He then notes that economists have taken a naive approach to probability, and possibly mathematics in general, and introduces the Lagrangian idea of representing n points in one dimensional space by one point in n-dimensional space. Finally he makes Poincaré’s point that probability is a convenient solution, it makes the scientist’s life easier, and finally he makes Feller’s point that it enables the creation of new knowledge, new statements.

Håvelmo then goes on to tackle the issue, which goes back at least as far as Cicero, that "there is no foreknowledge of things that happen by chance", by making the critical observation that nature looks stable because we look at it in a particular way:
“In the natural sciences we have stable laws”, means not much more and not much less than this: The natural sciences have chosen very fruitful ways of looking on physical reality.
Håvelmo is saying that if economists look at the world in a different way, if the right analytical tools are available to them, they may be able to identify stable laws.

At about the same time, Oskar Morgenstern was working with John von Neumann on The Theory of Games and Economic Behavior, a "big book because they wrote it twice, once in symbols for mathematicians and once in prose for economists". Morgenstern begins the book by describing the landscape. On the second page he makes the case for using mathematics in economics, just as Håvelmo had, but with a more comprehensive argument. Morgenstern reviews the case as to why mathematics is inappropriate to economics, no doubt with von Neumann at his shoulder:
The arguments often heard that because of the human element, of psychological factors etc., or because there is – allegedly – no measurement of important factors, mathematics will find no application [in economics] [von Neumann and Morgenstern 1967 p 3]
However, Morgenstern points out that Aristotle had the same opinion of the use of mathematics in physics:
Almost all these objections have been made, or might have been made, many centuries ago in fields where mathematics is now the chief instrument of analysis.
While measurement may appear difficult in economics, measurement once appeared difficult in physics too: before the time of Albert the Great, objects were simply 'hot' or 'cold'; this was before Newton fixed time and space, and before the idea of potential energy being released into kinetic energy emerged.
The reason why mathematics has not been more successful in economics must, consequently, be found elsewhere. The lack of real success is largely due to a combination of unfavourable circumstances, some of which can be removed gradually. To begin with economic problems were not formulated clearly and are often stated in such vague terms as to make mathematical treatment a priori appear hopeless because it is quite uncertain what the problems really are. There is no point in using exact methods where there is no clarity in the concepts and the issues to which they are to be applied. Consequently the initial task is to clarify the knowledge of the matter by further careful description. But even in those parts of economics where the descriptive problem has been handled more satisfactorily, mathematical tools have seldom been used appropriately. They were either inadequately handled, as in the attempts to determine a general economic equilibrium …, or they led to mere translations from a literary form of expression into symbols, without any subsequent mathematical analysis. [von Neumann and Morgenstern1967, p 4]
Morgenstern makes the critical observation that the 'correct' use of mathematics in science leads to the creation of new mathematics:
The decisive phase of the application of mathematics to physics – Newton’s creation of a rational discipline of mechanics – brought about, and can hardly be separated from, the discovery of [calculus]. (There are several other examples, but none stronger than this.)
The importance of social phenomena, the wealth and multiplicity of their manifestations, and the complexity of their structure, are at least equal to those in physics. It is therefore expected – or feared – that the mathematical discoveries of a stature comparable to that of calculus will be needed in order to produce decisive success in this field. [von Neumann and Morgenstern 1967 p 5]
In 1989 Håvelmo was awarded the Nobel Prize in Economics "for his clarification of the probability theory foundations of econometrics and his analyses of simultaneous economic structures". In his Prize Lecture, the economist Håvelmo reflected on the impact of his work:
To some extent my conclusions [are] in a way negative. I [draw] attention to the – in itself sad – result that the new and, as we had thought, more satisfactory methods of measuring interrelations in economic life had caused some concern among those who had tried the new methods in practical work. It was found that the economic theories which we had inherited and believed in, were in fact less stringent than one could have been led to think by previous more rudimentary methods of measurement. To my mind this conclusion is not in itself totally negative. If the improved methods could be believed to show the truth, it is certainly better to know it. Also for practical economic policy it is useful to know this, because it may be possible to take preventive measures to reduce uncertainty. I also mentioned another thing that perhaps could be blamed for results that were not as good as one might have hoped for, namely economic theory in itself. The basis of econometrics, the economic theories that we had been led to believe in by our forefathers, were perhaps not good enough. It is quite obvious that if the theories we build to simulate actual economic life are not sufficiently realistic, that is, if the data we get to work on in practice are not produced the way that economic theories suggest, then it is rather meaningless to confront actual observations with relations that describe something else. [Prize Lecture to the memory of Alfred Nobel]
Håvelmo’s aim in the 1940s, along with that of John von Neuman, had been to improve economic methodology, the consequence was, in Håvelmo’s case, was that it highlighted deficiencies in economic theory. The question is, what happened in economics in the forty years between Håvelmo’s paper on econometrics and his Nobel Prize in 1989 to lead to such a negative reflection on the development of economics. I shall come back to this in my next post.


References

   J. von Neumann and O. Morgenstern. Theory of Games and Economic Behavior. Wiley, 3rd edition, 1967.

Friday, 4 November 2011

De Coding Da Vinci

It is well known that Leonardo da Vinci became interested in the "golden ratio" or "divine proportion". It is somewhat less well known that he learnt about the number from Luca Pacioli, the Franciscan friar and grandfather of accounting. What is virtually unknown is that Pacioli probably learnt his mathematics from Piero della Francesca, better known as an artist but also a financial mathematician.



I have written an article on this, Decoding Da Vinci: Finance, functions and art, for Plus, an on-line magazine aimed at youngsters. The piece explains why the ratio was considered divine because of its form as a continued fraction, and how another financial mathematician laid the foundations for functional analysis by popularising decimal notation.
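The divine form in question drops out of the defining relation: \( \varphi^2 = \varphi + 1 \), so \( \varphi = 1 + 1/\varphi \), and feeding the relation back into itself gives the continued fraction of "all ones",

\[
\varphi = \frac{1 + \sqrt{5}}{2} = 1 + \cfrac{1}{1 + \cfrac{1}{1 + \cfrac{1}{1 + \dotsb}}},
\]

the slowest-converging of all continued fractions, which is one way of seeing why the ratio was felt to be special.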

Friday, 2 September 2011

Maths and the markets


Dr Jack Stilgoe, a science policy wonk, has been thinking about Responsible innovation in financial services and asks the question, in relation to the Credit Crisis

Could mathematicians have done more to ensure that their models weren't abused, or is it not really about maths at all?

Jack's deceptively simple question is incredibly intricate. There are many commentators who argue that the complexity of modern markets is such that they are mathematically intractable, and that the best approach is analysis through discourse, as was popular in the Dark Ages and between the Black Death and Francis Bacon and Galileo. My (biased) opinion is that these views are held principally by those educated in the ethos that developed before the collapse of Bretton Woods, when a deterministic economy was managed by wise sages. Unfortunately the world is not deterministic, and the sages could not hope to manage the economy by agreeing treaties in luxury hotels.

However, mathematics itself cannot present a unified front. We have Paul Wilmott and Nassim Nicholas Taleb arguing that the mathematical techniques that dominate the markets today - those of Ioannis Karatzas, Steven Shreve, Mark Davis (whom Wilmott has famously libelled in an ad hominem attack) and Marek Musiela, to name a few - are the wrong sort of mathematics. This is rather like claiming a Toyota Prius is not really a car in comparison with a Dodge pickup: the fact that the Prius is unfamiliar does not mean it is not technically superior.

However, this does not mean that the academic discipline of financial mathematics has no issues to address. The publication of the Heath-Jarrow-Morton framework created a demand in the markets for skills in stochastic analysis, displacing the skills in the numerical solution of deterministic differential equations familiar to Wilmott, Taleb, physics and engineering. This demand was met by the universities with a plethora of Financial Mathematics Masters degrees. I feel the markets have now moved on; whether many of the MScs are keeping up, I am not so sure.

Part of the problem is that many academic mathematicians are more comfortable walking across campus to chat to their colleagues in the economics or finance departments than talking to mathematicians with direct experience of the markets, such as Claude Shannon, Edward Thorp and James Simons. This means that the orthodoxy of Samuelson and his progeny dominates, and ideas such as the Kelly criterion (sketched below), and the stochastic control familiar to electrical engineering, have been missing from the rarefied curricula of some financial maths degrees.
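To give a flavour of what is missing, the Kelly criterion itself fits in a few lines (the standard textbook formula, nothing specific to any particular curriculum):

def kelly_fraction(p, b):
    """Kelly criterion: for a bet paying b-to-1 with win probability p,
    the fraction of the bankroll that maximises expected log-wealth is
    f* = (p * b - (1 - p)) / b. A negative answer means: don't bet."""
    return (p * b - (1 - p)) / b

# an even-money bet (b = 1) that you win 55% of the time
print(kelly_fraction(0.55, 1.0))    # 0.10 -> stake 10% of the bankroll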

But all this is a discussion of the plumbing of the markets, a utility, and mathematics is not really a utility. Mathematics is a science.

For Laplace, the roll of a die is not random: given precise information about the position, orientation and velocity of the die when it left the cup, the result of the roll is perfectly predictable. At the heart of Laplace's determinism was knowledge, and 'probability' was a measure of ignorance, not of 'chance'. As a product of the Enlightenment, Bernoulli's God is replaced by 'an intellect', Laplace's demon. The positions of Laplace and Bernoulli, however, differ significantly from that of Cicero who, in De Divinatione, distinguished between the predictable (eclipses), the foreseeable (the weather) and the random (the finding of a treasure). And between Bernoulli's religious and Laplace's atheist conceptions of predestination there is more than just a change in wording; there is a huge philosophical divide, the crossing of which was one of the key achievements of the Enlightenment.

A persistent problem with determinism is that it can, logically, lead to a collapse in moral responsibility. The syllogistic argument is:

Premise 1. Actions are either pre-determined or random.
Premise 2. If an action is pre-determined, the entity who performed the action is not morally responsible.
Premise 3. If an action is random, the entity that performed the action is not morally responsible.
Conclusion. No one is morally responsible for their actions.

An achievement of the Enlightenment was to realise that moral responsibility should not sit in the conclusion, but as a premise, and the argument became:

Premise 1. People should be held morally responsible for their actions.
Premise 2. If someone (e.g. a child) cannot foresee the consequences of their actions, they cannot be held morally responsible for them.
Conclusion. Moral responsibility requires foresight.

In order to be 'morally responsible', people needed a degree of foresight, which can only be obtained through knowledge, or science. This is the fundamental purpose of science: to enable people to take responsibility for their actions, whether related to the safety of industry or to personal diet. This is reflected in Humboldt's view that education should turn 'children into people', which is very different from Bacon's opinion that 'knowledge is power'.

Society needs science to interact with the markets because science creates knowledge, knowledge enables foresight and foresight leads to responsibility.  If there is no science of finance, there can be no responsibility in the markets (if the Enlightenment was right).

Poincaré dismissed the idea of 'science for science's sake'; science is not a recreational pursuit. Scientists need to ask the difficult questions at the extremities of knowledge, and mathematics' role is to tackle the questions that cannot be answered by experimentation. This is why the $3 billion investment in the Large Hadron Collider, in looking for the Higgs boson, is seeking to prove a mathematical derivation. The physical sciences are impotent in reaching out to the boundaries of knowledge without mathematics clearing the path.

The financial markets cannot be experimented on. The very fact that they are complex means that the only tool science has in trying to understand them is mathematics. The fact that the predominantly deterministic mathematics based on physical phenomena that most people are familiar with (even frequentist, or objective, probability is rooted in the 'physical' act of counting) is insufficient to understand the markets does not mean that mathematics will not provide the key to understanding them. The point is, it will be "mathematics, but not as we know it"; it needs to be created.

If society wants to understand the markets, and really wants them to act responsibly, it needs to fund financial mathematics on a par with the investment made into the physically very small or the very distant.

Thursday, 28 July 2011

History repeating itself?

The macro-economics of August 2011 looks much the same as August 1971, but have we learnt anything about the markets over the past forty years?

Following the 1929 Crash and the consequent world-wide Depression, governments around the world devalued their currencies in order to make their products more competitive in foreign markets. These 'beggar-thy-neighbour' policies created a deflationary spiral that magnified the effects of the Crash. In 1944 the Allied powers met at Bretton Woods, in New Hampshire, and agreed to fix the gold price of the main currencies: the US$ was fixed to gold at $35/oz, while other currencies were pegged to the dollar, with the pound sterling set at $4.03. Bretton Woods also established the International Monetary Fund and the World Bank.

By the late 1960s the Bretton Woods system was beginning to creak as the Germans and Japanese exported to the Americans, while the U.S. poured money into the war in Vietnam. As gold was sucked out of the U.S. the system began to look untenable. In 1971, foreign governments demanded that the U.S. honour its "promise to pay" and convert their dollar notes into gold; in July, Switzerland converted $50 million into gold. There was an arbitrage: buy gold with dollars and then sell the gold for Deutsche Marks. On August 15, 1971 the U.S. President, Nixon, responded to these activities by abandoning the gold standard, the "promise to pay". Bretton Woods collapsed and foreign exchange rates stopped being certain.

As a consequence of the collapse of the Bretton Woods system of exchange rates, central banks were forced to change interest rates more frequently. In simplistic terms, the level of interest rates has two effects. If rates are low, people will borrow from banks, who will create money for the economy, and this may generate inflation, which devalues a currency. If interest rates are high, and the currency stable, foreign investors will want to deposit their spare cash in banks paying the high rates of interest, raising demand for the currency. After 1972, interest rate policy became a key lever by which governments controlled their economies. In the 27 years between 1945 and autumn 1972, when Bretton Woods collapsed, the Bank of England changed its lending rate 43 times; in the 27 years after 1972, it changed it 223 times, about once every 45 days. Finance had moved from a world of deterministic control to one of stochastic control, and people had to think more carefully about managing their financial risks.

Business responded to this change in the economic environment by returning to the derivatives markets, using them to hedge the risks, whether as a borrower or a lender, of fluctuating interest rates. In the same year that Bretton Woods collapsed, Nixon appointed William Casey, a spy and tax lawyer, as director of the Securities and Exchange Commission (SEC), and the path for the derivatives exchanges was opened. A currency future had been created in New York in 1970, but had foundered. However, when the Merc began trading futures on seven currencies on May 16, 1972, the market for FX risk management was there, and the Merc was rescued from the doldrums of the 1960s.

While futures and forward contracts, firm agreements to buy or sell an asset at a fixed price in the future, existed in an ethical and legal limbo, option contracts, which gave the holder the right, but not the obligation, to trade, were closer to the devil. As late as the 1960s officials of the SEC had compared options to thalidomide and marijuana and claimed that there had never been a case of market manipulation that did not involve options [MacKenzie2008, p 149]. Casey, and the SEC, cleared the Chicago Board Options Exchange, and it opened on April 26, 1973. Within days, the Journal of Political Economy, the house journal of the Chicago economists, published The Pricing of Options and Corporate Liabilities by Fischer Black and Myron Scholes.

The CBOT launched the first interest rate derivative in 1975, with an underlying linked to mortgages; in 1976 the Merc introduced a future on 30-day U.S. Government Treasury bills, and the CBOT launched a future on 30-year U.S. Treasury bonds in 1978. The London International Financial Futures Exchange opened in 1982, and in 1984 the equivalent of a stock-market index for interest rates, the London Interbank Offered Rate, LIBOR, began to be published; futures, known, confusingly, as Eurodollar futures, began to be traded on the Merc based on this index.

It was not only the capital markets that were transformed in the 1970s. Up until 1973 the price of oil was set by the Railroad Commission of Texas, which controlled oil production in Texas, and hence the oil price in the U.S., which, as the world's main oil consumer, effectively set the world price. Following the collapse of Bretton Woods, the U.S. dollar's value fell and, as a consequence, the real income of non-U.S. oil producers fell, and the Organisation of Petroleum Exporting Countries, OPEC, began to price oil in gold and agree production quotas, setting the global price. In 1973 the Middle-Eastern members of OPEC imposed an embargo on the west, following the defeat of the Syrian-Egyptian attack on Israel, and cut production, forcing the price of oil up. This, in turn, prompted the development of alternative oil provinces, notably the North Sea between the U.K. and Norway, which had previously been uneconomic. When this extra production hit the market in the 1980s, just as demand fell, as consumers cut back consumption in response to higher prices, and Iran and Iraq exceeded their quotas to fund their war (1980-1988), prices collapsed along with OPEC's cohesiveness. In 1985 OPEC's price-setting mechanism was abandoned, and another key economic input became a stochastic process; in response, in 1988, the London-based International Petroleum Exchange introduced the Brent oil futures contract.

Behind Black Monday, the failures of LTCM and The Equitable, and the Credit Crisis of 2007-2008 is the fact that the financial world became stochastic in the aftermath of the collapse of Bretton Woods. The derivative markets did not spontaneously appear; they developed in response to the increased uncertainty in key economic drivers.

The world of 1980 was very different to that between 1918 and 1970, when exchange rates, interest rates and commodity prices were controlled by the great and the good. One view might be that the abandonment of Bretton–Woods led to the chaos of 1970s stagflation; another is that Bretton–Woods shattered under the strain of trying to confine the economy to a specific path. The derivative markets emerged in response to the freeing of the economy: the deterministic policies had created impossible stresses in the global economy, and derivatives enabled risk management amid the resulting uncertainty.

The derivative markets were fundamentally different from the stock markets, where decisions were made based on an investor's judgement of the market fundamentals, and the core skills were in understanding economics and a company's balance sheet: finance. The derivative markets were concerned with comparing the price differences between similar assets across different markets. It was not about the study of objects, but the relations between objects, and so the derivative markets needed mathematical skills. In the aftermath of Black Monday significant numbers of applied mathematicians, physicists and engineers began working in the derivative markets; the 'quants' had arrived.
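To make the contrast concrete, the simplest of these relations between objects is put–call parity: a call minus a put on the same stock, strike and expiry must equal the stock price minus the discounted strike, or there is a riskless trade. A sketch, with hypothetical quotes and ignoring dividends and trading costs:

```python
from math import exp

def parity_gap(call, put, spot, strike, T, r):
    """Deviation from put-call parity: (C - P) - (S - K * exp(-r * T)).

    A gap materially different from zero signals a relative-value
    trade: sell the expensive side and buy the cheap side.
    """
    return (call - put) - (spot - strike * exp(-r * T))

# Hypothetical market quotes; a gap near zero means no free lunch.
print(round(parity_gap(call=10.45, put=5.57, spot=100, strike=100, T=1.0, r=0.05), 3))
```

The skill is not in judging whether the stock itself is cheap, but in spotting when two markets disagree about the same object.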

The events of August 2011 are not so different to those of August 1971, but then again these events are not so different to those of the 1690s!

Trading in stocks did not take off in England until after the Glorious Revolution of 1688. It is often assumed that this was because, as Geoffrey Poitras puts it, “William III was accompanied by an influx of Dutch persons and practises”. However, a market does not create itself and, according to Anne Murphy, a historian who has worked as a currency trader, the root cause of the explosion of stock-market trading, and the accompanying boom, was the Nine Years War [Murphy2009, pp 10–14], [Poitras2000, pp 281–285]. Murphy points to a contemporary account written by John Houghton in 1694:
a great many Stocks have arisen since the war with France; for Trade being obstructed at Sea; few that had money were willing it should lie idle [Murphy2009, quoting on p 12]
It was not just the shortage of opportunities to participate in regular trade that stimulated the boom. England had grown wealthy under Charles II, and alongside the increase in wealth was a growth in the financial services industry, involving life insurance and annuities, general insurance and the trade in the shares of the handful of joint-stock companies that existed at the time, such as the East India, Hudson Bay and Royal Africa companies [Murphy2009, pp 12–19]. Evidence of this growth in financial services is provided by the 1673 Act of Common Council that looked to put an end to “usurious contracts, false Chevelance, and other crafty deceits” [Murphy2009, p 83].

The additional risks of sea trade resulting from the war with France, and the actions of privateers, meant that merchants looked for domestic investment opportunities; the number of joint-stock companies exploded, and with more companies came a more active stock market. John Houghton describes how the market worked:
The manner of managing Trade is this; the Monied Man goes amongst the Brokers (which are chiefly upon the Exchange [Alley], and at Jonathan’s Coffee House [the origins of the London Stock Exchange], sometimes at Garraway’s and at some other Coffee Houses) and asks how Stocks go? and upon Information, bids the Broker to buy or sell so many Shares of such and such Stocks if he can, at such and such Prizes [Poitras2000, quoting on p 288].
Brokers put buyers and sellers in touch with each other, for a commission, but without actually taking possession of any asset. Alongside the monied men and brokers were the stock-jobbers or dealers, the speculators, providing liquidity in the market and trying to turn a profit from their trading. This dual system, separating brokers from stock-jobbers, existed in London up until the ‘Big Bang’ of 1986.

In 1719 Daniel Defoe published Robinson Crusoe and wrote an article, The Anatomy of Exchange Alley, in which he described stockjobbing as
a trade founded in fraud, born of deceit, and nourished by trick, cheat, wheedle, forgeries, falsehoods, and all sorts of delusions; coining false news, this way good, this way bad; whispering imaginary terrors, frights, hopes, expectations, and then preying upon the weakness of those whose imaginations they have wrought upon [Poitras2000, quoted on p 290]
An observation mentioned by Defoe, but more explicitly stated by Thomas Mortimer in 1761, concerned the type of person involved in stockjobbing. Mortimer makes the point that there are three types of stock-jobber: firstly, foreigners; secondly, gentry, merchants and tradesmen; and finally, and “by far the greatest number”, people
with very little, and often, no property at all in the funds, who job in them on credit, and transact more business in several government securities in one hour, without having a shilling of property in any of them, than the real proprietors of thousands transact in several years. [Poitras2000, quoted on p 291]
It was not only stocks that were being traded in the first half of the 1690s. Murphy estimates that around 40% of the trades between 1692 and 1695 were in stock options, which were being traded in order to manage the risks of stock trading [Murphy2009, pp 24–30]. Evidence of the widespread use of options comes in 1720, when Colley Cibber, who would become Poet Laureate in 1730, wrote a play, The Refusal (‘The Option’), describing the action in Exchange Alley:
There you’ll see a duke dangling after a director; here a peer and ‘prentice haggling  for an eighth; there a Jew and a parson making up the differences; there a young woman of quality buying bears of a Quaker; and there an old one selling refusals to a lieutenant of grenadiers [Ackroyd2001, p 308]
Clearly, in 1720 the public were familiar with the trading of derivatives; pretty much everyone was involved, and social, religious and political differences were forgotten in the markets.

The stock market boom that started in the late 1680s had gone bust by the middle of the next decade. At the time it was popular to blame stock-jobbers for destabilising the economy by either ramping worthless stock or undermining a going concern (depending on your point of view) [Murphy2009, p 33], while, more fundamentally, the stock market was attacked for turning “men away from honest and beneficial trades” [Murphy2009, p 68]. More rational explanations were that many of the joint-stock companies were mismanaged and that the government’s need for cash sucked funds out of the market, causing prices to collapse [Murphy2009, p 35].

Anyone born before 1970 who has not actually worked in the markets will find it difficult to understand the changes in the financial environment that occurred after the collapse of Bretton–Woods. Derivatives did not start appearing on undergraduate courses in universities until the mid 1990s, and even then, for many of the lecturers in economics and finance educated in the post-war deterministic economies, they were unfamiliar beasts. This meant that at the turn of the century there was a dire shortage of people with the skills to understand the complex world of derivatives [Tett2009, p 68], which required a unique grasp of financial theory, market practices, applied mathematics, probability and statistics.

At the same time, some banks chose business managers as their chief executives, such as Fred Goodwin at the Royal Bank of Scotland or Andy Hornby at Halifax–Bank of Scotland, rather than ‘bankers’ brought up on the basic process, and the uncertainties, of converting credit into cash. The consequence of this was that some banks focused on efficient profit generation, the allocation of scarce resources, at the expense of monitoring the risks of their activities, managing an uncertain world. This manifested itself in firms buying in expertise, in the form of ‘black-box’ software systems to value CDOs, combined with external (or internal) consultants for advice. Some banks were not only out-sourcing their call centres, but their brains as well.

In the lead-up to the Crisis of 2007–2008, RBS sponsored sports stars to the tune of 200 million pounds; in the same period it invested nothing in mathematics. J.P. Morgan was different: it developed Value at Risk and CreditMetrics in-house, and employed David Li as he thought about using copulas to price CDOs. The quants that they employed were able to develop these tools because they had a deep understanding of the markets and mathematics, and, critically, of how and where the mathematical models were weak and needed to be augmented. Not only did J.P. Morgan recognise the need for these skills, a feature shared by all the serious investment banks, but in disseminating their models they advertised that their expertise was a fundamental component of “first class banking in a first class way”.
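Value at Risk, in its simplest historical-simulation form, is easy to state: look at the past distribution of daily profit and loss and report the loss exceeded on only, say, 1% of days. A generic sketch (this is not J.P. Morgan's RiskMetrics methodology itself, and the data is simulated for illustration):

```python
import numpy as np

def historical_var(pnl_history, confidence=0.99):
    """One-day Value at Risk by historical simulation.

    Returns the loss threshold exceeded on only (1 - confidence)
    of the days in the supplied profit-and-loss history.
    """
    return -np.percentile(pnl_history, 100 * (1 - confidence))

# Simulated daily P&L for a trading book, in millions.
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1.5, size=1000)
print(f"99% one-day VaR: {historical_var(pnl):.2f}m")
```

The point of the J.P. Morgan story is not the arithmetic, which is trivial, but that the bank employed people who understood when such a number could, and could not, be trusted.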

The benefit of J.P. Morgan’s approach is described by Gillian Tett in her account of the Credit Crisis, Fool’s Gold. The background: it is 2005–2006, and J.P. Morgan’s shareholders are putting the bank’s managers under intense pressure to mimic the revenues being reported by other investment banks, who were actively investing in CDOs of mortgage-backed securities (MBS).
[The J.P. Morgan chief executive] made it clear that he wanted a mortgage production line, so Winters had duly asked his staff to re-examine how to create a profitable business selling mortgage-based CDOs.
When they crunched the numbers, though, they ran into a problem. “There doesn’t seem to be a way to make money on these structures,” Brian Zeitlin, one of the bankers who worked in the CDO division, reported. …
Reluctantly, Winters told the J.P. Morgan management [that it] should not open the spigots on its pipeline after all. The decision was greatly frustrating, though. The other banks were pushing JPMorgan Chase further and further down the league tables, largely due to the bonanza from their mortgage pipelines. So were they just ignoring the risks? Or had they found some alchemy that made the economics of their machines work? [Tett2009, pp 148–151]
The J.P. Morgan quants had taken the prices the traders were observing in the market and reverse engineered them, just as they did with the Black–Scholes pricing formula, extracting the key parameter, ρ. When they told the traders that the basis of the prices in the market was ρ = 0.3, the traders could not believe it. A correlation of ρ = 0.3 implies that there is only a small linkage between defaults, reflecting the fact that if BP went bust it did not mean that Tesco would follow suit. But anyone who had seen an economic downturn would be familiar with whole streets being derelict; the correlation of mortgage defaults was unknown, but it was not insignificant.
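The effect of ρ is easy to demonstrate. Under a Gaussian copula, two names default together when two correlated standard normal variables both fall below their default thresholds, so the joint default probability is a bivariate normal probability. A sketch, with invented 5% marginal default probabilities:

```python
from scipy.stats import norm, multivariate_normal

def joint_default_prob(p1, p2, rho):
    """P(both names default) under a Gaussian copula.

    Each name defaults when its latent standard normal variable
    falls below norm.ppf(p); the latents have correlation rho.
    """
    thresholds = [norm.ppf(p1), norm.ppf(p2)]
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(thresholds)

# Invented 5% one-year default probabilities for two mortgages.
for rho in (0.0, 0.3, 0.9):
    print(rho, round(joint_default_prob(0.05, 0.05, rho), 4))
```

With independent defaults the joint probability is just 0.05 × 0.05 = 0.25%; it climbs steeply as ρ rises towards the ‘derelict streets’ of a downturn, which is why the market’s implied ρ = 0.3 struck those who had seen a downturn as dangerously low.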

There is another aspect to the approach taken by the banks that weathered the storms of 2007–2008. While it is frequently claimed that the Credit Crisis was a global phenomenon, it was not. Asian banks were unaffected, and British and American banks suffered far more than French or German banks. The explanation that these banks were simply not as involved as RBS or Merrill Lynch is not satisfactory (BNP Paribas, SocGen and Deutsche Bank were all heavily involved in credit derivatives [FCIC2011, Fig 20.4, for example]). U.S. bankers have been known to suggest that the non-Anglo-Saxon banks played fast and loose with accounting rules, not declaring their losses on credit derivatives. The response to this criticism from French mathematicians is typically Gallic, and Cartesian: “They thought the models were wrong before August 2007, they were certain they were wrong after August 2007, so why should they post losses that they were certain were wrong?” This point was highlighted by a quote that appeared in The Economist in January 2008, in relation to a fraud at the French bank SocGen:
In common with other French banks, SocGen was also thought by many to take  an overly mathematical approach to risk. “ ‘It may work in practice but does it work in theory?’ is the stereotype of a French bank,” says one industry consultant. (‘No Defense’, The Economist, 31 January 2008.)

The bankers at J.P. Morgan, along with French risk–managers, kept in mind Hume’s observation  that “it is never contradictory to deny matter of fact”. Bankers, like all scientists, must use their intellect and constantly ask themselves the questions ‘why’ and ‘how’ to give them the foresight not to act recklessly.

The common thread linking the financial crises since Bretton–Woods (Black Monday, the Equitable, the super-portfolio that brought down LTCM, the tech bubble of 2000 and the Credit Crisis) was not the collapse of Bretton–Woods itself but the adoption of standardised approaches to finance. Had the majority of traders, bankers and regulators thought like mathematicians, or French risk managers, and asked themselves, in a Cartesian manner, “how do I know what I think I know is true?”, then the crises might have been avoided. Banning speculative trading in derivatives is simplistic, and not the answer.

References

P. Ackroyd. London: The Biography. Vintage, 2001.

FCIC. The Financial Crisis Inquiry Report. Technical report, The National Commission on the Causes of the Financial and Economic Crisis in the United States, 2011.

D. MacKenzie. An Engine, Not a Camera: How Financial Models Shape Markets. The MIT Press, 2008.

A. L. Murphy. The Origins of English Financial Markets. Cambridge University Press, 2009.

G. Poitras. The Early History of Financial Economics, 1478–1776. Edward Elgar, 2000.

G. Tett. Fool’s Gold. Little, Brown, 2009.