Wednesday, 26 September 2012

The Fundamental Theorem of Asset Pricing

This is now published, open access.

Within the field of Financial Mathematics, the Fundamental Theorem of Asset Pricing consists of two statements (e.g. [Shreve2004, Section 5.4]):

Theorem: The Fundamental Theorem of Asset Pricing
1. A market admits no arbitrage, if and only if, the market has a martingale measure.
2. The martingale measure is unique, if and only if, every contingent claim can be hedged.

The theorem emerged between 1979 and 1983 ([Harrison and Kreps1979], [Harrison and Pliska1981], [Harrison and Pliska1983]) as Michael Harrison sought to establish a mathematical theory underpinning the well-established Black-Scholes equation for pricing options. One remarkable feature of the Fundamental Theorem is its lack of mathematical notation, a contrast highlighted by the symbol-laden Black-Scholes equation, which came out of economics. Despite its non-mathematical appearance, the work of Harrison and his collaborators opened finance to investigation by functional analysts (such as [Schachermayer1984]) and by 1990 any mathematician working on asset pricing would have to do so within the context of the Fundamental Theorem.

The use of the term ‘probability measure’ places the Fundamental Theorem within the mathematical theory of probability formulated by Andrei Kolmogorov in 1933 ([Kolmogorov1933 (1956)]). Kolmogorov’s work took place in a context captured by Bertrand Russell, who in 1927 observed that
It is important to realise the fundamental position of probability in science. …As to what is meant by probability, opinions differ. Russell [1927 (2009), p 301]
The significance of probability in providing the basis of statistical inference in empirical science had been generally understood since Laplace. In the 1920s the idea of randomness, as distinct from a lack of information, the absence of Laplace’s Demon, was becoming significant. In 1926 the physicist Max Born was “inclined to give up determinism”, to which Einstein responded with “I for one am convinced that [God] does not play dice” [von Plato1994, pp 147–157]. Outside the physical sciences, Frank Knight, in Risk, Uncertainty and Profit, argued that uncertainty, a consequence of randomness, was the only true source of profit, since if a profit was predictable the market would respond and make it disappear (Knight [1921 (2006), III.VII.1–4]). Simultaneously, in his Treatise on Probability, John Maynard Keynes observed that in some cases cardinal probabilities could be deduced, in others ordinal probabilities (one event being more or less likely than another) could be inferred, but the largest class of problems was not reducible to the conventional concept of probability ([Keynes1972, Ch XXIV, 1]). Keynes would place this inability to precisely define a numerical probability at the heart of his economics ([Skidelsky2009, pp 84–90]).

Two mathematical theories had become ascendant by the late 1920s. Richard von Mises, an Austrian engineer linked to the Vienna Circle of logical positivists, and brother of the economist Ludwig, attempted to lay down the axioms of probability based on observable facts, within a framework of Platonic Realism. The result was published in German in 1931, popularised in English as Probability, Statistics and Truth, and is now regarded as a key justification of the frequentist approach to probability.

To balance von Mises’ Realism, the Italian actuary Bruno de Finetti presented a more Nominalist approach. De Finetti argued that “Probability does not exist” because it was only an expression of the observer’s view of the world. De Finetti’s subjectivist approach was closely related to the less well-known position taken by Frank Ramsey, who, in 1926, wrote Truth and Probability, in which he argued that probability was a measure of belief. Ramsey’s argument was well received by his friend and mentor John Maynard Keynes, but his early death hindered its development.

While von Mises and de Finetti took an empirical path, Kolmogorov used mathematical reasoning to define probability. Kolmogorov wanted to address the key issue for physics at the time: following the work of Montmort and de Moivre in the first decade of the eighteenth century, probability had been associated with counting events and comparing relative frequencies. This had been coherent until mathematics became focused on infinite sets at the same time as physics became concerned with statistical mechanics, in the second half of the nineteenth century. Von Mises had tried to address these issues, but his analysis was weak in dealing with the infinite sets that came with continuous time. As Jan von Plato observes,
von Mises’s theory of random sequences has been remembered as something to be criticized: a crank semi-mathematical theory serving as a warning of the state of probability [at the time] von Plato [1994, p 180]

In 1902 Lebesgue had redefined the mathematical concept of the integral in terms of abstract ‘measures’ in order to accommodate new classes of mathematical functions that had emerged in the wake of Cantor’s transfinite sets. Kolmogorov made the simple association of these abstract measures with probabilities, solving von Mises’ problem of having to deal with infinite sets in an ad hoc manner. As a result, Kolmogorov identified a random variable with a function and an expectation with an integral: probability became a branch of Analysis, not Statistics.

Kolmogorov’s work was initially well received, but slow to be adopted. One contemporary American reviewer noted it was an important proof of Bayes’ Theorem ([Reitz1934]), then still controversial (Keynes [1972, Ch XVI, 13]) but now a cornerstone of statistical decision making. Amongst English-speaking mathematicians, the American Joseph Doob was instrumental in promoting probability as measure ([Doob1941]) while the full adoption of the approach followed its advocacy by Doob and William Feller at the First Berkeley Symposium on Mathematical Statistics and Probability in 1945–1946.

While measure-theoretic probability is a rigorous theory, outside pure mathematics it is seen as redundant. Von Mises criticised it as unnecessarily complex ([von Mises1957 (1982), p 99]), while the statistician Maurice Kendall argued that measure theory was fine for mathematicians but of limited practical use to statisticians, since it fails “to found a theory of probability as a branch of scientific method” ([Kendall1949, p 102]). More recently the physicist Edwin Jaynes has championed Leonard Savage’s subjectivism as having a “deeper conceptual foundation which allows it to be extended to a wider class of applications, required by current problems of science” in comparison with measure theory ([Jaynes2003, p 655]). Furthermore, in 2001 two mathematicians, Glenn Shafer and Vladimir Vovk, a former student of Kolmogorov, proposed an alternative to measure-theoretic probability, ‘game-theoretic probability’, because the novel approach “captures the basic intuitions of probability simply and effectively” ([Shafer and Vovk2001]). Seventy-five years on, Russell’s enigma appears to be no closer to resolution.

The issue around the ‘basic intuition’ of measure-theoretic probability for empirical scientists can be accounted for as a lack of physicality. Frequentist probability is based on the act of counting, subjectivist probability is based on a flow of information, whereas measure-theoretic probability is based on an abstract mathematical object unrelated to phenomena. Specifically, in the Fundamental Theorem the ‘martingale measure’ is a probability measure, usually labelled Q, such that the price of an asset today, X_0, is the expectation, under the martingale measure, of the discounted asset price in the future, X_T:

X_0 = E_Q[X_T].

Given a current asset price, X_0, and a set of future prices, X_T, the probability distribution is defined such that this equality holds, and so is forward looking, in that it is based on current and future prices. The only condition placed on the relationship that the martingale measure has with the ‘natural’, or ‘physical’, probability measure, inferred from historical price changes and usually labelled P, is that they agree on what is possible.
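As a minimal, self-contained sketch of this definition (the numbers below are hypothetical and not drawn from the discussion above), the two conditions, the pricing equality and agreement on what is possible, can be checked directly in a discrete setting:

```python
# Sketch: check whether a candidate measure Q is a martingale measure for a
# one-period asset with discounted future prices X_T. All numbers are hypothetical.

X0 = 100.0                   # current asset price
X_T = [120.0, 100.0, 80.0]   # possible discounted prices at the end of the period
Q = [0.25, 0.50, 0.25]       # candidate martingale measure
P = [0.40, 0.40, 0.20]       # 'natural' measure, e.g. inferred from history

# Martingale condition: X0 is the expectation of X_T under Q.
expectation_Q = sum(q * x for q, x in zip(Q, X_T))
is_martingale_measure = abs(expectation_Q - X0) < 1e-9

# Agreement on what is possible: an outcome has zero probability under Q
# exactly when it has zero probability under P.
agree_on_possible = all((p == 0) == (q == 0) for p, q in zip(P, Q))

print(expectation_Q, is_martingale_measure, agree_on_possible)  # 100.0 True True
```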

The term ‘martingale’ in this context derives from doubling strategies in gambling; it was introduced into mathematics by Jean Ville in 1939, in a critique of von Mises’ work, to label a random process in which the value of the random variable at a specific time is the expected value of the random variable in the future. The idea that asset prices have the martingale property was first proposed by Benoit Mandelbrot ([Mandelbrot1966]) in response to an early formulation of Eugene Fama’s Efficient Market Hypothesis (EMH) ([Fama1965]), the two concepts being combined by Fama in 1970 ([Fama1970]). For Mandelbrot and Fama the key consequence of prices being martingales was that the current price was, statistically, independent of the future price distribution: technical analysis of markets was charlatanism. In the development of the EMH there is no discussion of the nature of the probability measure under which assets are martingales, and it is often assumed that the expectation is calculated under the natural measure.

Arbitrage, a word derived from ‘arbitration’, has long been a subject of financial mathematics. In Chapter 9 of his 1202 text advising merchants, the Liber Abaci, Fibonacci discusses ‘Barter of Merchandise and Similar Things’,
20 arms of cloth are worth 3 Pisan pounds and 42 rolls of cotton are similarly worth 5 Pisan pounds; it is sought how many rolls of cotton will be had for 50 arms of cloth. ([Sigler2002, p 180])

In this case there are three commodities, arms of cloth, rolls of cotton and Pisan pounds, and Fibonacci solves the problem by having Pisan pounds ‘arbitrate’ between the other two commodities.

Over the centuries this technique of pricing through arbitration evolved into the law of one price, that if two assets offer identical cash flows then they must have the same price. This was employed by Jan de Witt in 1671 when he solved the problem of pricing life annuities in terms of redeemable annuities, based on the presumption that
the real value of certain expectations or chances of objects, of different value, should be estimated by that which we can obtain from as many expectations or chances dependent on one or several equitable contracts. [Sylla2003, p 313, quoting De Witt, The Worth of Life Annuities in Proportion to Redeemable Bonds]

In 1908 the Croatian mathematician Vincenz Bronzin published a text which discusses pricing derivatives by ‘covering’, or hedging, them with portfolios of options and forward contracts, employing the principle of ‘equivalence’, the law of one price ([Zimmermann and Hafner2007]). In 1965 the functional analyst and probabilist Edward Thorp collaborated with a post-doctoral mathematician, Sheen Kassouf, and combined the law of one price with basic techniques of calculus to identify market mis-pricing of warrants, at the time a widely traded form of stock option. In 1967 they published their methodology in a best-selling book, Beat the Market ([MacKenzie2003]).

Within economics, the law of one price was developed in a series of papers between 1954 and 1964 by Kenneth Arrow, Gerard Debreu and Lionel McKenzie in the context of general equilibrium. In his 1964 paper Arrow addressed the issue of portfolio choice in the presence of risk and introduced the concept of an Arrow Security, an asset that would pay out ‘1’ in a specific future state of the economy and zero in all other states; by the law of one price, all commodities could be priced in terms of these securities ([Arrow1964]). The work of Fischer Black, Myron Scholes and Robert Merton ([Black and Scholes1973]) employed this principle and presented a mechanism for pricing warrants on the basis that “it should not be possible to make sure profits”, the famous Black-Scholes equation being the result.

In the context of the Fundamental Theorem, ‘an arbitrage’ is the ability to formulate a trading strategy such that the probability, whether under P or Q, of a loss is zero, but the probability of a profit is positive. This definition is important following Hardie’s criticism of the way the term is applied loosely in economic sociology ([Hardie2004]). The obvious point of this definition is that, unlike Hardie’s definition [Hardie2004, p 243], there is no guaranteed (strictly positive) profit; however, there is also a subtle technical point: with an infinite set of outcomes there is no guarantee that a loss cannot occur, only that it occurs with probability zero. This is equivalent to the observation that there is no guarantee that an infinite number of monkeys with typewriters will, given enough time, come up with a work of Shakespeare: it is only that we expect them to do so. This observation explains the caution in the use of infinite sets taken by mathematicians such as Poincaré, Lebesgue and Brouwer.

To understand this meaning of arbitrage, consider the most basic case of a single-period economy, consisting of a single asset whose price, X_0, is known at the start of the period and which can take on one of two (present) values, X_T^U > X_T^D, representing two possible states of the economy at the end of the period. In this case an arbitrage would exist if X_T^U > X_T^D ≥ X_0: buying the asset now would lead to a possible profit at the end of the period, with the guarantee of no loss. Similarly, if X_0 ≥ X_T^U > X_T^D, short selling the asset now and buying it back at the end of the period would also lead to an arbitrage.

In summary, for there to be no arbitrage opportunities we require that

X_T^D < X_0 < X_T^U.

This implies that there is a real number q, with 0 < q < 1, such that

X_0 = X_T^D + q(X_T^U - X_T^D)
    = q X_T^U + (1 - q) X_T^D
    = E_q[X_T],

and it can be seen that q represents a measure-theoretic probability that the economy ends in the U state.
With this in mind, the first statement of the Fundamental Theorem can be interpreted simply as “the price of an asset must lie between its maximum and minimum possible (discounted) future price”. If X_0 < X_T^D we have that q < 0, whereas if X_T^U < X_0 then q > 1, and in both cases q does not represent a probability measure, which, by definition, must lie between 0 and 1. In this simple case there is a trivial intuition behind measure-theoretic probability: the martingale measure and the absence of arbitrage are a simple tautology.
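This tautology can be made concrete with a few lines of arithmetic. The sketch below uses hypothetical prices to recover q from the prices above and shows that it is a probability exactly when the current price lies strictly between the two possible future values:

```python
def martingale_probability(X0, XU, XD):
    """Solve X0 = q*XU + (1 - q)*XD for q in a one-period, two-state economy."""
    return (X0 - XD) / (XU - XD)

# Hypothetical prices with no arbitrage: XD < X0 < XU, so q lies in (0, 1).
q = martingale_probability(X0=100.0, XU=120.0, XD=90.0)
print(q, 0 < q < 1)   # 0.333..., True

# If the current price falls outside the range of future prices, q is no longer
# a probability, which signals the presence of an arbitrage.
print(martingale_probability(X0=85.0, XU=120.0, XD=90.0))    # q < 0
print(martingale_probability(X0=125.0, XU=120.0, XD=90.0))   # q > 1
```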

To appreciate the meaning of the second statement of the theorem, consider the situation when the economy can take on three states at the end of the time period, not two. If we label the possible future asset prices as X_T^U > X_T^M > X_T^D, we cannot deduce a unique set of probabilities 0 ≤ q_U, q_M, q_D ≤ 1, with q_U + q_M + q_D = 1, such that

X_0 = q_U X_T^U + q_M X_T^M + q_D X_T^D.
The market still precludes arbitrage, but we no longer have a unique probability measure under which asset prices are martingales, and so we cannot derive unique prices for other assets in the market. In the context of the law of one price, we cannot hedge, replicate or cover a position in the market so as to make it riskless; in terms of Arrow’s work, the market is incomplete. This explains the sense of the second statement of the Fundamental Theorem, and it is important in that it tells the mathematician that in the real world of imperfect knowledge and transaction costs, a model within the Theorem’s framework cannot give a precise price.
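The incompleteness can be seen numerically. In the sketch below (all prices hypothetical), the martingale condition and the requirement that the weights sum to one leave one free parameter, so there is a whole family of martingale measures, and each member of the family assigns a different price to an Arrow security paying 1 in the U state:

```python
# Sketch: a one-period, three-state economy has a family of martingale measures,
# so a contingent claim does not have a unique arbitrage-free price.
# All prices are hypothetical; the claim is an Arrow security paying 1 in state U.

X0 = 100.0
XU, XM, XD = 120.0, 100.0, 80.0

def martingale_measure(qU):
    """For a chosen qU, solve qU*XU + qM*XM + qD*XD = X0 with qU + qM + qD = 1."""
    qD = (X0 - XM - qU * (XU - XM)) / (XD - XM)
    qM = 1.0 - qU - qD
    return qU, qM, qD

for qU in (0.1, 0.2, 0.3, 0.4):
    qU_, qM, qD = martingale_measure(qU)
    claim_price = qU_ * 1.0 + qM * 0.0 + qD * 0.0   # expected payoff under this measure
    print((round(qU_, 2), round(qM, 2), round(qD, 2)), claim_price)

# Every line is a valid martingale measure (weights between 0 and 1, summing to 1,
# and reproducing X0), yet the implied price of the claim differs in each case.
```

Pinning the price down would require either more traded assets to hedge with, or an additional modelling choice, which is the point made in the next paragraph.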

Most models employed in practice ignore the impact of transaction costs, on the utopian basis that precision will improve as market structures evolve and transaction costs disappear. Situations where there are many, possibly infinitely many, prices at the end of the period are handled by providing a model for the asset price dynamics between times 0 and T. The choice of asset price dynamics defines the distribution of X_T, either under the martingale or the natural probability measure, and in making the choice of asset price dynamics, the derivative price is chosen. This effect is similar to the choice of utility function determining the results of models in some areas of welfare economics.
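As an illustration of how choosing the dynamics chooses the price (a sketch with hypothetical contract parameters, using the standard Black-Scholes formula rather than anything specific to the discussion above): under assumed lognormal dynamics the European call price follows from the martingale measure, and varying the assumed volatility varies the price.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S0, K, T, r, sigma):
    """European call price under assumed lognormal (Black-Scholes) dynamics."""
    N = NormalDist().cdf
    d1 = (log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S0 * N(d1) - K * exp(-r * T) * N(d2)

# The same (hypothetical) contract priced under three different assumed dynamics:
for sigma in (0.1, 0.2, 0.4):
    price = black_scholes_call(S0=100.0, K=100.0, T=1.0, r=0.05, sigma=sigma)
    print(f"volatility {sigma:.0%}: call price {price:.2f}")
# Changing the assumed volatility changes the price: choosing the dynamics chooses the price.
```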

The Fundamental Theorem is not well known outside the limited field of financial mathematics; practitioners focus on the models that are a consequence of the Theorem, whereas social scientists focus on the original Black-Scholes-Merton model as an exemplar. Practitioners are daily exposed to the imprecision of the models they use and are skeptical, if not dismissive, of their validity ([Miyazaki2007, pp 409–410], [MacKenzie2008, p 248], [Haugh and Taleb2009]). Following the market crash of 1987, few practitioners used the Black-Scholes equation to actually ‘price’ options; rather, they used the equation to measure market volatility, a proxy for uncertainty.

However, the status of the Black-Scholes model as an exemplar in financial economics has been enhanced following the adoption of measure-theoretic probability, and this can be understood because the Fundamental Theorem, born out of Black-Scholes-Merton, unifies a number of distinct theories in financial economics. MacKenzie ([MacKenzie2003, p 834]) describes a dissonance between Merton’s derivation of the model (Merton [1973]), using techniques from stochastic calculus, and Black’s, based on the Capital Asset Pricing Model (CAPM) (Black and Scholes [1973]). When measure-theoretic probability was introduced, it was observed that the Radon-Nikodym derivative, the mathematical object that describes the relationship between the stochastic processes Merton used under the natural measure and under the martingale measure, involves the market price of risk (Sharpe ratio), a key object in the CAPM. This point was well understood in the academic literature in the 1990s and was introduced into the fourth edition of the standard textbook, Hull’s Options, Futures and Other Derivatives, in 2000.
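To make the link concrete, here is the standard textbook formulation (a sketch of the usual result, not a quotation from the sources above): if the asset follows geometric Brownian motion with drift μ and volatility σ under the natural measure P, with W the Brownian motion driving prices under P, and r is the risk-free rate, the Radon-Nikodym derivative taking P to the martingale measure Q is

```latex
\left.\frac{d\mathbb{Q}}{d\mathbb{P}}\right|_{\mathcal{F}_T}
  = \exp\!\left(-\lambda W_T - \tfrac{1}{2}\lambda^2 T\right),
  \qquad \lambda = \frac{\mu - r}{\sigma},
```

so the change of measure is governed by the Sharpe ratio λ, the market price of risk that sits at the heart of the CAPM.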

The realisation that the Fundamental Theorem unified Merton’s approach, based on the stochastic calculus advocated by Samuelson at M.I.T.; CAPM, which had been developed at the Harvard Business School and in California; martingales, a feature of efficient markets that had been proposed at Chicago; and incomplete markets, from Arrow and Debreu in California, enhanced the status of Black-Scholes-Merton as representing a Kuhnian paradigm. This unification of a plurality of techniques within a ‘theory of everything’ came just as the Black-Scholes equation was coming under attack for not reflecting empirical observations of market prices and obituaries were being written for the broader neoclassical programme ([Colander2000]), and it can explain why, in 1997, the Nobel Prize in Economics was awarded to Scholes and Merton “for a new method to determine the value of derivatives”.

The observation that measure theoretic probability unified a ‘constellation of beliefs, values, techniques’ in financial economics can be explained in terms of the transcendence of mathematics. To paraphrase Tait ([Tait1986, p 341])
A mathematical proposition is about a certain structure, financial markets. It refers to prices and relations among them. If it is true, it is so in virtue of a certain fact about this structure. And this fact may obtain even if we do not or cannot know that it does.
In this sense, the Fundamental Theorem confirms the truth of the EMH, or any of the other ‘facts’ that go into the proposition. It becomes doctrine that more (derivative) assets need to be created in order to complete markets, or, as Miyazaki observes ([Miyazaki2007, p 404]), that speculative activity, as arbitration, is essential for market efficiency.

However, this relies on the belief in the transcendence of mathematics. If mathematics is a human construction, the conclusion does not hold.


References


   K. J. Arrow. The role of securities in the optimal allocation of risk-bearing. The Review of Economic Studies, 31(2):91–96, 1964.
   F. Black and M. Scholes. The pricing of options and corporate liabilities. Journal of Political Economy, 81(3):637–654, 1973.
   D. Colander. The death of neoclassical economics. Journal of the History of Economic Thought, 22(2):127, 2000.
   J.L. Doob. Probability as measure. The Annals of Mathematical Statistics, 12(2):206–214, 1941.
   E. F. Fama. The behavior of stock–market prices. The Journal of Business, 38(1):34–105, 1965.
   E. F. Fama. Efficient capital markets: A review of theory and empirical work. The Journal of Finance, 25(2):383–417, 1970.
   I. Hardie. ‘The sociology of arbitrage’: a comment on MacKenzie. Economy and Society, 33(2):239–254, 2004.
   J. M. Harrison and D. M. Kreps. Martingales and arbitrage in multiperiod securities markets. Journal of Economic Theory, 20:381–401, 1979.
   J. M. Harrison and S. R. Pliska. Martingales and stochastic integrals in the theory of continuous trading. Stochastic Processes and their Applications, 11:215–260, 1981.
   J. M. Harrison and S. R. Pliska. A stochastic calculus model of continuous trading: complete markets. Stochastic Processes and their Applications, 15:313–316, 1983.
   E. G. Haugh and N. N. Taleb. Why we have never used the Black–Scholes–Merton option pricing formula. 2009.
   E. T. Jaynes. Probability Theory: The Logic of Science. Cambridge University Press, 2003.
   M. G. Kendall. On the reconciliation of theories of probability. Biometrika, 36(1/2): 101–116, 1949.
   J. M. Keynes. The collected writings of John Maynard Keynes. Vol. 8 : Treatise on probability. Macmillian, 1972.
   F. H. Knight. Risk, Uncertainty, and Profit. Hart, Schaffner & Marx (Cosimo), 1921 (2006).
   A. N. Kolmogorov. Foundations of the Theory of Probability. Julius Springer (Chelsea), 1933 (1956).
   D. MacKenzie. An equation and its worlds: Bricolage, exemplars, disunity and performativity in financial economics. Social Studies of Science, 33(6):831–868, 2003.
   D. MacKenzie. An Engine, Not a Camera: How Financial Models Shape Markets. The MIT Press, 2008.
   M. S. Mahoney. The Mathematical Career of Pierre de Fermat, 1601–1665. Princeton University Press, 1994.
   B. Mandelbrot. Forecasts of future prices, unbiased markets and ”martingale” models. The Journal of Business, 39(1, Supplement on Security Prices):242–255, 1966.
   R. C. Merton. Theory of rational option pricing. The Bell Journal of Economics and Management Science, 4(1):141–183, 1973.
   H. Miyazaki. Between arbitrage and speculation: an economy of belief and doubt. History of Political Economy, 36(3):369–415, 2007.
   H.L. Reitz. Review of Grundbegriffe der Wahrscheinlichkeitsrechnung. Bulletin of the American Mathematical Society, 40(7):522–523, 1934.
   B. Russell. An Outline of Philosophy. George Allen & Unwin (Routledge), 1927 (2009).
   W. Schachermayer. Die Überprüfung der Finanzierbarkeit der Gewinnbeteiligung. Mitteilungen der Aktuarvereinigung Österreichs, 2:13–30, 1984.
   G. Shafer and V. Vovk. Probability and Finance: It’s Only a Game! Wiley, 2001.
   S. E. Shreve. Stochastic Calculus for Finance II: Continuous-Time Models. Springer, 2004.
   L. E. Sigler. Fibonacci’s Liber Abaci. Springer-Verlag, 2002.
   R. Skidelsky. Keynes, The Return of the Master. Allen Lane, 2009.
   E. D. Sylla. Business ethics, commercial mathematics, and the origins of mathematical probability. History of Political Economy, 35:309–337, 2003.
   W. W. Tait. Truth and proof: The Platonism of mathematics. Synthese, 69(3):341–370, 1986.
   R. von Mises. Probability, statistics and truth. Allen & Unwin (Dover), 1957 (1982).
   J. von Plato. Creating Modern Probability. Cambridge University Press, 1994.
   H. Zimmermann and W. Hafner. Amazing discovery: Vincenz Bronzin’s option pricing models. Journal of Banking and Finance, 31:531–546, 2007.

Thursday, 13 September 2012

Parade's End

The BBC's transmission of Ford Madox Ford's Parade's End (a co-production with HBO, adapted by Tom Stoppard, who has a good appreciation of mathematics - Rosencrantz and Guildenstern Are Dead) reminded me that the central character, Christopher Tietjens, was an actuary, probably the most famous actuary in English literature.

I came to Ford through his collaborator Joseph Conrad; a teenage interest in Coppola's Apocalypse Now led me to Conrad, and a love of sailing inspired me to read all his novels while an undergraduate. I read the first two books of Parade's End after graduating.

I always thought Ford made Tietjens an actuary to highlight his fidelity, his trustworthiness.  Statistics provides the foundation for our belief, our faith, in science; that is why Bertrand Russell (following Poincaré) observed
It is important to realise the fundamental position of probability in science. ... As to what is meant by probability, opinions differ (p 301, An Outline of Philosophy)
around the same time Ford was writing Parade's End.  So, in making Tietjens an actuary, and the Second Wrangler of his year, Ford is emphasising Tietjens's faithfulness, which is most obvious in his relations with his adulterous wife, Sylvia, and the more compatible Valentine Wannop.  Tietjens's character is magnified by placing him alongside the less virtuous, but more successful, MacMaster.

The trajectory of Tietjens's career (the trauma of serving at the front impacted his work as a mathematician) echoes the real-life experience of Émile Borel.  Borel was the star of his generation of French mathematicians.  His 1894 thesis laid the foundations for modern probability theory, and within 16 years he had been appointed Deputy Director of the most prestigious of the French 'grandes écoles', the École normale supérieure.  Borel served in the war, but, more significantly, his adopted son was killed at the front.  After the war Borel pre-empted von Neumann's work in Game Theory and established the Institut de Statistiques de l'Université de Paris, but after 1924 he abandoned mathematics for politics, serving as a minister and then, in his seventies, being active in the Resistance.  In his lifetime he was awarded the Croix de Guerre, the Médaille de la Résistance with rosette, and the Grand Croix of the Légion d'Honneur.

When thinking about this, I remembered that Tietjens, as a statistician, had a series of battles with government regarding the manipulation of figures.  This suggests that not much has changed: governments are still accused of doctoring the numbers, and modern statisticians still worry about public faith in their figures.

However, in one respect the situation for mathematics now is worse than it was when Parade's End was written.  Would a contemporary author of Ford's prestige choose to make a character a mathematician to emphasise their virtue?  It suggests that the reputation of mathematics has been in decline since the 1920s, when Russell and Ford were writing.  Could this be related to G. H. Hardy's 1940 statement
I have never done anything ‘useful’. No discovery of mine has made, or is likely to make, directly or indirectly, for good or ill, the least difference to the amenity of the world. (p 49, A Mathematician's Apology)
Hardy's autobiography is significantly different from Borel's.  Since that time, British (pure) mathematics, which encompasses probability, has abandoned the world that Tietjens and Borel lived in and isolated itself in academic cloisters, to everyone's detriment.

As a footnote, Hardy, who is credited with introducing continental 'rigour' (rigour mortis?) into British mathematics, opposed the Cambridge Mathematical Tripos, on which the Wranglers were selected, because he felt it had ossified British mathematics.  That might have been the case, but the Tripos produced more than just pure mathematicians: it delivered leaders in professions as diverse as the law, medicine, the church and politics, as well as actuaries.