Tuesday 29 April 2014

When being wrong is right

I attended the Mathematical Cultures workshop/conference at the London Mathematical Society at the start of the month, where I presented 'Is fairness a mathematical concept?' in the context of how the culture of mathematics contributed to the Financial Crisis.  Specifically, I began my talk with this quote from the UK's Parliamentary Commission on Banking Standards Report (Volume 2)
89. The Basel II international capital requirements regime allowed banks granted “advanced status” by the regulator to use internal mathematical models to calculate the risk weightings of assets on their balance sheets. Andy Haldane described this as being equivalent to allowing banks to mark their own examination papers.  A fog of complexity enabled banks to con regulators about their risk exposures:
[...] unnecessary complexity is a recipe for […] ripping off […], in the pulling of the wool over the eyes of the regulators about how much risk is actually on the balance sheet, through complex models
 (my emphasis) 
and I ended it with this clip from a 1999 BBC science programme on Black-Scholes-Merton and the collapse of LTCM (the speaker is Paul Samuelson; the clip starts at 12:20 in the original programme)


My observation is that there is a culture of mathematics being a magical language that unlocks the secrets of the universe.  This is captured in many popular representations of mathematics: in Darren Aronofsky's film Pi, which links maths, kabbalah and finance

the Da Vinci Code

Marcus Du Sautoy's The Code

even Simon Singh's book on the Simpsons

(for those who don't want to buy the Simpsons book, there is a long-standing website)

On Saturday, I listened to Bridget Kendall's show 'The Forum'.  Kendall is well known in the UK for having been the BBC's Moscow correspondent during the 1991 coup and rise of Boris Yeltsin, and then the BBC's Washington correspondent.  Less well known is that she is the daughter of Britain's most important probabilist of the twentieth century, David Kendall, and that her brother is a probabilist at Warwick - I suspect she has a good grasp of mathematics.

Kendall begins the programme by asking Max Tegmark, an MIT cosmologist, about the idea of "maths as a kind of key" in this excerpt starting at 2:35 of the programme.  Tegmark is committed to mathematical realism and believes that mathematics enables us to "ultimately predict the future" (at 0:55 in the excerpt, 3:30 in the original).

I see a strong connection between Haldane's criticism of the use of maths in finance and Samuelson's and Tegmark's mathematical realism.  To paraphrase William Tait's description of mathematical realism:
A mathematical proposition is about a certain structure, financial markets. It refers to prices and relations among them. If it is true, it is so in virtue of a certain fact about this structure. And this fact may obtain even if we do not or cannot know that it does.
The maths makes the financial theory true.

Following the meeting Heather Mendick shared a post where she contrasts Will Smith being constrained by the mathematical fact that 2+2=4,

with Orwell's presentation that "Freedom is the freedom to say that two plus two equals four".

Heather sees Smith's approach as being an example of "extreme neoliberalism" (I think she means extreme liberalism), but I argued that there is a connection between Orwell's "freedom" to claim 2+2=4 in the context of empiricism and Will Smith's (I assume metaphorical) rejection of 2+2=4 as a rejection of imposed authority.

What struck me when listening to The Forum was how the discussion developed immediately after the excerpt above.  In this excerpt, starting at 3:41, Kendall asks Tegmark to explain how accepting the rules of mathematics leads to a belief in parallel universes.  I have recently shared my views on multiverses, and my belief is that they are a fiction resulting from an incomplete understanding of the mathematical physics.  Tegmark agrees that multiverses are a mathematical consequence, but his Platonism forces him to believe in them.  In this context, comparing Will Smith to Max Tegmark, who is wrong in rejecting mathematical truths?  We all recognise Smith is wrong, probably even Smith himself, but at the same time it is difficult to reject what Tegmark argues, despite the fact that Kendall comments that to the "uninitiated" it all looks like make-believe (at 2:30 in the excerpt, 6:11 in the original).

We label Smith as "ignorant" for presenting the claim that 2+2=5 in the context of advocating people reject the status quo, but we are expected to believe in multiverses.  Haldane's criticism (probably) referred to the London Whale, where, by presenting the mathematics in a particular way, traders at J.P. Morgan reduced the liability of a portfolio from $22 billion to $13 billion (I may be inaccurate here, but the order of magnitude is correct).  To the "uninitiated" it all looked like make-believe, but, like multiverses, it carries the authority of mathematics.

Something that has always appealed to me about science is its iconoclasm: science is right because it is happy to destroy its icons and move on.  We criticise Smith but venerate Galileo for rejecting the Ptolemaic system, despite the fact that his Dialogue is spectacularly wrong on the tides (e.g. Aiton, Brown).  Galileo was wrong, as the Inquisition realised, but he was right.

If economics is real in the Platonic sense, and can be described by exact mathematical equations, it will eventually grind to a halt.  This was the concern of many economists in the first half of the twentieth century: Knight, Keynes and the Austrians all agreed on the significance of uncertainty and unpredictability in determining economic affairs.  The organiser of the Mathematical Cultures conference, the philosopher Brendan Larvor, remarked
a response was
My response is that Piketty, a French economist, is mathematically sophisticated, but in a different way to the realists/Platonists like Samuelson and Tegmark.  My disquieting suggestion is that in order to prevent Financial Crises we really need to examine how tenable mathematical realism, and the science built upon it, is.

PS A Tweet from a French mathematician quoting another section of the Guardian

Friday 4 April 2014

The Legitimacy of High Frequency Trading

Mark Thoma brought my attention to a post by Dean Baker, High Speed Trading and Slow-Witted Economic Policy.  High Frequency Trading, or more generically Computer Based Trading, is proving problematic because it is a general term covering a variety of different techniques, some of which appear uncontroversial while others appear very dubious.

For example, a technique I would consider legitimate derives from Robert Almgren and Neil Chriss's work on optimal order execution: how do you structure a large trade so that it has minimal negative price impact and low transaction costs?  There are firms that now specialise in performing these trades on behalf of institutions, and I don't think there is an issue with how they innovate in order to generate profits.
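The flavour of the Almgren-Chriss result can be sketched in a few lines: with linear temporary price impact and a risk-averse trader, the optimal holdings decay along a sinh curve, so the sale is front-loaded.  The function below is a minimal illustration of that closed-form trajectory; all the parameter values are my own invented numbers, not calibrated to any real market.

```python
import math

def almgren_chriss_schedule(X, T, N, sigma, eta, lam):
    """Continuous-time Almgren-Chriss liquidation trajectory (a sketch).

    X     : shares to sell
    T     : liquidation horizon
    N     : number of trading intervals
    sigma : price volatility
    eta   : temporary (linear) impact coefficient
    lam   : trader's risk aversion

    Returns the optimal holdings x_k at times t_k = k*T/N.
    """
    kappa = math.sqrt(lam * sigma ** 2 / eta)  # 'urgency' of the liquidation
    tau = T / N
    # Holdings decay along sinh(kappa*(T - t)) / sinh(kappa*T):
    # steep at first (trade fast early), flattening towards zero at T.
    return [X * math.sinh(kappa * (T - k * tau)) / math.sinh(kappa * T)
            for k in range(N + 1)]

# Illustrative only: sell 1,000,000 shares over 5 days in 10 slices.
holdings = almgren_chriss_schedule(X=1_000_000, T=5, N=10,
                                   sigma=0.3, eta=1e-6, lam=1e-6)
```

The higher the risk aversion `lam`, the larger `kappa` and the faster the trader sells; as `lam` tends to zero the schedule approaches a straight line (equal-sized slices).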

The technique that is most widely regarded as illegitimate is order, or quote, stuffing.  The technique involves placing orders and, within a tenth of a second or less, cancelling them if they are not executed.  I suspect this is the process that Baker refers to that enables HFTs to 'front run' the market.  Baker regards the process as illegitimate with the argument that
The issue here is that people are earning large amounts of money by using sophisticated computers to beat the market. This is effectively a form of insider trading. Pure insider trading, for example trading based on the CEO giving advance knowledge of better than expected profits, is illegal. The reason is that it rewards people for doing nothing productive at the expense of honest investors.
 On the other hand, there are people who make large amounts of money by doing good research to get ahead of the market. ... The gains to the economy may not in all cases be equal to the private gains to these traders, but at least they are providing some service.
By contrast, the front-running high speed trader, like the inside trader, is providing no information to the market. They are causing the price of stocks to adjust milliseconds more quickly than would otherwise be the case. It is implausible that this can provide any benefit to the economy. This is simply siphoning off money at the expense of other actors in the market.
The problem I have with Baker's argument is that I do not think it is robust.  It starts by suggesting a link between insider trading and HFT.  I don't think this holds up.  When a trade is placed on an exchange, it becomes public information.  The HFTs are making their profits by responding more quickly to the information, not because they are working on private information.  Baker distinguishes one sort of 'research', traditional economic research, from another, novel research on computer networks and algorithms, and implies that traditional research has a legitimacy in market exchange that computer research does not.

Statements like "simply siphoning off money at the expense of other actors in the market" make me a bit uneasy because they create distinctions between 'legitimate' and 'illegitimate' activity without offering a clear basis for the distinction.  For me, the distinction Baker makes seems to rest on the intellectual background of the agents: economics versus computer science.  I worry that the foundation of Baker's criticism is an affinity with institutional investors and a distaste for small-scale entrepreneurs.

Baker's solution of "A modest tax on financial transactions [that] would make this sort of rapid trading unprofitable" is, if my basic economic understanding is correct, a standard way incumbents create barriers to new entrants.  Wall Street, according to Jonathan Levy's Freaks of Fortune, has at least a hundred-year tradition of lobbying legislatures to protect its interests, and I think we should be wary of whether Wall Street's interests are aligned with the broader public's.

The problem is somewhat more serious in the UK.  In 2012 the UK's Government Office for Science reviewed Computer Based Trading technologies and decided that, while acknowledging that order stuffing was dubious, they would not suggest inhibiting it.  The rationale was that the marketplace was a competitive arena and that traders would congregate at exchanges that enabled competition; i.e. for the UK to retain its position as a financial centre the UK government should not legislate on the issue.

The substantive question is whether I can come up with a more robust argument than Baker's, and I offer an argument at the bottom of this piece.

I have been critical of the Foresight report.  However, I have also been conscious that I could not coherently justify my objections to practices such as order stuffing.  This concern was related to my uneasiness about identifying the concept of reciprocity as being embedded in contemporary financial mathematics.  I come from a fairly orthodox background, and connecting mathematics and ethics has been a problem for me since I first identified a link around 2010.

For me, the intellectual resolution of the problem of linking mathematics and ethics comes from pragmatic philosophy.  Pragmatism is especially relevant to finance because it addresses the thorny issue of truth when we cannot rely on objectivity, neutrality and determinism, and because it acknowledges the role of ethics in science.  Specifically, by rejecting the ideology of the fact/value dichotomy, I claim that the principle of ‘no arbitrage’ in pricing contingent claims is infused with the moral concept of fairness.  This is all well and good, but the claim can be treated as a heuristic (as the Dutch Book argument is) or as a fact.  Based on the empirical evidence of the Ultimatum Game, I claim it is a fact that reciprocity is embedded in financial mathematics.  This raises the question of why reciprocity is important.
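The Dutch Book argument can be made concrete with a toy example: if a bookmaker quotes prices for a set of mutually exclusive, exhaustive outcomes whose implied probabilities do not sum to one, a bettor can lock in a risk-free profit, which is exactly the situation the no-arbitrage principle rules out.  The sketch below is illustrative only; the function name and the 0.45/0.45 quotes are my own invented numbers.

```python
def dutch_book_profit(prices, payout=1.0):
    """Buy one unit of a binary contract on every outcome of an event.

    Each contract pays `payout` if its outcome occurs; since the outcomes
    are mutually exclusive and exhaustive, exactly one contract pays off.
    If the quoted prices sum to less than the payout, the buyer makes a
    guaranteed profit whichever outcome occurs: a Dutch Book.
    A negative result means no such arbitrage exists.
    """
    cost = sum(prices)
    return payout - cost

# Incoherent quotes on a coin toss: 'heads' at 0.45 and 'tails' at 0.45.
# Buying both costs 0.90 and always pays 1.00, a sure profit of 0.10.
profit = dutch_book_profit([0.45, 0.45])

# Coherent quotes (implied probabilities sum to one) leave nothing on
# the table: this is the fairness embedded in 'no arbitrage'.
fair = dutch_book_profit([0.50, 0.50])
```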

As well as justifying the connection between ethics and mathematics, pragmatism provides an explanatory hypothesis.  One problem I grappled with was why the link between reciprocity and finance became obscured between the eighteenth and twenty-first centuries.  The explanation comes from the theories developed in Adorno and Horkheimer's Dialectic of Enlightenment and Polanyi's The Great Transformation, both published in 1944. The Dialectic claims that the Enlightenment led to the objectification of nature and its mathematisation, which in turn leads to ‘instrumental mindsets’ that seek to optimally achieve predetermined ends in the context of an underlying need to control external events.

Jürgen Habermas responded to the Dialectic in The Structural Transformation of the Public Sphere, where he argues that during the seventeenth and eighteenth centuries public spaces emerged, the public sphere, which facilitated rational discussion that sought the truth in support of the public good. In the nineteenth century mass circulation mechanisms came to dominate the public sphere, and these were controlled by private interests. As a consequence, the public became consumers of news and information rather than creators of a consensus through engagement with information. Having undertaken this analysis of the contemporary state of affairs, Habermas sought to describe how the ideal of the Enlightenment public sphere could be enacted in the more complex contemporary (pre-internet) society, and his Theory of Communicative Action was the result.

Central to Communicative Action is a rejection of the dominant philosophical paradigm, the ‘philosophy of consciousness’.  This paradigm is rooted in Cartesian dualism, the separation of mind and body, subject and object; it is characterised by foundationalism, the view that philosophy is required to demonstrate the validity of science and that the validity of science rests on empiricism; and it carries certain views specific to the social sciences, such as that society is built out of individuals (atoms) interacting, so that society is posterior to individuals, and that society (a material, extending the physical metaphor) can be studied as a unitary whole rather than as an aggregate of individuals.

The dominant paradigm sees language as being made up of statements that are either true or false, and complex statements are valid if they can be deduced from true primitive statements. This approach is exemplified in the standard mathematical technique of axiom-theorem-proof. Habermas replaces this paradigm with one that rests on a Pragmatic theory of meaning that shifts the focus from what language says (bears truth) to what it does. Specifically, Habermas sees the function of language as being to enable different people to come to a shared understanding and achieve a consensus; this is what he defines as discourse. Because discourse is based on making a claim, the claim being challenged and then justified, discourse needs to be governed by rules, or norms. The most basic rules are logical and semantic; on top of these are norms governing procedure, such as sincerity and accountability; and finally there are norms to ensure that discourse is not subject to coercion or skewed by inequality.

I have come to the conclusion that markets are centres of communicative action enabled by the language of mathematics.  In this framework reciprocity is a norm of communication, but it is not the only norm.  Habermas emphasises the importance of sincerity in communication in general, and the implication is that it is required in markets.

It is on this basis that I believe we can identify order stuffing as illegitimate: it is insincere.  The difference between optimal order execution strategies, which earn their computer scientist experts money, and order stuffing is that an HFT engaged in order stuffing is not "sincere" in issuing an order it immediately cancels.  The antidote is not to impose an additional cost on transactions, which would not affect institutional investors but might hinder legitimate speculation and innovation, but to regulate the timing of order cancellations: order stuffing would not be possible if orders had to remain on the book for a few minutes.
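A minimum resting time rule of the kind suggested above is simple to state mechanically: an exchange gateway timestamps each order on arrival and refuses any cancellation that comes too soon.  The toy class below is a sketch of that rule only; the class name, the half-second constant and the injectable clock are my own illustrative choices, not a description of any real exchange's implementation.

```python
import time

# Policy choice for the sketch: how long an order must rest before it
# may be cancelled.  The post suggests minutes; half a second is used
# here purely to keep the illustration small.
MIN_RESTING_TIME = 0.5  # seconds

class RestingTimeBook:
    """Toy limit-order gateway enforcing a minimum resting time."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock          # injectable for testing
        self._orders = {}            # order_id -> submission timestamp

    def submit(self, order_id):
        """Record the order and timestamp its arrival."""
        self._orders[order_id] = self._clock()

    def cancel(self, order_id):
        """Accept the cancel only if the order has rested long enough.

        Returns True if the order is removed, False if the cancel is
        refused because the order is too young: the 'insincere' quote
        of an order stuffer stays exposed on the book.
        """
        age = self._clock() - self._orders[order_id]
        if age < MIN_RESTING_TIME:
            return False
        del self._orders[order_id]
        return True
```

Under such a rule the rapid place-and-cancel cycle of quote stuffing becomes impossible, because every order carries real execution risk for as long as it is forced to rest, while patient participants are unaffected.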