Destructive Destruction? An Ecological Study of High Frequency Trading
Tue, 05 Feb 2013 09:17:23 GMT
By Inigo Wilkins & Bogdan Dragos
This article was originally published at MetaMute.org at http:/
How is High Frequency Trading’s drive to efficiency affecting market dynamics as a whole? In their analysis of the financial arms race, Inigo Wilkins and Bogdan Dragos find that, far from beating entropy, algorithmic trading simply redistributes it more unequally than ever.
What follows is an account of the concepts of information and noise as they apply to an analysis of high frequency trading according to ‘heterodox economics’. This account will highlight the evolutionary path that has led to the present microstructure of financial markets and allow for a diagnosis of the contemporary financial ecology. High Frequency Trading (HFT) is a subset of algorithmic trading which operates at very short time horizons (on the order of 100 milliseconds or less) and requires massive information processing capacities. Following recent developments such as flash crashes and various technical breakdowns, it is crucial to unpack the black box of algorithmic high-frequency trading in order to understand its potential impact on wider social and economic systems. This will entail the application of various scientific theories to the financial domain that extend classical models of reversible functions, and go beyond models based on efficiency and equilibrium, encompassing a much wider class of irreversible transformations.
According to heterodox economics, the development of thermodynamics brought an end to the dominance of classical physics in economic theory, in particular the dogma of the efficient markets hypothesis and of reversion to equilibrium. It is this significant theoretical upheaval that allows Nicholas Georgescu-Roegen to say that the law of entropy is the basis of the economy of life at all levels.[i] Writing in the turbulent 1970s, amidst the oil crisis, he came to see that classical economic theory could no longer provide an adequate model for addressing the huge tasks of third world underdevelopment, depletion of natural resources, increasing pollution, overpopulation, etc. In his attempt to deal with these issues, he recognised that the main barrier to repositioning economic theory on new grounds was its reliance on Newtonian mechanics. As the Midnight Notes Collective argue, Enlightenment thought was concomitant with a drive to the extraction of absolute surplus value during the first wave of real subsumption in the industrial revolution.[ii] Georgescu-Roegen advocated the replacement of this idealised paradigm with thermodynamics, whose second law (that the entropy of an isolated system tends to a maximum) would offer a much more fruitful theoretical foundation for economics. It should be noted at this point that although economic theory is even now still dominated by reversible models based on the supposition of efficiency and equilibrium at the abstract level, in practical terms thermodynamics had already significantly altered political economy through the 19th century preoccupation with exhaustion, leading to a 'science of work' that was concretised in the Taylorisation of the work-force after WW1.
The true novelty of Georgescu-Roegen's formulation lies in his proposal for a fourth law of thermodynamics, where it is not only energy that is subject to decreasing returns, but also matter: 'friction robs us of available matter'.[iii] He thus identifies an ultimate supply limit of low entropy matter-energy, a 'source of absolute scarcity' consisting of a terrestrial stock and a solar flow.[iv] This should not be understood as a reductionist account capable of explaining the causal structure of everything, but rather as the identification of an abstract functional schematic whose explanatory coherence may be supplemented or extended by further theoretical devices. In particular, although thermodynamics can help to describe the conditions of class struggle and the divergence between market valuation mechanisms and the actual value of resources, it cannot account for the lived experiences of the former, and offers no substantial critique of the latter. However, it does allow for the reinsertion of the economic process into much wider physical, chemical and biological processes. For if the entropy law operates at all levels, then one can understand the economic process as a continual exchange between low and high entropy, just as dissipative systems maintain coherence through the reduction of energy gradients. An energy gradient is a differential, such as that between hot and cold, or between disparate prices, whose value can be tapped through the application of work. This is a naturalised view of finance; however, it must be clearly stated that such a naturalisation does not entail a valorisation of present economic conditions. Rather, the economy, like the environment, exhibits a high degree of structural and functional redundancy, such that a great number of contingent modes of organisation are possible. Let us be clear here: to say that something is natural is not to say that it is good; after all, a tumour is natural. It is just to argue that it is subject to a materialist analysis, without claiming to exhaustively describe all its aspects. Moreover, we must not conflate biological and economic ecologies, but rather treat them in their specificity.
It is useful, at this point, to clarify the distinction between low and high entropy. For the purpose of elaborating an ecological economics, Georgescu-Roegen understands the economy as a process that transforms available free energy into unavailable bound energy, that is to say the exploitation of a gradient. The former may be understood both as specific concentrations of material-energetic structures, such as oil or gold, and the potentiality for value extraction offered by living labour; while the latter is exemplified by waste, pollution, highly diffuse forms of matter-energy such as heat, and those forms of dead labour that no longer afford value extraction.
The economic process is the modulation by which a certain dissipative system maintains itself by continually ‘inputting’ free energy and ‘outputting’ bound energy. This entails a local growth of efficiency, or increase in the throughput of energy, that evolves according to the maximum entropy principle (MaxEnt), where the entropy of the microstates that do not correspond to the successful application of a function or technology are maximised such that the energetic cost is minimised for a given utility.[v] This local reduction of entropy is ‘observer dependent’, however it also necessarily results in an increase in ‘observer independent’ entropy according to the maximum entropy production principle (MEPP).[vi] Effectively this means that biological, technical and economic evolution all lead inevitably towards an amplification of entropy at the environmental level. Nevertheless there is a high degree of contingency that determines the rate of throughput.
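The maximum entropy principle invoked here can be stated compactly. As a sketch of the standard Jaynesian formalism (not specific to the financial application): given a normalisation constraint and a constraint on the mean of some quantity, the least-biased probability distribution is the one maximising informational entropy,

```latex
% Maximise informational entropy subject to normalisation and a mean constraint
\max_{p}\; H(p) = -\sum_i p_i \ln p_i
\quad \text{s.t.} \quad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle
% Lagrange multipliers yield the Gibbs distribution
p_i = \frac{e^{-\lambda E_i}}{Z}, \qquad Z = \sum_i e^{-\lambda E_i}
```

where the multiplier λ is fixed by the constraint on ⟨E⟩. It is this distribution that maximises the entropy of the microstates compatible with what is known, minimising the cost assigned to everything else.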
Within the field of evolutionary economics the notions of energy and information gradients become essential in understanding the dynamics of socio-economic change. In this sense, a certain abstract evolutionary matrix is common to all open systems, whether physical, chemical, biological or socio-economic.
If there is an energy gradient available, a simple dissipative structure will exploit it. Similarly, if there is an information/knowledge gradient available, a socio-economic structure will grow and develop around this continual process of reduction.[vii]
Ever since Friedrich Hayek and Eugene Fama, information has been a crucial vector in understanding financial markets. For Hayek, markets are a way of collecting and aggregating available information. Fama understood efficient markets as reacting instantly to new information, thereby unproblematically reflecting all available information. Samuelson offered a final reinforcement of the efficient market hypothesis (EMH) by showing that stock prices follow a random walk. His article ‘Proof that Properly Anticipated Prices Fluctuate Randomly’ attempted to show that price changes are unpredictable and random if the market is truly efficient.[viii] As more and more participants enter the market, incoming information is incorporated at a faster pace. Furthermore, differences in planning horizons, and between strategies and expectations, are neglected. While we may not agree with the wider premises and conclusions of these economists, it is important to understand economic systems as collective calculating devices that compute transient equilibria.[ix] Balancing economic, computational and thermodynamic perspectives, markets may be defined as dissipative structures coping in an entropic/noisy environment by reducing both energetic and information gradients. This becomes particularly apparent in modern capitalist economies. The current swarm of financial actors, including human, non-human and hybrid systems, feeds off a social production of knowledge and its informational friction. An evolutionary process of variation, coordination and selection leads to differential levels of fitness and to huge asymmetries in terms of collecting and processing information, and hence to the creation of increasingly complex structures with higher rates of change. Such systems are characterised by non-linear risk situations featuring high interconnectivity and super-spreaders that amplify contagion.
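Samuelson's random-walk claim is easy to illustrate. The following minimal sketch (our illustration, not Samuelson's own model; the function name is hypothetical) simulates a martingale price path in which each change is an independent 'news' shock, so past prices carry no predictive information about future ones:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def random_walk_prices(n_steps, start=100.0, vol=0.01):
    """Simulate a martingale log-price path: each step's relative change
    is an i.i.d. Gaussian shock, independent of the entire price history."""
    price = start
    path = [price]
    for _ in range(n_steps):
        shock = random.gauss(0.0, vol)  # arrival of new, unanticipated information
        price *= (1.0 + shock)
        path.append(price)
    return path

path = random_walk_prices(1000)
# Under this model the best forecast of tomorrow's price is today's price:
# E[P(t+1) | history up to t] = P(t), which is Samuelson's point.
```

In an efficient market, any predictable component would already have been traded away, leaving only the unpredictable shocks.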
In this sense, the main activity of finance is the bearing of uncertainty, or more precisely the reduction of an energetic and information gradient, fuelled by the ever-growing heterogeneity of the market.[x] Financial actors are not only intermediaries between producers and users of information; they also 'assume a hermeneutic function' of performative interpretation, and moreover occupy the point of overlap between an information network and a liquidity network.[xi] Maintaining itself at that particular juncture allows the financial intermediary to access and reduce a very steep energy/information gradient. The investment bank therefore sits at the nexus of an informal marketplace for price-relevant information.[xii] A recent paper by Evangelos Benos and Satchit Sagade has shown that noise is a crucial concept in understanding high frequency trading.[xiii] They differentiate between some HFTs that tend to trade more passively, thereby providing liquidity, and others who primarily trade aggressively, removing liquidity. More passive HFTs are in fact engaging in market making, as buys are followed by sells and sells by buys. More aggressive HFTs seem to be following trends: buys are followed by buys and sells are followed by sells.[xiv] The authors also follow a distinction between 'good' volatility, due to the integration of new information, and 'excessive' volatility, where there is a production of noise not related to fundamentals. What they find is that HFTs exhibit up to 30% higher information throughputs; that is to say, they contribute a large amount of both 'good' and 'excessive' volatility, more so than the average, non-HFT, trader.[xv] In fact, all types of HFT strategies trade more intensively when there is higher volatility: if higher volatility is suggestive of a higher rate of information arrival, HFTs may be trading proportionally more by exploiting their speed advantage.[xvi]
HFTs are in a sense a maximization of the market’s primary function of information processing. They can significantly amplify both price discovery and noise;[xvii] the destructive effects of such noise should be investigated from a social welfare perspective. Noise production is inherent to HFTs’ necessity to end the day flat, i.e. not carrying overnight positions. More efficient price discovery affords greater liquidity provision, but its cost is a higher level of ‘excessive’ noise.[xviii] This echoes the opinions of evolutionary and ecological perspectives on algorithmic trading.[xix] In this view, algorithms have both a helpful and a harmful effect on price discovery. They help to efficiently incorporate new information, but they can also exacerbate short-term trends, leading to noisier prices. The growing interaction between heterogeneous actors produces both noise and relevant information. Noise is the normal ‘environment’ of finance, both as a basic resource and as a toxic byproduct. Market-making HFTs reduce transaction friction and open up the system to new inputs of capital from investors, which generates additional noise/information gradients. This can be understood as the energy potential of the market, where the profit made by HFT is energy extracted from the market.[xx] Our argument is that high frequency traders (HFTs) are complex socio-technical systems that thrive both through the production of noise and through the reduction of information gradients, operating at a high rate of throughput and offsetting noise/high-entropy to the wider financial ecology. In order to explain these claims it is necessary to briefly chart the evolution of computing within finance and the subsequent appearance of algorithmic trading. From carrier pigeons and the transatlantic telegraph cable to contemporary ICT, finance has always been a site of intensive technical innovation.
This is no surprise, inasmuch as financial actors thrive by accessing and reducing information gradients and exploiting communication inefficiencies.
More recently, the shift from open-outcry face-to-face trading to automated electronic trading has represented a huge leap in efficiency and the reduction of transaction costs. However, even Milton Friedman was aware that there is an 'intrinsic paradox built into the assumption of efficient markets': since efficiency is maintained by the detection of inefficiencies, the closer the market comes to absolute efficiency the fewer inefficiencies can be discovered, so the market can never achieve absolute efficiency.[xxi] There is thus a complex dialectical interplay between drives to market efficiency and inefficiency. As the market approaches efficiency, there are fewer opportunities for arbitrage by informed traders (who gauge the discrepancy between the current price and the fundamental value of the underlying asset), and uninformed 'noise traders' progressively dominate the market.[xxii] This inevitably leads to the inflation of bubbles, with the subsequent collapse to fundamental values (when it is not brought on through market manipulation) occurring in an entirely unpredictable manner. The market thus oscillates asymptotically around the attractor of zero information friction in an incomputably random orbit. While this movement receives its impetus from the dialectical, or apparently co-constitutive, relation between efficiency and inefficiency, its trajectory and effects are far from reversible, resulting in the non-dialectical destruction of whole swathes of economic actors largely at the base of steep energy gradients.[xxiii] Witness the wave of repossessions following the sub-prime mortgage crisis, or the asymmetric distribution of debt organised by the austerity regime, a point made by Evan Calder Williams following Bordiga's description of capital as 'Murderer of the Dead'.[xxiv]
Ever since the mid-’80s, there has been an incredible growth in the adoption of ICT and algorithms in the marketplace. From the computer terminals that simply assisted human traders to Electronic Communication Networks (ECNs), we have seen the emergence of a new financial ecosystem, a highly complex computational matrix.[xxv] ECNs represent a major revolution in financial markets inasmuch as they contributed to a framework that favoured impersonal efficiency, automation, transparency and higher speeds of execution. Electronic technologies have altered the way in which exchanges, brokers and dealers participate in price discovery.[xxvi] Thanks to ECNs, trading can largely be done without human intervention, in an anonymous manner and with considerably lower trading costs. All of these innovations have attracted new clients and investors, maximizing the information throughput of capital markets. Electronic liquidity has outcompeted previous traditional forms of liquidity provision. Algorithms are no longer mere tools: they are active in analysing economic data, translating it into relevant information and producing trading orders.[xxvii] This transition represents a new phase of real subsumption affecting all economic actors and social conditions. That is, if labour relations were reorganised around mechanics in the industrial revolution, and around thermodynamics and cybernetics in the last two centuries, the current phase of real subsumption may be understood according to contemporary scientific transformations. This is often called the 'nano-bio-info-cogno revolution', and is based on distributed networks and ‘friction free’ systems (i.e. superconductors, ubiquitous computing). However, the importance of Georgescu-Roegen lies in his assertion that no such friction-free economy is possible, since the drive to efficiency is limited by the absolute scarcity of low entropy resources and met with a corresponding increase of exhaustion or resistance issuing from labour power.
Neil Johnson et al. identify a ‘robot phase transition’ after 2006 where the sub-millisecond speed and massive quantity of robot-robot interactions exceeds the capacity of human-robot interactions. They argue that operating at such timescales is intrinsically unstable and 'characterized by frequent black swan events with ultrafast durations'.[xxviii] While Nassim Taleb's black swan theory is contentious, the conceptual core may be subtracted from his wider project, and refers to high-impact real contingencies as opposed to the structured randomness that casinos and quantum physics display. Analogous to the well-known effect in systems engineering where small cracks in a fuselage build up to a breaking threshold, financial friction is so high that micro-fractures in the form of mini flash-crashes proliferate, threatening the whole ecology.[xxix] Moreover, through the logic of encapsulated coding they employ, algorithmic trading software platforms are intrinsically open to abusive practices, and represent highly opaque and consequently 'unworkable interfaces'.[xxx]
In order to address the topic of HFT rigorously, we must not conflate the material specificities that define its heterogeneity; distinctions must be made between electronic, program and algorithmic trading, where HFT is a heterogeneous subset of all three.[xxxi] Electronic trading is the widest term, designating all forms of trading in which transmission is done electronically as opposed to by phone, mail or face to face. A series of technological and regulatory innovations have led to the emergence of new platforms and new electronic liquidity suppliers, but also of new types of market orders. Algorithmic trading is a portion of electronic trading, entailing a higher degree of automation of both execution and decision-making, and leading to the optimization of buy and sell orders.[xxxii] Low-latency trading refers to the fastest possible execution and routing of orders, irrespective of trading volume or frequency of trading; this usually entails colocation in an exchange data center. High frequency trading is characterized by a high turnover of capital and a large number of orders in short time intervals. There are also systematic algorithms that perform market making functions, and exchange algorithms that handle order routing.[xxxiii] All of this bears witness to a substantial change in the way trading is done today. In any case, ‘the universe of computer algorithms is best understood as a complex ecology of highly specialized, highly diverse, and strongly interacting agents.’[xxxiv] Within this line of technical and financial innovations, we can see various types of trading strategies that employ an equally diverse population of market order types. Further, from an ecological perspective one can distinguish between various execution algorithms, but also ‘predatory’ ones. In other words, we can distinguish between agency algorithms and proprietary algorithms.
The first type relates mostly to buy-side investors trying to minimize the market impact of their trades. The second type of algos is used by market makers, hedge funds and various prop desks.[xxxv] Beyond this broad differentiation, it is possible to produce further detailed taxonomies, depending on strategies, time horizons, etc.:
Strategies, markets and regulations co-evolve in competitive, symbiotic or predator-prey relationships as technology and the economy change in the background.[xxxvi]
For example, pairs trading strategies (whose computational costs are so high that they only took off after the ’80s ICT revolution) unilaterally feed on the predictable price reversals engendered by portfolio balancing, just as short-term strategies prey on their long-term counterparts to the point of extinction.[xxxvii] Certain ‘species’ try to execute a trade efficiently, so as to achieve minimal market impact. They split large orders into smaller packets and execute them at certain time intervals. More evolved versions, like ‘volume-weighted average price’ (VWAP) algorithms, employ complex randomisation functions coupled with econometrics to optimise order sizes and execution times depending on overall trading volumes.[xxxviii]
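The order-splitting logic behind VWAP execution can be sketched in a few lines. This is a toy illustration with a hypothetical volume profile; real implementations add the randomisation and live volume forecasting mentioned above:

```python
def vwap_schedule(total_shares, volume_profile):
    """Split a parent order into child orders sized in proportion to the
    expected share of daily volume traded in each time interval."""
    total_volume = sum(volume_profile)
    schedule = [round(total_shares * v / total_volume) for v in volume_profile]
    # Correct any rounding drift so the child orders sum exactly to the parent.
    schedule[-1] += total_shares - sum(schedule)
    return schedule

# Hypothetical U-shaped intraday volume profile (heavy at the open and close).
profile = [30, 15, 10, 10, 15, 20]
print(vwap_schedule(100_000, profile))
# → [30000, 15000, 10000, 10000, 15000, 20000]
```

By trading in proportion to expected volume, each child order represents roughly the same fraction of the market in its interval, which is what minimises the order's footprint.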
The arms race between liquidity providers and removers follows the co-evolutionary pattern of crypsis, whereby an animal avoids detection by other animals. Predatory and anti-predatory strategies constantly coevolve as they try to keep up with each other; a prime example is the four billion year old war between viruses and cellular organisms. Moreover, new ecological niches known as 'dark pools' have emerged in order to obfuscate the execution of large orders.[xxxix] Dark pools are one such adaptation inasmuch as they offer a ‘safe haven’ for large institutional and smaller retail investors: they attract retail order flow from brokers and exclude well-informed traders, which allows large traders to get a better deal.[xl] There are other types that try to profit from identifying and anticipating such trades, the algorithms sometimes referred to as ‘predatory’.[xli] Perhaps the best example of such a frequency-dependent evolutionary path, one that is well known for its compulsive non-adaptive drive, is the proliferation of low-latency algorithms that profit from the transmission speed differentials inherent in the geography of the globally integrated financial system, and the material transformations these informational relations entail.[xlii]
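The predator-prey dynamic invoked throughout this section is classically captured by the Lotka-Volterra equations. A minimal discrete sketch follows; this is an analogy only, with arbitrary parameter values, where `prey` and `pred` could stand for liquidity-providing and liquidity-taking strategies:

```python
def lotka_volterra(prey, pred, steps, a=0.1, b=0.002, c=0.0002, d=0.1, dt=1.0):
    """Forward-Euler steps of the classic predator-prey model:
    prey grow at rate a and are consumed at rate b*prey*pred;
    predators convert prey into growth (c) and die off at rate d."""
    history = [(prey, pred)]
    for _ in range(steps):
        dprey = a * prey - b * prey * pred
        dpred = c * prey * pred - d * pred
        prey += dt * dprey
        pred += dt * dpred
        history.append((prey, pred))
    return history

# Away from the equilibrium at (d/c, a/b) = (500, 50) the populations cycle:
# neither strategy eliminates the other, but neither settles down either.
trajectory = lotka_volterra(600.0, 40.0, 200)
```

The persistent cycling, rather than convergence to equilibrium, is the formal analogue of the arms race described above.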
Such strategies of camouflage, mimesis and deception are endemic in predator-prey relationships, fuelling a runaway propagation of non-adaptive mutations according to the non-equilibrium dynamics of the 'Red Queen Effect', and are modelled in evolutionary game theory as crypsis.[xliii] In his discussion of the pathological tendencies of technological capitalism, Ray Brassier cites Roger Caillois' investigation of thanatropic mimicry, pointing out that such effects are irreducible to equilibrium models of dialectical resolution, and may often terminate in non-dialectical self-destruction.[xliv] ‘In mimicking their own food,’ Brassier writes, ‘leaf insects such as the Phyllium frequently end up devouring each other’.[xlv] He continues, some pages later:
Enlightenment consummates mimetic reversibility by converting thinking into algorithmic compulsion: the inorganic miming of organic reason. Thus the artificialization of intelligence, the conversion of organic ends into technical means and vice versa, heralds the veritable realization of second nature […] in the irremediable form wherein purposeless intelligence supplants all reasonable ends.[xlvi]
Global finance can be seen as the staging ground for a continual redistribution of energy and information gradients; HFT is a prime example of this kind of evolutionary landscape. At a high enough level of liquidity, information friction and disparity allow for the emergence of computationally intensive systems that can effectively reduce gradients and extract 'rents'.[xlvii] While it is true that HFT accounts for a large part of market transactions, its profits are not the most significant among market participants. The generally accepted view is that the move towards electronic and algorithmic trading has reduced bid-ask spreads, decreased volatility and added to market depth,[xlviii] and that HFTs can largely be considered market makers,[xlix] supplying liquidity and achieving market efficiency. Nevertheless, there are others for whom the issue is still under debate.[l] What is important to remember is that HFT is basically a low risk, low reward activity, inasmuch as HFTs depend on returns from a large volume of small trades. HFTs have a small capital base and can’t accumulate large positions. They provide the crucial service of taking the opposite side of long-term investors' trades, matching orders that come to the market at different moments in time. For all these reasons, it can be argued that HFTs only take advantage of split-second opportunities, while long-term investors simply don't have the capacity for those kinds of opportunities. In the end, all of the ‘bigger’ actors tolerate low-latency trading firms because they provide much needed liquidity. Nevertheless, HFT exists because at certain volumes of trading they enjoy a systematic advantage, which is the result of a ‘technicality’ of trading that is opaque to outsiders.[li] They manage to ‘survive’ by exploiting information gradients that ‘slower’ market participants are unable to access.
Nanex: On ... Aug 5, 2011, we processed 1 trillion bytes of data ... This is insane. ... It is noise, subterfuge, manipulation. ... HFT is sucking the life blood out of the markets... [A]t the core, [HFT] is pure manipulation.[lii]
Such reactions might seem dramatic, but they testify to the intense struggle going on in the computational matrix of finance every day. An ecological perspective emphasises the complex interdependencies between different financial ‘species’. Every participant is constantly processing market noise in an attempt to reduce it as much as possible to relevant information. The subsequent decisions and market orders represent more noise for the other participants, that is to say, an irreversible output of high-entropy. As long as there is enough disparity and enough heterogeneity in the market, high-frequency traders can profit from the underlying friction and produce more noise. It is precisely this persistent inefficiency of markets that informs heterodox economics.
Because of bounded rationality, financial traders can’t do everything at once – they tend to specialize. These specialized traders interact with one another as they perform basic functions, such as getting liquidity, offsetting risks and making profits. A given activity can produce profits or losses for another activity. Inefficiencies play the role of food in biological systems, providing profit-making possibilities that support the ecology.[liii]
The interaction of heterogeneous actors with different time horizons and a variety of strategies produces the inefficiencies that make up an information gradient. Ecological economics understands the market as a food web, which can be described in terms of a gain matrix defining the interdependencies between different species. At the bottom there are the basal species – slaves, serfs, proletarians, free labour, consumers, savers, etc. These strata are preyed on by those further up the food chain – pension funds, insurance companies, mutual funds, banks; and they in turn feed more professional financial institutions, such as hedge funds, brokers, investment banks, proprietary trading HFTs, etc. Each financial actor exploits the inefficiencies of the prey species and in the process produces new inefficiencies, further increasing the information gradient. Within this complex ecology there is a gradual stabilisation of predator-prey relationships, but unlike an actual ecosystem, the financial system has a much higher rate of change, leading to more abrupt singular events like flash-crashes evolving according to an accelerated rate of punctuated equilibria, with multiple black swans and mass extinctions.[liv]
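The food-web picture can be made concrete with a toy gain matrix. All species names and payoff values below are hypothetical, chosen only to illustrate the zero-sum accounting between strata:

```python
# Rows and columns index financial 'species'; entry gain[i][j] is the net
# value species i extracts from interacting with species j (hypothetical
# payoffs; the matrix is antisymmetric, so every gain is another's loss).
species = ["savers", "pension funds", "banks", "hedge funds", "HFT desks"]
gain = [
    [ 0, -1, -2,  0,  0],   # savers feed the strata above them
    [ 1,  0, -1, -2, -1],
    [ 2,  1,  0, -1, -1],
    [ 0,  2,  1,  0, -1],
    [ 0,  1,  1,  1,  0],   # HFT desks extract from everyone they touch
]

# Net position of each species: row sums, since losses are already
# encoded as negative entries.
net = {s: sum(row) for s, row in zip(species, gain)}
for s in sorted(net, key=net.get):
    print(f"{s:>14}: {net[s]:+d}")
```

In this zero-sum snapshot the gains at the top of the chain exactly mirror the losses at the bottom; a real market adds and destroys value as well, but the predator-prey ordering of the strata is the point being illustrated.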
During the 2010 flash crash, the Dow Jones Industrial Average lost about 900 points in a few minutes, recovering most of that loss in the subsequent 15 minutes.[lv] To put things into perspective, this represents the wipeout of about $1 trillion in the space of minutes.[lvi] Following the media frenzy around this event, a variety of market actors rushed to offer explanations for such a one-sided ‘social’ decision to sell. Part of the explanation lies in a lack of regulatory circuit breakers that would have automatically suspended the free-fall following the abnormally edgy HFT reaction to the discovery of a large 'iceberg' order. From black swans and fat fingers to possible market abuses like quote stuffing (the production of noise in order to obtain a good position in the order book queue), it seems that the causal structure of such events is so complex and opaque that there will never be a definitive explanation. However, we may state with confidence that such occurrences are the kind of irreversible outputs that characterise the hyper-diversity of the contemporary socio-technical ecology. Both the SEC-CFTC (2010) report and the more recent Foresight review have shown that the impetus of the flash crash cannot be traced back to any firm engaged in HFT. Nevertheless, HFT strategies are the present culmination of a tendency towards efficiency of information throughput that inevitably ends up offsetting huge volumes of noise to the wider financial ecology. The question is not so much the good or bad intentions of HFT, but its impact on the resilience and robustness of the overall system.
The introduction of new species can disrupt this balance, destabilizing an ecology and causing irreversible changes, such as the introduction of rabbits into Australia. Similarly in markets many profit making strategies play an essential role by incorporating information into prices. It would certainly be a gross fallacy to presume that all trading strategies are beneficial; some strategies make substantial profits while having little or no social value, and may lead to negative social impact. Even strategies that normally add social value can be destabilizing.[lvii]
Following the sociology of information systems and risk, we could translate this as a result of exchanging high-frequency/low impact events for low-frequency/high impact ones, or as an exchange between low and high entropy.[lviii] In this sense, any increase in efficiency (throughput) of one part of the system ends up being dissipated to the rest of the system as noise.[lix] If HFT has any part to play in the flash crash, it is because it can be said to represent a real push for efficiency, but one that nevertheless produces unintended consequences for the rest of the financial ecology. In as much as it diminishes the risk of trading through higher matching speeds, HFT allows buyers and sellers to reduce their transaction costs considerably. But the reduction of risk is not actually a reduction as such, and must be understood as a redistribution, or a parametrisation of the fitness landscape of the financial ecology. HFT is just one element of market microstructure and its impact, either positive or negative, on the overall economy is limited. Nevertheless, it might help us understand broader tendencies within contemporary capitalist societies in terms of the distribution of risk and rewards. While the occupants of prime positions on the energy matrix loll around in a rich bath of 'liquidity', an increasing number are forced to pay for this exuberance with their jobs, their homes, and ultimately their lives. Ray Kurzweil's overzealous enthusiasm for the coming 'singularity', when human 'intelligence' is eclipsed by machines, appears willfully myopic when we witness the potential effects of the 'robot phase transition'.
Phenomena such as flash crashes are the inevitable outputs of a financial ecology that tends towards the non-linear emergence of noise saturation peaks. At such critical points of friction, something is bound to break, and this does not apply only to market crashes.[lx] The present financial ecology maintains an unsustainable rate of throughput and a thanatropic mode of crypsis in the proliferation of strategies for digital subterfuge. In order to address the critical situation of contemporary finance, several traditional beliefs must be overcome: trust in the absolute efficacy of competitive market mechanisms for computing equilibria, such as the valuation of natural resources and labour; confidence in the capacity of finance to self-regulate, or in regulation as merely a question of discovering the right mechanisms for stabilisation; and faith in the doctrine of sustainable development, which denies the fourth law of thermodynamics. What HFT teaches us is that although finance tends towards higher efficiency, it can never maintain these states, since it feeds off the noise created by information asymmetries and structural inequality, and it aggressively maintains these disparities in order to extract value from the resulting 'ecological' niches.
[i] Heterodox economics comprises behavioral economics, thermo-economics, bio-economics, evolutionary economics and ecological economics; Nicholas Georgescu-Roegen, The Entropy Law and the Economic Process, Cambridge, Massachusetts: Harvard University Press, 1971, p.4.
[ii] Midnight Notes Collective (George Caffentzis, Monty Neill, Hans Widmer, John Willshire), ‘The Work/Energy Crisis and the Apocalypse’, Midnight Notes, Vol. II, #1, 1980.
[iii] Nicholas Georgescu-Roegen, ‘Energy Analysis and Economic Valuation’, Southern Economic Journal, 1979, 45, 4, p.1033.
[iv] Paul Burkett, Marxism and Ecological Economics: Toward a Red and Green Political Economy, Brill, 2006. p.145.
[v] The maximum entropy principle is the prime doctrine of Bayesian probability theory, which states that 'the probability distribution which best represents the current state of knowledge is the one with largest information-theoretical entropy.' http:/
[vii] Stanley Metcalfe and John Foster, Evolution and Economic Complexity, Edward Elgar Publishing, 2007, and Economic Emergence: an Evolutionary Economic Perspective, Max Planck Institute of Economics Jena, Evolutionary Economics Group, # 1112, 2011. This statement should not be taken dogmatically, however, since as Ostrom demonstrates there are diverse ways in which collective self-organisation can govern common-pool resource problems, effectively reducing or stopping gradients from being tapped at a rate that ends in a 'tragedy of the commons'. Elinor Ostrom, Governing the Commons: The Evolution of Institutions for Collective Action, Cambridge University Press, 2008.
[viii] J Doyne Farmer and Andrew Lo, “Frontiers of Finance: Evolution and Efficient Markets”, Proc. Nat. Acad. Sci.,1999, 96, p.9991.
[ix] Michel Callon and Fabian Muniesa, ‘Les marchés économiques comme dispositifs collectifs de calcul’, Réseaux 21(122), 2003, pp.189-233.
[x] Alan Morrison and William Wilhelm Jr, Investment Banking: Institutions, Politics, and Law, Oxford: Oxford University Press, 2nd revised edition, 2008, p.4.
[xi] Laurence Gialdini and Marc Lenglet, Financial Intermediaries in an Era of Disintermediation: European Brokerage Firms in a MiFID Context, 2010, p.23. Available at SSRN: http:/
[xii] Alan Morrison et. al., op. cit., p. 72.
[xiii] Evangelos Benos and Satchit Sagade, High-frequency trading behaviour and its impact on market quality: evidence from the UK equity market, Dec 2012, Working Paper No. 469, Bank of England.
[xiv] Ibid., p.3
[xv] Ibid., p.4
[xvi] Ibid., p.5
[xvii] Ibid., p.6
[xviii] Ibid., p.23
[xix] J. Doyne Farmer and Spyros Skouras, ‘An Ecological Perspective on the Future of Computer Trading’, The Future of Computer Trading in Financial Markets, UK Foresight Driver Review – DR6, 2011, p.8.
[xx] Ramazan Gencay, Michel Dacorogna, Ulrich Muller, Oliver Pictet, Richard Olsen, An Introduction to High-Frequency Finance, 2001, Academic Press, p.353.
[xxi] J. Doyne Farmer and Spyros Skouras, ‘An Ecological Perspective on the Future of Computer Trading’, The Future of Computer Trading in Financial Markets, UK Foresight Driver Review – DR6, 2011, p.12.
[xxii] Andrei Shleifer and Lawrence Summers, ‘The Noise Trader Approach to Finance’, Journal of Economic Perspectives, Volume 4, Number 2, 1990, pp.19-33.
[xxiii] Despite the dialectic of efficiency and inefficiency there is a general trend toward efficiency, indexed by the fall in bid-ask spreads. James Angel, Lawrence Harris, and Chester S. Spatt, ‘Equity Trading in the 21st Century’, Marshall School of Business Working Paper No. FBE 09-10, 2010. Available at SSRN: http:/
[xxiv] Evan Calder Williams, Combined and Uneven Apocalypse, Zero Books, 2011, p.188; Amadeo Bordiga argues that capital functions not just through the 'creative destruction' that Schumpeter identifies, but also through a 'destructive destruction' necessitated by the build-up of dead labour. Amadeo Bordiga, 'Murder of the Dead', Battaglia Comunista, No. 24, 1951; http:/
[xxv] Marc Lenglet, ‘Conflicting Codes and Codings: How Algorithmic Trading is Reshaping Financial Regulation’, Theory, Culture & Society November 2011, 28: 44-66, p.2; Fabian Muniesa, Des marchés comme algorithmes: sociologie de la cotation électronique à la Bourse de Paris, Thèse de doctorat (PhD Thesis), Ecole des Mines de Paris, 2003.
[xxvi] James Angel, Lawrence Harris, and Chester S. Spatt, ‘Equity Trading in the 21st Century’, Marshall School of Business Working Paper No. FBE 09-10, 2010, p.2. Available at SSRN: http:/
[xxvii] Ibid., p.3
[xxviii] Neil Johnson, Guannan Zhao, Eric Hunsader, Jing Meng, Amith Ravindar, Spencer Carran and Brian Tivnan, ‘Financial black swans driven by ultrafast machine ecology’, arXiv, 7 February 2012.
[xxix] Didier Sornette, Why Stock Markets Crash, Princeton University Press, 2003.
[xxx] Michel Callon and Fabian Muniesa,’Les marchés économiques comme dispositifs collectifs de calcul’, Réseaux 21(122), 2003 or ‘Economic Markets as Calculative Collective Devices’, Organization Studies, 26(8), 2005, p.1236; Alexander R. Galloway, The Interface Effect, Polity, 2012. pp. 25-54.
[xxxi] Aldridge offers a broad description of different ‘algorithmic’ classes into electronic, algorithmic, systematic, high-frequency, low-latency, market making, etc. Irene Aldridge, 'The Evolution of Algorithmic Classes', The Future of Computer Trading in Financial Markets, UK Foresight Driver Review – DR6, 2011, p. 4.
[xxxii] Irene Aldridge, High-Frequency Trading: A Practical Guide to Algorithmic Strategies and Trading Systems, 2010, John Wiley & Sons
[xxxiii] Irene Aldridge, 'The Evolution of Algorithmic Classes', The Future of Computer Trading in Financial Markets, UK Foresight Driver Review – DR6, 2011
[xxxiv] J. Doyne Farmer, Spyros Skouras, An ecological perspective on the future of computer trading, p.6
[xxxv] Anton Golub, Overview of High Frequency Trading, Manchester Business School, April 15, 2011
[xxxvi] J. Doyne Farmer, Spyros Skouras, An ecological perspective on the future of computer trading, p.6
[xxxvii] Ibid., p.16
[xxxviii] Donald MacKenzie, Daniel Beunza, Yuval Millo, Juan Pablo Pardo-Guerra, Drilling Through the Allegheny Mountains: Liquidity, Materiality and High-Frequency Trading, 2012, p.9, http:/
[xxxix] James Angel, Lawrence Harris, Chester S. Spatt, Equity Trading in the 21st Century, Marshall School of Business Working Paper No. FBE 09-10, 2010, http:/
[xl] Ibid., p.35.
[xli] Themis Trading 2008, 2009
[xlii] Donald MacKenzie, op. cit.
[xliii] U. Dieckmanna, P. Marrow, R. Law, 'Evolutionary cycling in predator-prey interactions: population dynamics and the red queen' Journal of Theoretical Biology, Volume 176, Issue 1, 7 September 1995, pp.91–102; G. D. Ruxton, T.N. Sherratt & M.P. Speed, 'Avoiding Attack: The Evolutionary Ecology of Crypsis, Warning Signals and Mimicry'. Oxford University Press, 2004.
[xliv] Ray Brassier, Nihil Unbound: Enlightenment and Extinction, Palgrave Macmillan, 2007, p.43.
[xlvi] Ibid., p.47
[xlvii] Foresight: The Future of Computer Trading in Financial Markets (2012) Final Project Report. The Government Office for Science, London
[xlviii] Jonathan Brogaard, High Frequency Trading and Its Impact on Market Quality, 2010; Joel Hasbrouck and Gideon Saar, Low-Latency Trading, December 2012, Johnson School Research Paper Series No. 35-2010; AFA 2012 Chicago Meetings Paper. Available at SSRN: http:/
[xlix] Terrence Hendershott and Ryan Riordan, Algorithmic Trading and Information, 2009; Albert J. Menkveld, High Frequency Trading and the New-Market Makers, February 6, 2012, EFA 2011 Paper; AFA 2012 Paper. Available at SSRN: http:/
[l] Andrei A. Kirilenko, Albert S. Kyle, Mehrdad Samadi and Tugkan Tuzun, The Flash Crash: The Impact of High Frequency Trading on an Electronic Market, May 26, 2011. Available at SSRN: http:/
[li] Donald MacKenzie, op. cit., p.20
[lii] Ibid., p.18
[liii] J. Doyne Farmer, op. cit., p.6.
[lv] On May 6th 2010, the US stock market experienced one of the most severe price drops in its history: the Dow Jones Industrial Average (DJIA) index dropped almost 9% from the beginning of the day, the second largest point swing (1,010.14 points) and the biggest one-day point decline (998.5 points) on an intraday basis.
The opinions and writing contained in this article are of the author alone and do not necessarily represent those of HFTReview.com.