Mon, 25 Jun 2012 01:42:00 GMT
This is the first part of a six-part series by Mike O’Hara, looking at the usage of FPGAs and other types of hardware acceleration in the financial trading ecosystem.
Although FPGA (Field Programmable Gate Array) technology has been around since the mid-1980s, it is only within the last two to three years that it has been adopted to any real extent in the financial markets.
Originally developed for the defence and telecoms industries, FPGAs are essentially computer chips containing logic blocks and interconnects that can be programmed “in the field” using a hardware description language (such as VHDL or Verilog) to perform a range of functions that would otherwise be performed in software.
Financial market participants who are latency sensitive, particularly high frequency trading firms, are interested in FPGAs for two reasons. First, FPGAs can speed up some processes by at least an order of magnitude versus software running on a general purpose CPU. Second, the deterministic latency that such hardware provides virtually eliminates the standard deviation, or jitter, that you would typically experience in the software world.
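To make the jitter point concrete, here is a minimal sketch of the measurement itself, with made-up numbers: jitter is simply the standard deviation of a set of latency samples, and it is this spread, rather than the mean, that hardware determinism collapses.

```cpp
// Illustrative only: mean and jitter (standard deviation) of latency
// samples. The nanosecond figures below are invented for the example.
#include <cmath>
#include <cstdio>
#include <utility>
#include <vector>

static std::pair<double, double> stats(const std::vector<double>& v) {
    double mean = 0.0;
    for (double x : v) mean += x;
    mean /= v.size();
    double var = 0.0;
    for (double x : v) var += (x - mean) * (x - mean);
    return {mean, std::sqrt(var / v.size())};
}

int main() {
    // Hypothetical wire-to-wire latencies (ns) for the same task done in
    // software on a general purpose CPU, and in FPGA hardware.
    std::vector<double> software_ns = {4200, 5100, 9800, 4400, 15300, 4700};
    std::vector<double> fpga_ns     = {781, 784, 782, 781, 785, 783};

    auto [sw_mean, sw_jit] = stats(software_ns);
    auto [hw_mean, hw_jit] = stats(fpga_ns);
    std::printf("software: mean %.0f ns, jitter %.0f ns\n", sw_mean, sw_jit);
    std::printf("fpga:     mean %.0f ns, jitter %.0f ns\n", hw_mean, hw_jit);
}
```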
Nick Ciarleglio, Systems Engineer and FSI Product Manager at Arista Networks, explains that it is not just proprietary HFT firms who are investigating and adopting this technology.
“If you break the market into four tiers of participants: end users; broker-dealers or DMA providers; exchanges; and market data providers, each one of those types of entities has different applications that they can accelerate and potentially embed in hardware”, he states.
“There are maybe 25-30 end-user firms around the world that are fully capable of developing complete high performance applications in FPGA today”, says Ciarleglio. “Those firms have individuals on staff who know Verilog or VHDL, they have their own dev/test/qa environments and they can actually deploy the applications in the live market and be successful. But those firms also compress the market. As they figure out how to do market data normalisation in less than a microsecond or to send a TCP transaction in a couple of hundred nanoseconds, essentially the whole market shifts”.
Ciarleglio believes that the market is now moving from this cutting edge phase towards wider adoption of embedded FPGA technologies.
“The second tier may not be as savvy but will definitely be looking for providers and integrators to help them out”, he states. “The firms on the cutting edge will spend time and resources on this to gain a few hundred nanoseconds of advantage, whereas the early adopters are the people who just need to be in that top tier of a particular market to stay competitive. When they notice that they’re no longer competitive they have to move. There are probably another hundred end-user firms around the world that I would lump into that category, including groups within investment banks with slightly broader applications.”
“Right now we’re between phases one and two”, believes Ciarleglio. “Building complete applications on FPGA is still pretty new, a lot of the applications in use today still require a decent amount of CPU. Firms who have developed completely independent applications running in FPGA are still rare. But two to three years down the road when it becomes much more mainstream, when there’s much more IP built, when there are many more solution providers with very well defined blocks and turnkey solutions, at that point you will see this type of hardware go mainstream in the market”.
It is clear that the market is currently shifting in two ways. The first is that any end-user firm wanting to be a purely speed play needs to be at the same latencies as the cutting edge firms. The second is that early adopters who are reliant on their broker-dealers are starting to push them towards having the same levels of latency. This means there are now multiple points in the market, including the broker-dealers and the exchanges, seriously looking at how they can use this technology just to stay competitive.
Despite the obvious benefits of FPGAs, however, many firms are still cautious about adopting them. As Ciarleglio points out, hardware-embedded technologies – in any kind of trading or market data application – are still fairly new. And although they have been proved viable by third party vendors and by firms already using them in production, much of the true application development, where more and more of the application becomes embedded in the hardware device rather than being CPU reliant, is still considered “bleeding edge” by many.
So how are these early adopters actually using FPGAs? What kinds of things are they doing with them and where do they actually fit into the trading “ecosystem”?
The first area is low-latency connectivity for inbound and outbound data, where every network connection is a candidate for an FPGA-enabled Network Interface Card (NIC) to give that extra latency boost. Such a card might use the FPGA to run a TCP Offload Engine (TOE), for example, which can both free up CPU cycles and reduce PCI traffic.
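The offload itself is invisible at the sockets API. The sketch below is nothing more than an ordinary blocking TCP send (the address, port and payload are placeholders); what a TOE changes is what happens underneath the call, moving segmentation, checksumming and retransmission from the kernel’s TCP stack onto the card.

```cpp
// An ordinary TCP send via the sockets API. With a TOE-equipped NIC the
// same call costs far fewer CPU cycles, because the TCP work is done on
// the card. Address, port and payload are placeholders.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

int main() {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(9100);                    // placeholder port
    inet_pton(AF_INET, "192.0.2.1", &addr.sin_addr);  // placeholder address
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) == 0) {
        const char msg[] = "NEW,XYZ,100,59.10";  // toy order payload
        send(fd, msg, sizeof(msg) - 1, 0);       // one syscall per message
    }
    close(fd);
}
```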
Other areas where FPGAs are starting to make a significant impact are market data feed handling, pre-trade risk controls and other processes where firms need to take in data and then run calculations or simulations on it in-line at high speed. Applications are multiplying as people get more comfortable with the technology, and firms are looking at pure acceleration of tasks that they would previously have run on CPUs or general-purpose GPUs.
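To give a flavour of what “in-line at high speed” means in the risk case, the sketch below shows the kind of checks involved, written in C++ for readability; in an FPGA these comparisons would be evaluated in parallel within a fixed number of clock cycles. All field names and limit values are invented for the example.

```cpp
// The kind of checks a hardware pre-trade risk gateway performs. Field
// names and limits are invented; prices are fixed-point (price * 10000).
#include <cstdint>

struct Order {
    uint64_t qty;    // shares
    uint64_t price;  // fixed-point
    uint64_t notional() const { return qty * price; }
};

struct RiskLimits {
    uint64_t max_qty;        // fat-finger size limit
    uint64_t max_notional;   // per-order value limit
    uint64_t price_floor;    // collar derived from the last trade price
    uint64_t price_ceiling;
};

// True if the order may pass through to the exchange. In hardware all
// four comparisons run in parallel rather than short-circuiting.
bool pre_trade_check(const Order& o, const RiskLimits& l) {
    return o.qty <= l.max_qty
        && o.notional() <= l.max_notional
        && o.price >= l.price_floor
        && o.price <= l.price_ceiling;
}

int main() {
    RiskLimits limits{10'000, 50'000'000'000ULL, 580'000, 600'000};
    Order o{100, 591'500};  // 100 shares at 59.15
    return pre_trade_check(o, limits) ? 0 : 1;  // passes all four checks
}
```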
There are a growing number of vendors offering off-the-shelf and tailor-made solutions for performing such tasks. Firms such as NovaSparks and Exegy provide FPGA-based appliances that parse and filter market data for delivery into trading applications at high speed with highly deterministic low latency. Others, such as Fixnetix and GATELab, offer FPGA-based solutions that perform a range of pre-trade risk checks on a firm’s order flow at sub-microsecond speeds.
Although such vendor solutions are now available on the market, many end-user firms remain committed to an in-house development approach when working with FPGAs. Brian Durwood, CEO of Impulse Accelerated Technologies, a vendor specialising in software-to-hardware technologies that enable FPGA development, works with such firms.
“The big firms are shopping with huge checkbooks and they’re buying appliances. They’re buying a box where someone else is doing the coding. We’re more the kind of people who like to understand where all the bits are so we can tinker with them and tune them. So most of our clients are classic early adopters, the 20-200 person trading firms who like to do things themselves”, says Durwood.
What sort of applications are these high-tech trading firms developing in FPGA?
“Most of them are doing some variation of ticker plant”, responds Durwood. “Some are just looking at a handful of stocks and absolutely hot-rodding those, others are trying to convert a 150-deep book into automated trading. It’s still at the immature phase where different people are trying different approaches and taking risks.”
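A ticker plant’s core job is turning raw feed messages into a usable order book. A toy version of the book-building step is sketched below; the update format is invented for the example, and a real hardware implementation would use fixed-size memories rather than a tree-based map.

```cpp
// Toy book-building step of a ticker plant: apply price-level updates
// from a feed to an in-memory book. Illustrative only.
#include <cstdint>
#include <functional>
#include <map>

struct LevelUpdate {
    bool     is_bid;
    uint64_t price;  // fixed-point, e.g. price * 10000
    uint64_t qty;    // 0 means remove the level
};

struct Book {
    // Bids sorted best (highest) first, asks best (lowest) first.
    std::map<uint64_t, uint64_t, std::greater<uint64_t>> bids;
    std::map<uint64_t, uint64_t> asks;

    void apply(const LevelUpdate& u) {
        if (u.is_bid) {
            if (u.qty == 0) bids.erase(u.price); else bids[u.price] = u.qty;
        } else {
            if (u.qty == 0) asks.erase(u.price); else asks[u.price] = u.qty;
        }
    }
};

int main() {
    Book book;
    book.apply({true, 591'400, 300});   // bid 59.14 x 300
    book.apply({false, 591'600, 200});  // ask 59.16 x 200
    book.apply({true, 591'400, 0});     // bid level removed
    return book.bids.empty() && book.asks.size() == 1 ? 0 : 1;
}
```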
Firms following an in-house development route do not have to do everything themselves however, because they can use existing IP blocks for specific purposes, as Durwood explains.
“One of the things that comes out of this methodology is that firms end up putting together their own library of known good modules, so they might collect a range of different component blocks. If you’re building your own solution, you end up acquiring as many of these good code blocks as you can from people who know they work already, which means you don’t have to focus on that, you can focus your energies where you need to. This gives you a pretty powerful approach”, he says.
FPGAs can add a real advantage in tasks like parsing, filtering, normalisation and session management, so market data delivery and distribution is ripe for this technology. Working with order and trade data brings additional complexities in development and testing, however, because an order can go through so many state transitions.
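The state-transition problem is easiest to see in code. The sketch below models a simplified order lifecycle using the familiar FIX state names; a real venue has more states and more race conditions (one, a fill beating a cancel, is shown), which is precisely what makes order handling harder to embed and test in hardware than market data parsing.

```cpp
// Simplified order lifecycle using the usual FIX state names. The
// transition table is illustrative, not a complete venue specification.
#include <cstdint>

enum class OrderState : uint8_t {
    PendingNew, New, PartiallyFilled, Filled, PendingCancel, Canceled, Rejected
};

enum class Event : uint8_t {
    Ack, Reject, PartialFill, FullFill, CancelRequest, CancelAck
};

OrderState next(OrderState s, Event e) {
    switch (s) {
        case OrderState::PendingNew:
            if (e == Event::Ack)    return OrderState::New;
            if (e == Event::Reject) return OrderState::Rejected;
            break;
        case OrderState::New:
        case OrderState::PartiallyFilled:
            if (e == Event::PartialFill)   return OrderState::PartiallyFilled;
            if (e == Event::FullFill)      return OrderState::Filled;
            if (e == Event::CancelRequest) return OrderState::PendingCancel;
            break;
        case OrderState::PendingCancel:
            if (e == Event::CancelAck) return OrderState::Canceled;
            if (e == Event::FullFill)  return OrderState::Filled;  // fill beat the cancel
            break;
        default:
            break;
    }
    return s;  // unhandled events leave the state unchanged
}

int main() {
    OrderState s = OrderState::PendingNew;
    s = next(s, Event::Ack);            // -> New
    s = next(s, Event::PartialFill);    // -> PartiallyFilled
    s = next(s, Event::CancelRequest);  // -> PendingCancel
    s = next(s, Event::CancelAck);      // -> Canceled
    return s == OrderState::Canceled ? 0 : 1;
}
```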
Despite such complexities, FPGAs are now being used in FIX engines, rules engines, and even full-blown execution management systems. Ferdinando La Posta, Co-Founder of trading solutions vendor GATELab, explains his firm’s approach.
“We decided to implement a full FIX 4.4 engine, with pre-trade risk checks and market transactional gateway (on top of the TOE – TCP Offload Engine) in the hardware, in order to provide a wide range of customers with immediate connection”, he says. “We didn’t focus on market data in the first stage, because essentially many pre-trade risk checks are not bound to price depth, but to last trade price and level 1, which we feed through a PCIe bus into the FPGA in order to implement the risk controls. We can also use other third party vendors that already provide a market data gateway on FPGA, to cross the incoming market data through a PCIe into our board”.
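For context on what such an engine has to do at wire speed: FIX messages are ASCII tag=value pairs separated by the SOH byte (0x01). The sketch below pulls one field out of a message in plain C++; it is illustrative only (a real engine also validates BodyLength and CheckSum, and a hardware engine does the equivalent scan in the FPGA fabric rather than in a loop like this).

```cpp
// Minimal FIX tag=value field extraction. Illustrative only: a real
// engine also validates BodyLength (9) and CheckSum (10).
#include <cstddef>
#include <string>

// Returns the value of `tag` (e.g. "35") or "" if the tag is absent.
std::string fix_field(const std::string& msg, const std::string& tag) {
    const char SOH = '\x01';  // FIX field separator
    const std::string key = tag + "=";
    std::size_t pos = 0;
    while (pos < msg.size()) {
        std::size_t end = msg.find(SOH, pos);
        if (end == std::string::npos) end = msg.size();
        if (msg.compare(pos, key.size(), key) == 0)
            return msg.substr(pos + key.size(), end - pos - key.size());
        pos = end + 1;  // jump to the start of the next field
    }
    return "";
}

int main() {
    const std::string msg = "8=FIX.4.4\x01" "35=D\x01" "55=XYZ\x01" "44=59.15\x01";
    return fix_field(msg, "35") == "D" ? 0 : 1;  // 35=D: New Order Single
}
```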
But FPGAs are not necessarily the answer to everything, as Simon Garland, Chief Strategist at Kx Systems, a firm that provides tools for processing real-time and historical data, explains.
“Use of FPGAs has started to go mainstream for things like data compression and feed handling, but that’s stuff that just works, it works well and it’s really fast. So you can buy it, it’s a black box and the fact that it’s running on FPGAs is completely irrelevant. You’re assuming that somebody else has programmed it, debugged it and it’s rock solid and away you go”, he says.
But when it comes to actually working with FPGA technology internally, Garland sounds a note of caution.
“The technology is certainly being used by some of the very sophisticated high speed trading firms. They’re using FPGAs extensively and in very clever ways. But they’re exactly the people who can get and keep the very best programmers. The view that ‘everybody else is doing it so we should be doing it too’ is incorrect, it’s just the firms at the very high end who have staff on hand capable of ‘pulling out the soldering iron’, metaphorically speaking. And that just doesn’t translate down”, he warns.
In conclusion, it seems that although FPGAs can offer some real advantages in a wide range of areas, the technology comes with a number of challenges.
Part two of this series, published at www.hftreview.com/pg/fpga, looks in more detail at the various architectural approaches to implementing FPGAs.