HFT Review


    Thu, 19 Jul 2012 06:10:00 GMT

    This is part four of a six-part series by Mike O’Hara, looking at the usage of FPGAs and other types of hardware acceleration in the financial trading ecosystem.

    In previous articles in this series, we looked at how, where and why FPGA technology is currently being used in the financial markets, we investigated some alternative approaches to hardware acceleration, and looked at various programming methods.

    In this article, we identify some of the challenges and constraints that firms face when working in a hardware, as opposed to software, environment.

    The FPGA adoption path is complex.

    Banks and trading firms that want to put together a coherent FPGA-based strategy have a number of obstacles to overcome if they are to be successful.

    First of all (as we touched upon in part three of this series), programmable hardware requires a completely different approach from software development, and software engineers do not typically have the requisite skill sets to work at the hardware level. Trying to get high-level programmers coding down to the FPGA generally doesn’t work, because most of them lack the deep understanding of electronics and circuits that is needed.

    The majority of firms working with FPGAs do recognize this and therefore try to recruit at least a few staff members who know their way around the FPGA world. The challenge, however, can be finding the requisite combination of financial and engineering skills. According to Simon Garland, Chief Strategist at Kx Systems, people who can easily bridge that gap are in short supply.

    “They are around but there aren’t many of them and they command a very high price,” he says. “There are so many interesting projects for them that you might be able to recruit them to help you get your fantastic new application up and running in production, but as soon as it moves to maintenance, it’s suddenly rather boring so they’re off to the next gig. Then you’re left with something that is near impossible to maintain”.

    It is certainly true that finding good people with the right skill sets can be tricky. But isn’t that true in any area of business that relies on technology? Do hardware engineers actually need finance expertise? Maybe the answer is to get a good business person to point the FPGA engineers in the right direction and a seasoned technology manager to take charge of the deliverables.

    Matt Dangerfield, Chief Technology Officer at Fixnetix, stresses the importance of having the right team in place.

    “The key thing is you need dedicated people,” says Dangerfield. “It’s not like you can take your existing development force of C & Java guys and bring them across to this. You definitely have to put a lot of money into building a dedicated team, and then you’ve got to work out from a business point of view whether you’re going forward more with software or with hardware or both. So you really have to think about what you’re going to do before you get into it otherwise it could be quite costly,” he says.

    Staffing is one thing, but another, possibly more fundamental, challenge is deciding how and where FPGAs should be used. Terry Keene, CEO of iSys Integration Systems, believes that firms who don’t understand the technology can fall into the trap of trying to put systems that are too complex into an FPGA environment, whereas what they should be doing is looking at where the technology has been proven to be more successful.

    “Firms realise they can buy FPGA-based solutions from vendors who have done the straightforward stuff like TCP offload, FIX/FAST translation, feed handling and so on,” says Keene. “But HFT firms also realise that the only way to gain an advantage now is to come up with better, more clever, more efficient, more productive and more profitable algorithms. That’s going to be their differentiator, but not many HFT firms have had much success putting their own trading algorithms onto FPGAs.

    “If they just want to get their market data in faster and then convert it into something they can use,” Keene continues, “FPGAs can do that really well. It’s a single, static function that’s easy to take advantage of. But when you start getting inside the algorithms themselves, that’s when you get to the harder issues. FPGAs are a lot dumber than general purpose CPUs. They’re faster but they only do one thing. So if you want to go beyond just format conversion into data stream filtering, you have a problem because those things change, which means you’ve got to go back and reprogram your FPGA,” states Keene.
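Keene’s distinction between a static conversion and a changing filter can be sketched in software. The snippet below is purely illustrative (the message layout, field names and threshold are invented for this sketch, not taken from any real feed): format conversion is a pure, fixed function of its input, which is why it maps naturally onto a hard-wired FPGA pipeline, whereas a stream filter carries strategy-dependent parameters — and on an FPGA, changing those can mean rebuilding and reflashing the device rather than editing a config file.

```python
import struct

# Hypothetical fixed-layout binary market-data message:
# 8-byte symbol, unsigned 32-bit price in ticks, unsigned 32-bit size.
MSG = struct.Struct(">8sII")

def convert(raw: bytes) -> dict:
    """Static format conversion: the logic never changes, so it suits
    a fixed hardware pipeline."""
    symbol, price_ticks, size = MSG.unpack(raw)
    return {"symbol": symbol.rstrip(b"\0").decode(),
            "price": price_ticks / 100.0,
            "size": size}

def make_filter(min_size: int):
    """Stream filtering: the threshold changes with the strategy --
    in hardware, that change can mean reprogramming the chip."""
    def keep(msg: dict) -> bool:
        return msg["size"] >= min_size
    return keep

raw = MSG.pack(b"ACME\0\0\0\0", 12345, 500)
msg = convert(raw)
assert make_filter(100)(msg) and not make_filter(1000)(msg)
```

In software the filter is just a closure over a parameter; the point of the sketch is that the equivalent change in gateware is not nearly so cheap.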

    The key point here is that FPGAs can facilitate quicker, more deterministic algos when doing things they are good at, like A/B arbitration, session management, symbol translation and so on. FPGAs are not computational – they are basically electrical circuits – so the performance gains come from how logic units can be duplicated in order to run in parallel, rather than the number of programmatic instructions.
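The replication point can be modelled loosely in software. In the toy sketch below (the lookup table and ids are invented for illustration), the “CPU view” applies one function sequentially, while the “FPGA view” instantiates one copy of the same small logic unit per input lane. In Python both still execute sequentially, of course; on real hardware each lane is a separate physical circuit, so all lanes resolve in the same clock cycle — the gain comes from duplication, not faster instructions.

```python
# Toy model of FPGA-style gains: replicated logic, not faster instructions.

def symbol_map_unit(raw_id: int) -> int:
    """One small fixed function, e.g. symbol translation via lookup."""
    TABLE = {101: 1, 102: 2, 103: 3}  # hypothetical exchange-id -> internal-id
    return TABLE.get(raw_id, 0)

def cpu_style(messages):
    """CPU view: a single unit applied to each message in turn."""
    return [symbol_map_unit(m) for m in messages]

def fpga_style(messages):
    """FPGA view: the same function 'instantiated' once per input lane.
    In hardware, every lane is its own circuit and all evaluate at once."""
    lanes = [symbol_map_unit] * len(messages)  # replicated logic units
    return [unit(m) for unit, m in zip(lanes, messages)]

assert cpu_style([101, 102, 999]) == fpga_style([101, 102, 999]) == [1, 2, 0]
```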

    The programmable aspect of FPGAs does of course give them the flexibility to be updated in the field, but the testing process is still more complicated than it is for software, as Matt Dangerfield of Fixnetix explains.

    “You’re working with increased development times and cycles,” he says. “When you go from writing 5 lines of code in Java or 50 lines in C to 300-400 lines in VHDL, there’s obviously a time impact. The good thing about the hardware side is that once it’s done it’s done, you don’t have to keep trying to optimise it or re-engineer it, so where you lose development time on one side of the coin, you gain with implementation and ease of support on the other side. In software, the development side might be cheap, light and easy to do but the support and ongoing maintenance side becomes a burden.”

    However, there are additional challenges around deployment of FPGAs and just getting hold of the kit, according to Dangerfield.

    “There’s a lot more project management going on,” he says. “A lot more planning is needed in the hardware world than in software. When you want cards made for specific purposes, you have to plan that well in advance.”

    Anant Pandit, Executive Vice President and Founder of Omnesys Technologies, an integration solutions provider based in India, agrees that firms need to consider the logistics of FPGA deployment.

    “With software you are able to deploy a component and share information anywhere on the network,” he says. “But it’s a conceptual shift when you go into hardware deployment, because you can’t move things around, it’s all fixed to the card.

    “For example, if you were to build a software module and deploy it somewhere on the network, it would automatically disclose itself, it knows where it’s running, it seeks out who it needs to connect to, it connects itself and it starts working. With an FPGA, there’s a lot more setup information that needs to be created and a context that needs to be built around it before it knows what it is doing. And all of that can only run where the FPGA is actually located,” explains Pandit.

    Pandit’s point may seem obvious, but it is certainly something to consider when figuring out the logistics of how and where the FPGA hardware is to be deployed.

    The final issue to consider is around ROI. This is still relatively new technology in the finance space, which is creating opportunities for some and leaving others finding it difficult to adapt. But Terry Keene believes that some firms have had to make significant investment just to keep up.

    “Firms who have never done this before have had to hire the right people, then get the right algorithms going, then test them and so on. If it costs some firms a couple of million dollars and 18 months just to get these things done and then find everybody else is there too and the playing field is level, then they haven’t gained anything and they’re two million dollars down. But then if they hadn’t done it, they would now be behind the curve,” he says.

    Because there are still very few clear ROI models available around FPGA adoption in financial trading, it is difficult to know how accurate Keene’s assessment is. But there are many successful FPGA developments in the military and telco spaces, so this is certainly a tried and tested platform. And firms don’t necessarily have to invest millions of dollars to test things out on a small scale.

    While it is true that incorporating FPGAs into a firm’s technology strategy does require a certain level of investment and the acquisition of new skill sets, it is also true that many firms who have made that investment are now reaping the benefits, even if others have struggled to do so.

    But in this business, there’s no reward without risk.

    In part five of this series, which will be published at www.hftreview.com/pg/fpga in late August, we will look at how chip manufacturer Intel is responding to the FPGA threat.
