What's Wrong With Complex Event Processing?
Fri, 06 Jan 2012 08:32:39 GMT
I spend a significant amount of my time keeping up with advances in processing high-velocity big data. Over the last year, I’ve watched the NoSQL camp grow a lot. And now, some folks are even forecasting a market approaching $2 billion USD by 2015. The last time I saw that kind of trajectory for a new software category was for Complex Event Processing. So without casting any undue aspersions on the NoSQL camp, let’s talk about why CEP has so dramatically failed to generate the returns venture capital firms were so sure they were going to achieve.
WHAT IS EVENT PROCESSING?
Event Processing, or Event Driven Architectures, means nothing more than processing events one at a time, preferably shortly after they occur. The opposite of this is Batch Processing, which means collecting events, or messages, or what most of the world would call rows of data, and processing them together. In batches. Sounds simple enough, right? All of you reading this blog post have used an Event Driven Architecture. In fact, you’re using one now – it’s in your browser. Can you imagine what the user experience would be if your browser ‘batched’ up all of your mouse clicks and submitted them every 30 seconds? Event Driven Architectures promise the same type of agility and improved user experience for line-of-business and consumer applications that you’re experiencing right now. In fact, it’s probably hard to think about using the web in batch mode – it just doesn’t make sense.
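The contrast can be sketched in a few lines of Python. Everything here (the `EventHandler` class, the event names) is purely illustrative, not any particular product's API:

```python
# Batch processing: collect events first, then handle them all together.
def process_batch(events):
    return [e.upper() for e in events]  # nothing happens until the batch arrives

# Event-driven: each event is handled the moment it occurs.
class EventHandler:
    def __init__(self):
        self.seen = []

    def on_event(self, event):
        # React immediately -- no waiting for a batch to fill up.
        self.seen.append(event.upper())

handler = EventHandler()
for click in ["home", "about", "profile"]:
    handler.on_event(click)  # processed one at a time, as each click happens
```

The end result may be the same list, but in the event-driven case the reaction happens per event – which is exactly why your browser doesn't batch your mouse clicks.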
WHAT IS COMPLEX EVENT PROCESSING?
For the most part, a marketing phrase. That’s right – and again, for the most part, it’s completely meaningless. As an early and continuing contributor to this particular area of technology, I remember when StreamBase, Apama, myself, and others called this field Event Stream Processing. Then one of those firms’ marketing departments decided to differentiate. I’ll leave the specific firm to your intuition. So, what is Event Stream Processing? That’s much easier to answer. Event Stream Processing is Event Processing with four additional key components:
1. Continuous Query
Rather than having to poll a server for an event, the user of an ESP system issues a query and is subsequently informed of events, aggregations, or patterns that satisfy the specifics of the query. This happens continuously, until you stop the query.
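A continuous query can be sketched as a standing predicate plus a callback. The `ContinuousQuery` class and its method names below are my own illustration, not any vendor's API:

```python
class ContinuousQuery:
    """A standing query: matching events are pushed to the subscriber."""
    def __init__(self, predicate, callback):
        self.predicate = predicate
        self.callback = callback
        self.active = True

    def offer(self, event):
        # Called for every event in the stream, until the query is stopped.
        if self.active and self.predicate(event):
            self.callback(event)

    def stop(self):
        self.active = False

matches = []
q = ContinuousQuery(lambda e: e["price"] > 100, matches.append)
for ev in [{"price": 90}, {"price": 150}, {"price": 200}]:
    q.offer(ev)  # no polling: the stream is pushed through the query
```

Note the inversion: instead of the client repeatedly asking "anything new?", the query sits in the engine and results flow out as events arrive.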
2. Windows (Time and/or Length)
Using ESP, the user can ask, as an example, for an average value of some key over either a time or length window. Something like, ‘Give me the average amount of time people have spent on the homepage in the last 10 minutes.’ This query would provide an updated average either continuously, or perhaps at regular intervals.
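That homepage example can be sketched with a simple sliding time window. This is a minimal illustration (the `TimeWindowAverage` class is hypothetical), not how any CEP engine actually implements windows:

```python
from collections import deque

class TimeWindowAverage:
    """Running average of values over a sliding time window."""
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # (timestamp, value) pairs, oldest first

    def add(self, timestamp, value):
        self.events.append((timestamp, value))
        # Evict anything that has fallen out of the window.
        while self.events and self.events[0][0] <= timestamp - self.window:
            self.events.popleft()

    def average(self):
        if not self.events:
            return None
        return sum(v for _, v in self.events) / len(self.events)

# 'Average time spent on the homepage over the last 10 minutes':
w = TimeWindowAverage(window_seconds=600)
w.add(0, 30)     # this event will expire out of the window
w.add(500, 60)
w.add(700, 90)   # at t=700, the t=0 event is older than 600s and is evicted
```

A length window works the same way, except eviction is by count rather than by age.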
3. Pattern Matching
With Pattern Matching, I’m able to define a series of events that fit a pattern, and then be notified when that pattern is observed – usually within some Time or Length Window. So, I might ask, “How many users are going from the Home Page to the About Company Page and then clicking on My Profile during a rolling 10-minute window?”
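That page-view pattern can be sketched as a small per-user state machine. Again, this is an illustrative toy (the `SequenceMatcher` class is my own invention), not a real engine's pattern language:

```python
class SequenceMatcher:
    """Detects a fixed sequence of page views per user within a time window."""
    def __init__(self, pattern, window_seconds):
        self.pattern = pattern
        self.window = window_seconds
        self.progress = {}  # user -> (next pattern index, sequence start time)

    def on_event(self, user, page, timestamp):
        idx, start = self.progress.get(user, (0, timestamp))
        if idx > 0 and timestamp - start > self.window:
            idx = 0  # too slow: the window expired, restart the pattern
        if page == self.pattern[idx]:
            if idx == 0:
                start = timestamp  # the sequence begins now
            idx += 1
            if idx == len(self.pattern):
                self.progress[user] = (0, timestamp)
                return True  # full pattern observed within the window
        else:
            idx = 0  # out-of-order page: restart
        self.progress[user] = (idx, start)
        return False

m = SequenceMatcher(["home", "about", "profile"], window_seconds=600)
```

Counting how many users complete the pattern in a rolling window is then just a matter of feeding every click through `on_event` and tallying the `True` results.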
4. A Language
Tying all of the above together in a neat little language is a cool idea – it makes using these features easier. At least, that’s the theory. And this is one place where CEP has gone wrong, and why it is not the general computing revolution that I and others had hoped for. I’ll expound upon this after a brief distraction in the next two paragraphs. Please bear with me.
WHERE IS COMPLEX EVENT PROCESSING USED?
Even Mark Palmer, who is usually extremely bullish about CEP and probably sprinkles it on his breakfast cereal, has recently admitted that CEP is only hot in Capital Markets. While I might disagree a bit with Mark, which is nothing new, I think we can all agree that CEP, at a $200M total market size, is far smaller than we had all hoped for. Frankly, it reminds me of the FIX engine vendor battles – I was an early provider there too – and we all ended up fighting over an ever-shrinking marketplace.
WHAT IS THAT MARKET ANYWAY?
The current set of CEP vendors tends to focus on Capital Markets. But not really. It focuses on an even smaller slice of Capital Markets called High Frequency Trading. It seems more people know about High Frequency Trading today than about CEP. The important thing here is that what all the smart analysts are calling the “CEP Market” really isn’t the “CEP Market” at all. It’s the HFT software market. And again, looking at the impressively long list of clients that Mr. Palmer has cited in his recent blog post, many of those firms aren’t actually using CEP for HFT, but for an even smaller subset of functionality. That’s why the market is so small – if any VC firm had thought the total addressable market for this technology was going to be $200M in 2010, no CEP startup would have received funding. And when HFT finds the Next Big Thing, the CEP market, as defined today, will evaporate – and along with it, any CEP vendor who has concentrated solely upon that market.
SO WHAT HAPPENED?
The idea was that we were ushering in a New Way To Compute Things. Like all technologists who spend way too much time thinking about this stuff, we thought everyone would immediately see how smart we were, run out and buy one of the CEP-based products, and join us in revolutionizing how data is turned into information and used by business folk to make money and pay our salaries. The only problem is, we forgot two things: 1) who would be using our software to do this work, and 2) who would subsequently be using the applications developed by 1.
DEVELOPERS – A FINICKY BREED
I used to be a Real Developer – I wrote in C++. Then Sun decided that the Internet was the Computer and we all started to learn Java. Java is cool – Java makes it easy for anyone to write bad code, whereas C++ really took some effort to mess things up. More and more people started using Java for everything: servers, clients, web stuff, etc. And now, I’m not sure what people use anymore – perhaps coders are using NoJava for all of their new shiny NoSQL apps. I still use Java. And I’m loath to learn another language. See #4 above in ‘What Is Complex Event Processing?’ I don’t want to learn another language. And I certainly don’t want to move all of my work – servers, clients, webapps, etc. – to a new and unproven language. And no matter which vendor you choose for your CEP application (or don’t choose, because you end up writing it all yourself anyway), none of their languages can claim to be broadly or generally adopted. Proof? Try to buy a book on one of them. There have been umpteen books published on NoSQL in less time than it took some CEP vendors to go out of business. CEP vendors have failed to appeal to core IT departments. Period. And core IT departments are the folks who have to assemble all the crap they buy from vendors into something that business users get to complain about when it doesn’t work.
BUSINESS USERS – “SHOW ME SOMETHING!”
Business users want to see information. They want to see information presented crisply, ready for decision making. And today, more than ever, they want to see it on their web browser, iPad, iPhone, Droid, Apple TV, disconnected laptop, on flat panels on the front of the refrigerator in the kitchen, and on the heads-up display in their car while commuting to work. In short, they want information any time they want it, so that they can function in what has become, and will continue to become, an ever faster and more connected world. Even Progress Apama, who I think is doing really well, uses Flash-based instrumentation. No iPad for you! There is no CEP environment that lets the IT folks build a complete application for the business user. So the business user never ‘sees’ CEP. So they’re not impressed. They don’t get it. And they don’t provide budget for stuff they don’t get.
AND IN CONCLUSION
CEP has failed to achieve the multi-billion dollar market forecasts that we all went out and raised money upon because most vendors have failed to provide the education and tools necessary to create a complete user experience. Most of the CEP vendors don’t even have their own visualization products – they partner with other vendors to provide things like tree maps or dashboards. How they expect to revolutionize the world by outsourcing their interaction with the Business User is beyond me.
If the NoSQL camp would like to come anywhere close to realizing the crack-smoking analysts’ estimates of a $2B market, they should 1) make the technology readily accessible to the IT department (which they’re doing) and 2) make sure that business users know why it’s making a difference. If they can reach out and touch the business user, or consumer, all the better.
AND THANKS FOR READING