7 September 2012
Data has value; that much is obvious. But when it comes to proving that value, working out what each fragment, stream or supplier is worth requires constant measurement in a controlled, standardised and comparable way.
This is no mean feat, and how best to get the results needed to make informed decisions about your data providers is one of the most contentious issues in financial data management right now.
To uncover some of the truths, trends and practical steps for squeezing the most out of your data, The TradeTech Blog spoke to Hany Choueiri, Global Head – Entity Data Quality (GB&M) at HSBC.
You will be discussing how to ‘measure the value of data every step of the way’:
a. How is it possible to usefully measure the value of data, and is this achievable before the end result of its usage has been reached?
The answer very much depends on the data in question. There are instances where the value of data can be measured, directly correlated to a time-framed business benefit, and used as the basis of a business case for data quality efforts. Examples of this are common in financial reporting, where the credit ratings of clients can have a material impact on capital reserves.
However, there are also many instances (as your question alludes to) where this correlation is not straightforward: the value is a longer-term contribution to a strategic driver or a regulatory requirement, a mitigation of operational risk, or part of the business case supporting a change initiative. In many such cases, the Data Management Organization/Data Quality function can agree a “data value formula” with business stakeholders that can be applied to estimate the ultimate value (and which, incidentally, can also be used to prioritise resources). This can, for example, include estimates of regulatory fines or simple probability techniques for operational risk scenarios. Once the value has been determined or estimated, a simple time-based value can be inferred by tracking the “% completion” of the population – similar to the “earned value” concept in project management.
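To make the “data value formula” and earned-value idea concrete, here is a minimal Python sketch; it is not from the Q&A itself, and every figure, function name and parameter in it is an illustrative assumption rather than anything HSBC uses.

```python
# Illustrative sketch of the two ideas described above:
# 1. a "data value formula" combining a probability-weighted regulatory fine
#    with a direct operational benefit, and
# 2. "earned value" inferred from % completion of the data population.
# All numbers and names are hypothetical.

def estimated_data_value(expected_fine: float, fine_probability: float,
                         annual_operational_benefit: float) -> float:
    """Estimate total data value as a probability-weighted fine avoided
    plus a direct annual operational benefit."""
    return fine_probability * expected_fine + annual_operational_benefit

def earned_value(total_value: float, records_remediated: int,
                 total_records: int) -> float:
    """Credit value in proportion to the share of the population completed,
    mirroring the earned-value concept in project management."""
    completion = records_remediated / total_records
    return total_value * completion

if __name__ == "__main__":
    total = estimated_data_value(expected_fine=2_000_000,
                                 fine_probability=0.15,
                                 annual_operational_benefit=500_000)
    print(f"Estimated total data value: {total:,.0f}")   # 800,000
    done = earned_value(total, records_remediated=40_000, total_records=100_000)
    print(f"Value earned at 40% completion: {done:,.0f}")  # 320,000
```

With 40% of the population remediated, the sketch credits 40% of the estimated total value, which is exactly the earned-value logic described above.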
I would say the biggest challenge is in determining and agreeing the actual or estimated value of the data; once this is done, measuring the value at “every step of the way” is, in my view, the easier piece.
Read the full Q&A at fima-europe.com
Related posts:
- Malcolm Chisholm on Data Management
- Regulation, Data & The DACH Region
- How is regulation affecting data management in Germany, Austria and Switzerland?