Garbage in, garbage out

Alex Hungate, chief marketing officer at Reuters, talks to FinanceAsia about market reception for the new STP market utilities and the importance of reference data.
Q. In the past few months many of the industry initiatives created to boost STP have finally gone live: GSTP and Omgeo's virtual matching utilities, SWIFTNet and the CLS Bank. What is your view on their reception in the market?

The market conditions in 2002 have been challenging. Firms are telling us that their commitment to STP, by way of reducing costs, improving controls and delivering superior customer service, remains unchanged. What has changed is the buy-side's lack of inclination to invest behind missions such as T+1. The success of VMUs running centralised matching will depend to a large extent on the ability of firms to improve the quality of their reference data; otherwise, the GIGO (garbage in, garbage out) principle applies. We'll similarly be keeping an eye on the migration process at SWIFT with a view to helping SWIFT's customers manage the transition to new environments, be that ISO 15022-compliant data dictionary-style messaging or the SWIFTNet IP environment.

Q. Has the removal of the deadline for moving to T+1 in the US lessened the importance of these initiatives in the minds of market participants?

Far from it. Whilst the industry has moved away from focusing on the time it takes to settle a trade (T+1), there is a growing emphasis on ways to reduce the costs of trading, clearance and settlement. Firms are looking not only to reduce costs but also to contain them through intelligent design of workflow solutions, and to avoid them in future by converting fixed costs into variable costs wherever possible. In the current business climate many financial institutions are implementing straight-through-processing solutions to achieve this aim.

Q. Poor quality and management of reference data have been highlighted as a key cause of inefficiency in the market by many surveys and reports.
Are financial institutions beginning to address this issue, or are other issues, such as the mandatory migration to ISO 15022 messages, pushing it to the back of their plans?

The need to address this problem is well recognised by the industry. Tower Group research published in September 2002 found that 61 per cent of financial institutions view reference data projects as a top or high priority, and that only 5 per cent had failed to allocate budget to the issue. Even more striking, 51 per cent of buy-side firms reckoned reference data to be a top or high priority for their firm, a telling indicator for sell-side firms, custodians and market infrastructures that the buy-side is becoming more active in dictating the pace of this issue.

Q. Last year Reuters entered into a joint venture with Capco to deliver a reference data management service on an ASP basis. Can you explain briefly how this works?

The Reuters Reference Data Manager service forms one part of the Reuters Data Management Solutions program, which has been designed to assist financial institutions by ensuring their data meets the requirements of STP and delivers cost and operational efficiencies. Reference data is the key element, or DNA, of the transaction and operational functions performed by participants in the financial services industry. The data has to be accurate if costly mistakes are to be avoided. The Reuters Reference Data Manager, an application service provider (ASP) service, analyses data held across different and geographically dispersed business systems and translates it into a common format. This creates a single view of the reference data used by the organisation, enabling inaccuracies and incomplete items to be identified and amended easily. However, it represents only one part of solving the complete reference data issue for the client.

Q. What benefits can be gained from accessing an ASP service rather than running your own reference data system?
Clearly, this is a matter for individual customers to decide. Although reference data is a critical component, running specialist components such as legal entity databases is a costly exercise, not least because the information must be continually refreshed as client arrangements change. Given the need to focus on priorities and core competencies, I would therefore not be surprised if many medium-sized or smaller firms examined the business case for running their own databases and looked to inventive ASP and BSP solutions in the marketplace over the coming two years. I would imagine that custodian banks, service bureaux, data vendors and market infrastructure solution providers would all be well positioned to deliver such a service. I would caution, though, that issues such as liability and standards may play a key role in deciding how all this plays out.
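The "single view" approach described above, consolidating reference data held in dispersed business systems into a common format so that gaps and disagreements surface, can be sketched in a few lines. This is a minimal illustration, not Reuters' actual implementation; the field names, system names and records are all hypothetical.

```python
# Sketch: consolidating reference data from two hypothetical business
# systems into a common format, then flagging incomplete or
# inconsistent items. All names and records are illustrative only.

def normalise(record, field_map):
    """Translate a system-specific record into the common format."""
    return {common: record.get(local) for common, local in field_map.items()}

# Each source system labels (and cases) the same attributes differently.
settlement_system = [
    {"isin": "US0378331005", "ccy": "USD", "cpty": "ACME BROKING LTD"},
]
trading_system = [
    {"id": "US0378331005", "currency": "usd", "counterparty": "Acme Broking Ltd"},
    {"id": "GB0002634946", "currency": "GBP", "counterparty": None},
]

records = (
    [normalise(r, {"isin": "isin", "currency": "ccy", "counterparty": "cpty"})
     for r in settlement_system]
    + [normalise(r, {"isin": "id", "currency": "currency",
                     "counterparty": "counterparty"})
       for r in trading_system]
)

# Build the single view keyed by ISIN, collecting exceptions as we go.
single_view, exceptions = {}, []
for rec in records:
    # Normalise case so cosmetic differences do not register as conflicts.
    rec = {k: v.upper() if isinstance(v, str) else v for k, v in rec.items()}
    key = rec["isin"]
    missing = [k for k, v in rec.items() if v is None]
    if missing:
        exceptions.append((key, f"incomplete: missing {missing}"))
    if key in single_view and single_view[key] != rec:
        exceptions.append((key, "inconsistent across systems"))
    single_view.setdefault(key, rec)

print(len(single_view))  # two distinct instruments in the single view
print(exceptions)        # the GB record is flagged for its missing counterparty
```

Once the data is in one format, the "garbage" becomes visible: the two systems' records for the US instrument reconcile after case normalisation, while the incomplete GB record is routed to an exception queue for amendment rather than flowing silently into downstream matching.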