Stepping Up Outsourcing: Getting the Most Out of Data

In today’s environment, banks are focusing increasingly on optimising their trading processes in the front office, at the same time as improving their operational efficiency in the back office. This has been driven, to some extent, by an increase in multi-asset trading as firms endeavour to diversify their investment portfolios, but more so by global financial regulation. As a result, we have seen a rise in demand from financial institutions for a consolidated view of risk and reporting across asset classes traded using various front office systems.

However, for enhancements to these processes to be worthwhile, market participants must ensure that the data fed into these systems is accurate, accessible, secure, usable and well-governed. Furthermore, consistency between front office risk management data and back office financial accounting data is more critical than ever in demonstrating to regulators and clients alike that the right systems and controls are in place.

In an effort to achieve greater efficiencies across their front and back office processes, banks in particular are becoming increasingly cognisant of the value of leveraging third-party providers, which can prevent costly, duplicative technology development and mitigate the problems caused by legacy systems. In fact, as service delivery has become more sophisticated, another option is emerging for sell side institutions amid these challenging conditions. "Hotsourcing" is an approach for institutions looking at the next generation of their technology requirements: outsourcing built on modular technology, which allows operations to be outsourced partially rather than in full.

As a result, banks and brokers can be selective, creating an optimised blend of the best of their own resources and cost-effective operational processing. This means harnessing the best of technology advancement – automation, straight-through processing and the cloud – while retaining senior oversight of operational processes in-house, which is essential in a regulated environment. Rather than doing away with legacy systems entirely, this allows for a phased approach in which institutions can test the outsourcing of different business processes to find the most effective solution. The savings that can be achieved by deploying these changes are significant, meaning that firms can reduce spend while retaining in-house knowledge and expertise.

What is more, upgrading systems to reduce manual processes and replace legacy hardware can help make firms more competitive and efficient, and can ease the compliance burden. In today's regulatory maze, this can certainly be considered a competitive differentiator for sell side institutions.

In seeking to improve operational efficiency, however, firms that choose to outsource their operations should remember that technology is no magic bullet and can only be as good as the data it processes. These firms remain responsible for the quality of that data. Indeed, the ability to feed accurate data into post-trade operations will reduce the volume of transaction failures, decrease the need for manual reconciliations, and enable quicker and more accurate confirmation and settlement processes. This is all the more important given the impending introduction of the Central Securities Depositories Regulation (CSDR), which will require European firms to settle their trading obligations on the intended settlement date and will impose a settlement discipline regime in the event of any trade failure resulting from inaccurate standing settlement instructions (SSIs). Quality data will also support firms' regulatory and client reporting efforts, which are required across the entire firm and across all asset classes, and will lead to better operational risk analysis.

While the larger sell side firms have already begun identifying the quality issues in the data needed for efficient post-trade processes, some smaller firms appear less aware of the need to implement data management disciplines. These firms should be mindful of the dangers of overlooking the importance of data quality, not only to the accuracy of their risk and regulatory reporting but also to their clients. Failing to address this against a backdrop of so many new regulations coming into force in Europe – clearing and trade reporting under the European Market Infrastructure Regulation (EMIR), transaction reporting under the Markets in Financial Instruments Directive (MiFID) and settlement measures under the CSDR, to name a few – represents a serious operational risk. For these firms, there is no better time to focus on data quality.

July 16, 2015

Brian Collings

Chief Executive Officer, Torstone