This article is based on an in-depth conversation between Ryan Cuthbertson, global head, custody and funds products, financing and securities services at Standard Chartered, and Ashley Payn, director, data management, analytics and innovation at Royal Bank of Canada.
TRADITIONALLY bound by legacy technology infrastructure and unstandardized datasets, custodians have pursued a variety of solutions to improve operational efficiency within their organizations and add value for their clients.
One of the goals is to reduce the likelihood of a failed trade as a result of intermediation lag among parties. “The challenge around transferring data across multiple parties is the replication and processing that is required to get from one part of the world to another to make a trade or investment successful,” explains Cuthbertson.
In addition, a lack of coordination among parties along the value chain could lead to post-trade inefficiencies, points out Payn. “Practically speaking, we have a lot of parties with different systems and architecture all trying to connect together,” Payn adds. “That’s not so easy. We also have an industry that still lacks standards. If you work in the data world, that’s quite a challenge. Along with that, you also have a variety of parties, all of whom have different levels of digital maturity. We, for example, would prefer APIs [application programming interfaces] as a method of data transmission.”
Standard Chartered has focused on the concept of data co-creation, or more effective sharing of information between parties across the value chain, to speed up post-trade processing. The bank, for example, recently worked closely with a South African client on an integration exercise to enrich the data exchanged by adding six additional data points. As a result, trade-related client queries were reduced by 80%.
“It’s really about how we make sure that our clients and clients’ clients get the right information at the right time to enhance the overall experience, reduce risks and reduce costs,” Cuthbertson states. “We work closely with our custodian partners to understand what they are doing with the information we provide to them. Once we have a good picture of how they are replaying that information to their respective clients, we can consider how additional data – above and beyond what’s prescribed by market infrastructures – can start to drive value and enhance the client experience.”
It’s a similar approach for Payn, who highlights that asset managers will work with custodians whose value-added services help them save time and money, reduce risk and generate more profit.
While effective data usage between parties can have its advantages, there are a number of questions that need to be addressed, such as data ownership. “Companies are trying to monetize their data assets right now,” Payn explains. “When data is co-created, we then need to establish what exactly the data ownership model is and whether it is licensed or public domain. Along with ownership, you have the practical considerations of roles and responsibilities. If data is co-owned and there are errors in that data, who fixes it?”
Nevertheless, the key to implementing a data co-creation scheme is having a calibrated and measurable outcome. “There is an element of corporate inertia that can exist when two banks are trying to do this,” Cuthbertson shares. “Pragmatism is an absolutely essential element in co-creation. You can start with something small just to prove you can work together. And once you’ve proven success, the ability to ask for more investment becomes easier for larger-scale projects.”
This mindset is echoed by Payn, who believes that proof points are needed to justify data co-creation. He notes: “We like to think big, start small and learn fast, understanding through measurable points how we can reduce latency and failed trades.”
Going forward, the importance of close conversation and collaboration in post-trade activities will be even more critical as institutions strive for better operational services.