In our first post on the impact of fund data, we argued that clean data not only improves sales efficiency but also reduces the noise that arises when inaccurate data points have to be fixed, delivering greater cost efficiency and, ultimately, lower fund TERs.
But that’s only half the story.
The next consideration is using data for the myriad reporting obligations that asset managers face today. All of the reporting that a fund manager is obligated to perform is dependent on one critical resource: data. And in many respects, the lines between data and reporting are blurring.
When setting up distribution for a fund, a whole series of data and reports has to be sent out, but throughout the fund’s lifecycle there will also be unplanned events to deal with. For example, a manager might decide to market the fund in a new country, join a new distribution platform, or change the portfolio management team. Whatever the change, the corresponding data has to reflect it to ensure consistency of reporting.
As such, having clean data sources can dramatically decrease the effort in producing the reports that depend on that data. Whether it is daily NAV dissemination, monthly investor reports, quarterly dividend reports, semi-annual reports, annual updates to PRIIP KIDs, Annex IV filings, AGM announcements, … there are numerous recurring or ‘planned’ reporting events to adhere to.
Dr. Livingstone, I presume?
Just as water is the giver of life, so data is a core element in supporting a healthy fund. However, one cannot drink dirty water and hope to stay healthy. It needs to be treated so that particles and bacteria are removed before it is safe to drink.
Fund data is much the same. The cleaner it is, the easier the fund manager can operate.
Back in the 19th century, the famous Scottish explorer Dr. David Livingstone set out to find the source of the River Nile. It was an epic journey, fraught with danger. And in many respects, the challenge is no less daunting for asset managers when it comes to finding the true source of their data.
The way to achieve this is to embrace the notion of ‘data ownership’. At KNEIP, this is a central tenet and one of the key tools we use to facilitate the process for asset managers. It is often surprising how few asset managers know where their data comes from. They might respond, ‘It comes from the audit department’, when in fact the audit department is sourcing it from the administrator, who in turn receives it from the transfer agent.
But one thing is certain. It’s only when one gets to the true source of the data that it can be cleansed and used to optimal effect.
The devil is in the detail
Of course, the administrator will often set up the minimum data required to get funds listed, but that is the extent of their remit. Because it’s not a priority, nobody then checks to make sure the data remains accurate over time.
Let’s say a fund buyer is interested in a fund and sees the name of a portfolio manager he likes, only to find out later that the manager actually left the firm six months ago. Or, in making portfolio selections, he prepares a substantial buy at a management fee of 0.9% when for the past three months it has been 1.3%. Worse yet, he finds conflicting values on two different platforms. At best, his confidence in that fund is eroded (‘if they can’t get their data right, how are they going to manage my money?’); at worst, it ends in lost opportunities. These situations happen all too often.
By focusing on detailed processes to streamline operations, we act as an extension of the asset manager’s back office. For example, we use a ticketing system with several major data vendors to improve ongoing data accuracy. We are able to clean the data in their systems when appropriate, ultimately ensuring that the source of data remains ‘pure’ at all times on behalf of the asset manager.
This helps to reduce the noise with data vendors. When they call up the asset manager and say, ‘We’re missing this particular piece of data’, the asset manager rarely has time to follow up properly. They might send a couple of emails to the administrator and then move on to more pressing issues. And that’s when the problems and delays in fixing inaccurate fund data arise. One of the industry’s largest asset management firms had a four-day average for resolving data problems. With our active follow-up and ticketing system, we were able to reduce it to 0.87 days.
Speed to market
When an asset manager wants to distribute a new share class, there is often a huge imperative to get the fund to market as quickly as possible, and in most cases that means getting the fund’s Bloomberg ticker. Over the years, we have worked to demonstrate the reliability of the data we disseminate, notably to Bloomberg, which (rightly) imposes strict validation steps before allowing any data onto its terminals. Together with Bloomberg, we have developed a portal that reduces this time frame to less than 24 hours, and in some cases to less than half a day, enabling our clients to dramatically improve their time to market.
With more regulation coming down the pipe, it is becoming increasingly important to work with clean, consistent data to enhance operational efficiency. All of the world’s great rivers depend on their source tributaries to flourish. Likewise, by tapping into a reservoir of data at its source, fund managers have the potential to thrive and swell their assets. And become tomorrow’s rainmakers.
By Lee Godfrey