Sector spotlight: Data in financial services
Data can be immensely powerful. To discuss its potential, we invited experts from the Inflexion Network to speak with the portfolio in an informal, off-the-record discussion about how businesses can make the most of the intelligence they hold. We caught up with the speakers afterwards to hear their insights on how data can improve businesses.
Banks’ longstanding failure to harness data has benefitted fintechs. Now the two may shift from competition to cooperation, with consumers set to benefit from more personalised service. That is the thinking of Mark Greene, a senior financial technology executive specialising in big data and predictive analytics and newly appointed Chair of FBX Novantas.
“Although the banking industry has inherently digital products which should be easily adaptable to a consumer-centric model, banks have not been capturing information about their customers’ preferences and transactions as fully as they might. Other industries, such as retail and travel, have been better at developing personalised offerings,” according to Mark, a serial Chair in the financial services space.
“Analytics can actually work really well in financial services, even better than in retailing, and next-generation fintech players are setting up to do this. Banks can do this, but fintechs are doing it faster,” he says.
Banks may have strong market share and big brand names, but fintechs have been more agile and quicker to embrace state-of-the-art analytics. As a result, established banks are investing in or acquiring these players, with Visa’s interest in Plaid and ultimate purchase of Tink last month testament to the allure of innovation for longstanding brands.
Joining those you can't beat
“Initially banks felt some fintechs were disintermediating the industry, but many seem to be accepting the approach of ‘If I can’t beat them, I’ll join them’. They’re co-opting the enemy, realising fintechs bring something to the table,” Mark explains.
Mark explains that by grasping the opportunities banks have missed, the new generation of fintech players often know more about customers than their banks do, which enables fintechs to offer better, more personalised service. By bringing this capability in-house, through partnership or acquisition, banks can buy in the innovation they have been slow to develop organically.
He is confident banks will start to embrace a more customer-centric model.
Machine learning will be a big part of this. Says Mark: “Analytics is good at using past behaviour to suggest future behaviour, but it is not smart the way AI is, and I think we’ll see banks invest more in systems which are self-taught. The notion of improving as you go will help banks learn how to treat customers.”
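The “improving as you go” idea Mark describes can be sketched as an online model that refines its estimate with every new observation, rather than being trained once. A minimal illustration — the scenario, data and learning rate below are entirely hypothetical:

```python
# Online-learning sketch: a running estimate that improves with
# every new observation instead of being trained once up front.

def update_estimate(current: float, observation: float, learning_rate: float = 0.1) -> float:
    """Nudge the current estimate toward each new observation."""
    return current + learning_rate * (observation - current)

# Hypothetical stream: 1 = customer accepted an offer, 0 = declined.
observations = [0, 0, 1, 1, 1, 1, 0, 1, 1, 1]

estimate = 0.5  # start with no knowledge either way
for obs in observations:
    estimate = update_estimate(estimate, obs)

# After ten observations the estimate has drifted above 0.5:
# the system has "learned" that this customer usually accepts.
print(round(estimate, 3))
```

Real systems use far richer models, but the loop is the point: each interaction updates the model, so service improves the longer the customer is observed.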
But whereas automation may sound the death knell for human roles in many areas, here humans and machines should co-exist in harmony, according to Mark. “I don’t see machine learning or bots replacing humans, as banking is a relationship service. You want to know a banker is there, you just want them to be more efficient. AI-assisted doctors is where healthcare is going; AI-assisted bankers is where banking is going. It should benefit customers and providers alike.”
Data offers infinite ways to improve businesses – but collecting and harnessing it correctly is key to maximising its benefits, according to Robert Jeanbart, whose 30-plus-year career has focused on international management and financial information. He now chairs InFront, a European provider of financial market data and software solutions which was delisted from the Oslo Exchange following a buyout by Inflexion.
For Robert Jeanbart, data is a raw material which each user can process to produce Information. He speaks from decades of experience in the space. “I dealt with financial data from every single angle. From acquisition to distribution, then to software applications, serving Front, Middle and Back Office. The challenge has always been to ensure data quality so that the resulting information is reliable.”
Data’s myriad end applications and use cases appeal to Robert’s personality. “I don’t like limits in life, and data offers infinite ways of producing Information. Just as an example, data derived from trading can fuel charting software, risk management software, historical databases, artificial intelligence predictive software, and so on.”
His affinity for data may be down to spending 15 years at Reuters, a time which left its mark: he departed to set up his own financial data company. Or it may be in his genes: his son is a data scientist who helps companies optimise their pricing. “They base it on the time of day, day of the week, week of the year, end use cases – you name it and they consider it for pricing.”
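The kind of feature-based pricing described above can be sketched in a few lines. The thresholds and multipliers here are illustrative assumptions, not any firm’s actual model:

```python
from datetime import datetime

def price_multiplier(when: datetime) -> float:
    """Adjust a base price using simple time features:
    time of day, day of week, and week of year."""
    multiplier = 1.0
    if 17 <= when.hour < 20:                 # evening peak demand
        multiplier *= 1.15
    if when.weekday() >= 5:                  # weekend
        multiplier *= 1.10
    if when.isocalendar().week in (51, 52):  # holiday season
        multiplier *= 1.20
    return round(multiplier, 4)

base_price = 100.0
quote_time = datetime(2021, 12, 24, 18, 30)  # evening in a holiday week
print(base_price * price_multiplier(quote_time))
```

A production system would learn these adjustments from data rather than hard-code them, but the feature set – hour, weekday, week of year – is exactly the kind Robert’s son lists.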
Devil in the detail
This goes some way to explaining the appeal of data to profit-hungry businesses, a hunger which is driving many to invest in it – but the methods matter as much as the enthusiasm.
“The challenge is to make sure the data we source is adequate and right. Collecting data is mundane stuff, but if you do it right, the quality of what you can do with it is invaluable. Data integrity, quality and timeliness, prevailing over speed, are key to making it usable. This is the great opportunity for people and firms right now.”
Most companies will already have taken the first step: digitising their data so it is computer-readable. Thereafter lies the tricky bit: normalising, or standardising, it to be consistent and therefore efficient to process. To illustrate the importance of this, he compares it to customs at airports: before passports were normalised internationally, customs agents had to flip through pages looking for the relevant information, as each country’s passport was slightly different. Once biometric passports were rolled out, consistency of presentation meant the process could become more efficient.
Companies need their data normalised to this level of detail – easy to read, interpret and process. In the financial services industry this is a big challenge. He stresses: “Most will have hundreds if not thousands of people normalising their extant data to make it readable and processable. They will say it’s their biggest challenge now.”
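The passport analogy translates directly into code: records arriving in different shapes must be mapped onto one consistent schema before anything downstream can process them efficiently. A minimal sketch, with entirely hypothetical vendor formats and field names:

```python
# Normalisation sketch: two vendors report the same trade in different
# shapes; a per-vendor mapping converts both into one common schema,
# after which downstream code needs no special cases.

RAW_RECORDS = [
    {"vendor": "A", "Sym": "ACME", "Px": "101.50", "Qty": "200"},
    {"vendor": "B", "ticker": "ACME", "price_pence": 10150, "size": 200},
]

def normalise(record: dict) -> dict:
    """Map a vendor-specific record onto the common schema."""
    if record["vendor"] == "A":
        return {"symbol": record["Sym"],
                "price": float(record["Px"]),
                "quantity": int(record["Qty"])}
    if record["vendor"] == "B":
        return {"symbol": record["ticker"],
                "price": record["price_pence"] / 100,  # pence -> units
                "quantity": record["size"]}
    raise ValueError(f"unknown vendor: {record['vendor']}")

clean = [normalise(r) for r in RAW_RECORDS]
# Both records now have identical structure and units,
# so any consumer can process them uniformly.
print(clean)
```

The hard part in practice is not the mapping itself but writing and maintaining one for every source and catching the records that fit none – which is where those hundreds of people come in.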