7 key takeaways: The future of financial services with distributed data

  • Manveer Singh Sahota, Director of Product Marketing, Starburst


At this year’s Datanova conference, Starburst Co-Founder and CEO Justin Borgman sat down with Jay McCowen from Bank of America, Jos Stoop from Wells Fargo, and Chaitanya Geddam from Accenture for our session, The Future of Financial Services with Distributed Data.

To no one’s surprise, distributed data architectures aren’t going anywhere — something we explored in a prior blog post. Furthermore, the consensus in the talk was that financial services organizations should embrace data and analytics strategies that optimize for the quickest path from data to insight by enabling secure and well-governed access to ready-to-use data, wherever it lives.

This blog explores the seven key takeaways from the fireside chat – yes, there are seven. The takeaways highlight how institutions like Bank of America and Wells Fargo are evolving their data and analytics strategies to adapt to changing customer expectations, combat the increasingly complex web of financial crimes, and increase productivity while managing costs and risks more effectively.

#1 — Data architecture movies — same story, different actors

We’ve seen this centralization movie before with Teradata, Hadoop, cloud data lakes, and cloud data warehouses. There’s a reason the reboots keep getting made: businesses are naturally inclined to bring all their data into one spot and build a massive, centrally managed and governed data store as the universal source of truth.
But as Jos Stoop of Wells Fargo put it, “we’ve been in the industry for a long time, we see this over and over again, and I think we all know multiple of these projects that have failed, they were never complete[d].” It’s a predictable storyline, but we can’t get enough of it for some reason.

“It makes sense to leave the data where it is and provide a single access to that federated data, particularly when the data is in lots of different locations and environments.”

Jos Stoop, Wells Fargo

In the case of global financial services companies, depending on the use case, this can be upwards of tens of thousands of data sources. Getting all this data into a single place can take years and significant resource investment.

If only the movie had a plot where organizations could identify, access, and use all the relevant data where it lives. If only downstream teams like the ones Jos Stoop manages could connect their BI platforms directly and enable their end users with true self-service analytics capabilities.
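That plot, happily, exists today. As a hedged illustration of what federated access can look like, here’s a minimal Python sketch using the open-source `trino` client (the engine underpinning Starburst); the host, user, catalog, schema, and table names are hypothetical placeholders, not a reference to any bank’s environment.

```python
# A minimal sketch of federated access using the open-source `trino`
# Python client (pip install trino). Host, user, catalog, schema, and
# table names below are hypothetical placeholders.
from trino.dbapi import connect

conn = connect(
    host="starburst.example.com",  # hypothetical coordinator endpoint
    port=443,
    user="analyst",
    http_scheme="https",
)
cur = conn.cursor()

# One query joins customer records in an on-prem PostgreSQL database
# with transactions in a cloud data lake: no copies, no pipelines. The
# engine pushes work down to each source and federates the results.
cur.execute("""
    SELECT c.customer_id,
           c.segment,
           SUM(t.amount) AS total_spend
    FROM postgresql.crm.customers AS c
    JOIN datalake.payments.transactions AS t
      ON t.customer_id = c.customer_id
    GROUP BY c.customer_id, c.segment
""")

for row in cur.fetchall():
    print(row)
```

BI platforms connect the same way over JDBC or ODBC, which is what makes the self-service pattern Jos describes practical: end users query live data in place rather than waiting for a copy.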

#2 — Performant analytics scaled to the edge

It’s fair to say that the average banking customer has more analytics capability in their mobile app than the average employee has to do what’s required of their job. However, empowering employees with data-driven intelligence is more realistic with today’s technology than ever before.

“I think we see a real trend in bringing the analytics to the end users rather than pre-rehearsed dashboards where they can only get the answers that people have developed for them,” says Jos Stoop.

Providing employees with the ability to truly self-serve and find answers and insights about everyday operations “will free up a lot of the developers for an analyst for higher value type[s] of dashboards. So this whole self-service trend, I think, is a really important part,” continues Jos.

But it’s not as easy as just granting access. One of the selling points of a centralized data store is that it can also deliver better performance, which is why organizations invest data engineering resources in copying and moving data to a local platform so that dashboards refresh quickly and simply perform. However, a centralized data store is no longer a requirement for high performance. The access layer, the query engine, has to be performant.

“We’re no longer just showing top-level dashboards. There’s data; everyone wants to get down to the individual transaction, the individual record layer,” says Jay McCowen.
This increases the pressure on technical teams, who are trying to remove this low-value work from their plates while enabling end users with real-time insights, so that both access and performance are addressed.

#3 — Shift from centralized data governance to democratized data ownership

Imagine a scenario where data owners and creators across the organization take responsibility for governance, lineage, access, and ultimately productizing the data for broader consumption. For many, this type of thinking raises red flags across the board. However, the need is real, and concepts like data mesh are focused on bringing it to fruition.

“On the technology side, you have your very traditional stack. You have to have your data stores, understand your data domains, and ensure those data domains exist. You have to have your metadata in your data catalog so that data can be discovered.

And then where we’re moving towards is making sure that that data can easily be accessed once it is discovered and having the technology components available for performant data access.

Then with those technology pieces in place, shifting the mindset from enforcing governance to encouraging ownership and encouraging those data owners that now have their data being discovered and have it being used, how do they control that use and how do they enable their end customers of their data to use that data in new and exciting ways?”

Jay McCowen, Bank of America

Shifting to an ownership approach can also influence data quality. “What we’ve seen with data quality is as data gets used more, data quality goes up, and you kind of hit a virtuous cycle, and then better quality data gets used more,” says Jay McCowen. The same applies to ownership and governance. Jay continues, “as data owners take responsibility for their data and present their data to customers, their customers, the data customers, will create better data quality, but they will also encourage the data owners to better govern that data.”
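Neither speaker prescribed a specific implementation, but the ownership shift Jay describes is often made concrete by attaching owner, domain, and quality metadata to each published data set so it can be discovered and trusted. A purely illustrative Python sketch, with hypothetical field and product names:

```python
# Purely illustrative: a lightweight descriptor for a domain-owned data
# product. Field names are hypothetical, not any vendor's schema.
from dataclasses import dataclass, field


@dataclass
class DataProduct:
    name: str            # discoverable name in the data catalog
    domain: str          # business domain that owns the data
    owner: str           # accountable data owner, not a central team
    description: str     # what consumers can expect the data to mean
    source_tables: list[str] = field(default_factory=list)
    freshness_sla_hours: int = 24   # how stale the data may get
    quality_checks: list[str] = field(default_factory=list)


card_payments = DataProduct(
    name="card_payments_daily",
    domain="consumer-banking",
    owner="payments-data-team@example.com",
    description="Daily card transactions, deduplicated and PII-masked.",
    source_tables=["datalake.payments.transactions"],
    quality_checks=["no_null_customer_id", "amount_non_negative"],
)
```

The design point is that accountability lives with the owning domain: the descriptor travels with the data set, so consumers know who stands behind it and what quality they can expect.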

#4 — Reduce the burden of ETL

Extracting, transforming, and loading (ETL) data doesn’t have to be part of every data project. One of the challenges financial services firms face is integrating data from hundreds or thousands of different data stores without moving or copying it.

Jay McCowen of Bank of America goes as far as to say, “[get] rid of those data pipelines and those ELT pipelines in favor of more immediate real-time access.” Data movement is costly, time-consuming, and risky.

“And technology teams have better things to do than move data from system A to system B to make it accessible,” continues Jay McCowen. Data movement can introduce latency, inconsistency, security breaches, and compliance issues. Moreover, it can create further silos that prevent a holistic view of the business and limit innovation.
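To make the contrast concrete, here is one hedged sketch (again with hypothetical names and the same `trino` client as above): rather than a scheduled job that copies data from system A to system B, a federated view exposes the joined data live, so consumers always query current records and nothing is duplicated.

```python
# Hedged sketch: replacing a copy pipeline with a live federated view.
# Catalog, schema, and table names are hypothetical.
from trino.dbapi import connect

conn = connect(host="starburst.example.com", port=443,
               user="data_engineer", http_scheme="https")
cur = conn.cursor()

# Before: a nightly ETL job extracts from system A, lands files, and
# loads them into system B, adding latency, cost, and one more copy to
# secure. After: a view over the sources themselves. Downstream tools
# query the view; nothing is moved or duplicated.
cur.execute("""
    CREATE OR REPLACE VIEW datalake.analytics.customer_spend AS
    SELECT c.customer_id,
           c.segment,
           t.amount,
           t.transaction_ts
    FROM postgresql.crm.customers AS c
    JOIN datalake.payments.transactions AS t
      ON t.customer_id = c.customer_id
""")
```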

#5 — Cost reduction is only part of the modernization equation

The general assumption is that organizations frantically look for cost reductions when the economy is on a downward trend. In reality, there are three factors: cost, market growth, and compliance. As Chaitanya Geddam of Accenture puts it, “these are the three things that customer[s] look at right when they actually adopt new technologies. But [the] cost is one of the shiny objects we all focus on, which is very important because you see the value realization much quicker than the other two.”

But cost isn’t some linear or binary factor. It spans resource costs, hardware, data movement and management, and opportunity cost. So when making cloud modernization investments, you can’t look at cost alone. Jos believes “the cloud journey is not necessarily about cost savings; it’s more about the benefits and growth it provides.”

Market growth, the second factor, matters whether times are good or bad. So how do modernization investments contribute to market growth during uncertain times? According to McKinsey’s 2022 Global Annual Banking Review, since the last financial crisis, the gap between the banking industry’s leaders and followers (as measured by total returns to shareholders) has widened to 5x between leaders and the bottom decile, and 3x between leaders and the average financial institution. Even more noteworthy, 60% of the performance gap was realized within the first two years of the recovery. As the current economic situation unfolds, if history is any indicator, leaders may again emerge based on the market growth-focused investments they make now.

Lastly, there’s risk, an evolving landscape in its own right. Factors like regulatory reporting, financial crimes, liquidity concerns, data governance and privacy laws, and the increasing focus on environmental, social, and governance (ESG) regulations, specifically in the European Union, create multiple avenues for increased exposure through data movement and duplication.

So ultimately, as organizations look at their modernization efforts, “cost is certainly a factor, but also what you get, the benefits out of the cloud [are] immense as compared to some of the cost,” says Chaitanya Geddam.

#6 — It’s not just financial risk; it’s also about operational risk

From a risk perspective, one of the things financial institutions focus on beyond financial risk is using data to quantify operational and regulatory risks. For example, Jay McCowen shared that his team is finding ways to dig into [their] processes and the controls on those processes, extract that data, and test it to validate whether or not those processes are under control. By doing so, they can also use that quantified proof to meet their regulatory requirements, helping them address both regulatory and operational risks.

#7 — Culture eats any well-thought-out data strategy

Yes, this is where change management comes in. While modernizing the technology infrastructure and enabling all the cloud-powered self-serve capabilities, organizations can’t ignore people and processes. These two components are critical to successfully building a data-first culture. As Chaitanya Geddam sees it from his experience working with financial firms of all sizes, “data is in the boardroom. And change management while you’re building all this cannot be a day plus 90 days of activity; it has to be a day plus one activity. And what I see in the industry is not enabling that on day one has been a bigger challenge while you’re building this foundational aspect of it.”

Data products can be a component of successful change management, serving as a mechanism to “bring in the trust of the data across different lines of businesses, even though it is federated,” continues Chaitanya.

Creating an environment that builds, leverages, and thrives off data products – productized data sets that provide a single version of the truth – becomes an integral part of the culture equation. This is not to say that data products or data as a product will solve all the challenges of building a data-first culture, but the practice can make it sticky.

Bringing it home — delivering tangible business impact from distributed data

So where do we go from here? For starters, there is nothing wrong with a centralization strategy built around a cloud data lake. It only becomes an issue if the opportunity cost is massive because the data can’t be fully used until the migration is done. Whether it’s your primary strategy or a supplemental one on your journey to centralization, a functioning decentralized data strategy is possible.

Starburst helps leading financial institutions, including 4 of the top 6 North American banks, accelerate time to insight and deliver tangible business impact from their distributed data. Through use cases such as performant next-best offers, complete client 360, regulatory reporting, and anti-money laundering, financial firms can reduce costs more effectively, improve performance, mitigate risks, and foster innovation.

Recap of 7 takeaways — How leading banks are evolving their data and analytics strategies

  1. The push to a ‘new’ centralized data storage platform isn’t new; vendors have been pushing this for over 30 years – the same movie script with different actors and some cool new CGI.
  2. There is an increasing focus on empowering front lines and end users with advanced analytics capabilities at the edge.
  3. Embrace concepts like data mesh to decentralize data ownership to the data owners.
  4. Focus on improving data team productivity by reducing unnecessary data ETL.
  5. Cost reduction is only part of the modernization equation; market growth and compliance considerations matter just as much.
  6. Besides financial risk, financial firms must also focus on using data to quantify operational and regulatory risks.
  7. While modernizing the technology infrastructure and enabling cloud-powered self-serve capabilities, organizations can’t ignore people and processes.

To learn more, visit Starburst for Financial Services and watch the entire fireside chat with Jay McCowen from Bank of America, Jos Stoop from Wells Fargo, and Chaitanya Geddam from Accenture.
