BCG study: Data costs & architectural complexity reach a tipping point
Steven Huels
Sr. Director, Cloud Services
Red Hat
Red Hat was pleased to join Starburst in cosponsoring a new market research report, led by the independent consulting firm Boston Consulting Group (BCG), on the state of data and analytics in enterprises. Here at Red Hat, we've worked closely with Starburst to integrate their products and technologies into the product effort I lead, Red Hat OpenShift Data Science, a powerful AI/ML platform for gathering insights from federated data and building intelligent applications.
This six-week study, titled "A New Architecture to Manage Data Costs and Complexity," explored the macro trends shaping the data and analytics market and examined how companies can derive more value from data as architectural complexity and costs continue to grow. The survey results were revealing: the research found that enterprise architectures are stretched to their limits, with more than 50% of data leaders citing architectural complexity as a significant pain point. And while data and analytics remain unquestioned strategic imperatives, as data volumes grow and innovation accelerates across the data stack, the cost of the solutions can become enormous and untenable.
BCG’s research report includes more than a few surprising stats and projections, which suggest that change is coming.
50% of data that companies store is dark data
In 2021, 84 ZB of data was generated, and that figure is expected to keep climbing through 2024, reaching 149 ZB.
Unfortunately, very little of that newly generated data is actually stored: just 6% in 2021, rising to only 7% by 2024.
80-90% of what is stored is unstructured data, and, most alarming, 50% of that data is dark data, meaning companies are not using it to derive actionable insights that drive decisions.
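To put those percentages in perspective, here is a rough, back-of-the-envelope calculation in Python. Chaining the percentages this way, and using the midpoint of the 80-90% range, is my own illustrative assumption, not a figure from the report.

```python
# Rough, illustrative estimate of how much of 2021's data ends up "dark",
# chaining the percentages cited above (assumptions, not report figures).
generated_zb = 84          # data generated in 2021, in zettabytes
stored_share = 0.06        # ~6% of generated data is stored
unstructured_share = 0.85  # midpoint of the 80-90% unstructured range
dark_share = 0.50          # ~50% of that data is dark

stored_zb = generated_zb * stored_share
dark_zb = stored_zb * unstructured_share * dark_share
print(f"Stored: ~{stored_zb:.1f} ZB, of which dark: ~{dark_zb:.1f} ZB")
# -> Stored: ~5.0 ZB, of which dark: ~2.1 ZB
```

Even on these conservative assumptions, a couple of zettabytes of stored data are sitting unused.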
Vendor proliferation is driving stack fragmentation and technological complexity at companies of all sizes
The number of unique data vendors has tripled in the past decade (from about 50 to close to 150 today), driven in large part by massive data stack investments totaling about $245 billion between 2012 and 2021.
This growth brings challenges when choosing a data vendor. For example, some solution providers are transforming into more versatile platform providers, making it unclear exactly which services a vendor offers. At the same time, the buying process has become decentralized, so more people across a company are purchasing products. This solution sprawl has led to tremendous architectural complexity and, ultimately, confusion.
Difficulty in attracting and retaining skilled workers
With the rise in data architecture complexity, BCG expects data-related people costs will double. Companies are going to need skilled workers who can actually extract value from all these solutions they’ve deployed and make sense of all the different platforms on the market.
This is a challenge today, and it isn’t going to get any easier. The difficulty of attracting and retaining skilled data talent was the biggest pain point for experts surveyed. As Pranay Ahlawat, Partner and Associate Director, Enterprise Software & Cloud at BCG conveyed in his recent Datanova session, “Many of these organizations will end up turning to consultants and systems integrators as a result, further compounding costs.”
Total cost of data is projected to grow 13% every year
Complexity and costs are essentially two sides of the same coin, and what really stood out in the study, and signaled that change is coming, was how unsustainable both have become.
In the next five to seven years, BCG expects the total cost of ownership of data to double. The main drivers will be spending on compute resources and people costs.
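The two figures are consistent: at roughly 13% annual growth, costs double in a little under six years. A quick sanity check in Python, where the growth rate is the only input and is taken from BCG's projection:

```python
import math

# Doubling time implied by ~13% annual growth in total data costs.
annual_growth = 0.13
doubling_years = math.log(2) / math.log(1 + annual_growth)
print(f"At {annual_growth:.0%} annual growth, costs double every {doubling_years:.1f} years")
# -> At 13% annual growth, costs double every 5.7 years
```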
The reasoning is straightforward.
Declining cloud storage costs are encouraging companies to store more data. At the same time, data is growing exponentially, the architecture used to manage it is becoming increasingly complex, and the talent required to run everything data-related is becoming harder to find.
The report notes: “As a result, many companies find themselves at a tipping point, at risk of drowning in a deluge of data, overburdened with complexity and costs.”
Nevertheless, the lesson isn't simply about rising costs, or about improvising on existing data architectures to accommodate new use cases and data sources. Rather, these circumstances are encouraging, even demanding, that organizations rethink their data architecture and adopt a more comprehensive solution.
How will organizations respond?
Adopt a different approach to data architectures: the federated paradigm
Fortunately, the BCG study found that 56% of data managers are willing to boost investments and build new architectures. Known as data mesh or data products, these federated architectures can help organizations address data access and data silo challenges and accelerate time to insight, while providing continued access to legacy data stores.
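To make the federated idea concrete, here is a minimal sketch using the open source trino Python client (Trino is the query engine behind Starburst's products). The hostname, credentials, catalogs, and table names are hypothetical placeholders for whatever sources a given deployment actually federates, and a real deployment would typically use TLS and authentication.

```python
import trino

# Connect to a Trino/Starburst coordinator (placeholder host and user).
conn = trino.dbapi.connect(
    host="starburst.example.com",
    port=8080,
    user="analyst",
    catalog="hive",   # default catalog; a query can still reference others
    schema="sales",
)

# A single SQL statement joins data living in two different systems:
# an object-store data lake (hive catalog) and an operational PostgreSQL
# database (postgresql catalog), with no copying or ETL in between.
cur = conn.cursor()
cur.execute("""
    SELECT c.region, SUM(o.amount) AS total_revenue
    FROM hive.sales.orders AS o
    JOIN postgresql.crm.customers AS c
      ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY total_revenue DESC
""")
for region, total_revenue in cur.fetchall():
    print(region, total_revenue)
```

The point isn't the specific query; it's that a federated layer lets data stay where it is while analysts work against one logical interface.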
So, we may be at a tipping point and BCG’s study certainly suggests as much. Architectural complexity and rising costs may very well push this market into an entirely new phase, but we think it’s going to be a really exciting new era.
We’re looking forward to continuing to work closely with Starburst to help solve these data challenges as we approach this tipping point.
BCG Study: A new architecture to manage data costs and complexity
A new market research report, led by independent research firm Boston Consulting Group, explores organizational challenges related to exponential growth in data volumes and rapid innovation across the data stack.