Designing for agile analytics

  • Adrian Estala

    VP, Field Chief Data Officer

    Starburst

  • Brett Thebault

    Co-Founder

    Aginic

Data Mesh TV was live from Australia! I sat down with Brett Thebault, Co-Founder at Aginic and one of the leaders at the Mantel Group, at their headquarters to discuss his thoughts on data mesh transformations. Aginic was founded on the principles of agility and analytics, and he offered insightful advice on how organizations should tackle their digital transformation programs. As Brett said, Aginic brings modern technology and innovation to bear on the customer's business problems.

As always, it was a fun episode: Brett and I discussed data vision and avoiding data cathedrals, and we even had a good old-fashioned boot competition. Listen until the end to see whether the best boots are made in Texas (Lucchese) or Australia (R.M. Williams). You can watch the episode here, and we summarize some of the highlights below.

Enablement is key for a successful transformation

A good enterprise data transformation program includes an executive vision and a roadmap for developing the future data stack. But a clear vision and even the most innovative data stack will fail without effective enablement. Unless you make a strong investment in people, the enterprise will struggle to adopt new ways of working and the transformation program will fall short of its intended benefits.

Brett noted that a transformation exercise should begin with a careful assessment of the current data team, and that the roadmap should give equal attention to building people skills and technology. Start by building a strong internal team, then empower them with the right tools and processes. Align your data analytics ambitions with your resource capabilities.

In a data mesh, we want to create autonomous teams that can leverage technology and domain expertise to accelerate the delivery of data insights. If we can't enable these domains to work on their own, we will just build an even greater dependency on the already overwhelmed data engineering team. We are well past the point when anyone can say that data analytics can only be done by data scientists. Business teams can learn to manage their own BI tools and data products, which frees up the data scientists to focus on more advanced analytics opportunities.

Optimize your migration approach

We have learned a lot in the past ten years about how to optimize cloud transformation programs. In the early days, cloud migration programs were focused on moving applications and their infrastructure stack. As our cloud migration strategies matured, we got smarter about selecting the right workloads and redesigning applications to take full advantage of cloud architectures. These experiences taught us how to optimize our migration programs for cost, performance, reliability, and security.

Today, the focus is on big data migrations, and we need to apply new optimization principles to be successful. Cost, performance, reliability, and security are still key parts of the business case, but the most important metric now is accessibility. Brett and I discussed a few ideas that organizations should consider:

  • You cannot build a data cathedral to centralize all of your data. There are incredible benefits to leveraging cloud-based lake capabilities, and we can do fast analytics on a lake. But many organizations have legacy data environments that can't be migrated: data that needs to stay on-premise. Hybrid, decentralized architectures are the reality that organizations are beginning to embrace.
  • You can federate between cloud and on-premise data. Integrating data across different cloud environments or on-premise data sources is effective and performant. You can operate a successful data analytics capability in a decentralized environment. 
  • Data products can help minimize business disruption during a migration. Migration should take a balanced prioritization approach based on technical and business rationale. Use a data federation approach to create a virtual layer between the source systems and the consumer. This layer will enable your consumers to continue to access and integrate the data that they need, while abstracting them from the data movement on the back end (see the sketch after this list).
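
To make the virtual-layer idea concrete, here is a minimal sketch using the trino Python client against a Starburst (Trino) cluster. The host, catalog names ("lake" for cloud object storage, "onprem" for a legacy relational source), schemas, and tables are all hypothetical placeholders; the point is that consumers query a view while the underlying data is free to move behind it.

```python
# Minimal sketch: a federated virtual layer via the trino Python client.
# Host, catalogs ("lake", "onprem"), schemas, and tables are hypothetical.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com",  # assumed coordinator endpoint
    port=8080,
    user="analyst",
    catalog="lake",
    schema="sales",
)
cur = conn.cursor()

# The virtual layer: one view joining cloud-lake data with an on-premise
# source. Consumers only ever see this view.
cur.execute("""
    CREATE OR REPLACE VIEW lake.sales.orders_enriched AS
    SELECT o.order_id, o.order_ts, o.amount, c.customer_name, c.region
    FROM lake.sales.orders o
    JOIN onprem.crm.customers c ON o.customer_id = c.customer_id
""")

# Consumers keep querying the same view; if the on-premise table migrates
# to the lake later, only the view definition changes, not consumer queries.
cur.execute(
    "SELECT region, sum(amount) FROM lake.sales.orders_enriched GROUP BY region"
)
for region, total in cur.fetchall():
    print(region, total)
```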

Don’t chase the perfect data model

Building the perfect data model is not realistic. The consumer is always asking new questions, and there is an infinite pipeline of new data entering your ecosystem. We need to embrace the pace of change. Agility can help your business thrive with data analytics. We want to encourage curiosity, and we want our teams to become active self-service data citizens.

Lake architectures drive an incredible amount of flexibility and cost efficiency. You can build agile data products that help your business teams access every corner of that lake. With a data federation approach, your data products can integrate data that sits in other lakes, warehouses, cloud environments, or on-prem. Consumers will learn to reuse existing data products, share them, and rapidly ideate on new products or on the evolution of existing ones.
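
As a rough illustration of how such products compose, the sketch below builds a new product on top of the orders_enriched view from the earlier example, joining in reference data from a separate warehouse catalog. Again, every catalog, schema, and table name is a hypothetical placeholder, not a prescribed layout.

```python
# Sketch: composing a new data product from an existing one. All names are
# hypothetical; "warehouse" stands in for a separate warehouse catalog.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com", port=8080, user="analyst",
    catalog="lake", schema="sales",
)
cur = conn.cursor()

# Reuse the shared orders_enriched product and enrich it with reference data
# that lives in another system, exposed through federation.
cur.execute("""
    CREATE OR REPLACE VIEW lake.sales.regional_revenue AS
    SELECT e.region, t.fiscal_quarter, sum(e.amount) AS revenue
    FROM lake.sales.orders_enriched e
    JOIN warehouse.finance.calendar t ON date(e.order_ts) = t.calendar_date
    GROUP BY e.region, t.fiscal_quarter
""")
```

Because the new view only references other views and federated catalogs, the team that owns it never needs to copy data or wait on a central pipeline; evolving the product is a matter of changing one definition.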