What’s the big deal with data mesh at BigData LDN?
Is taking a ‘decentralised’ approach to data the perfect excuse for disorganised data laggards to latch onto? Or is it the solution to deriving value from data quickly and at scale?
At a bustling BigData LDN this year, hosted at London’s Olympia, the hot topic that straddled many of the event’s twelve conference theatres was data mesh.
“Have you worked out what it is yet?” one delegate quizzed me before a panel session on what businesses need to know about this latest data buzzword.
Not wishing to appear ill-informed, I regurgitated the definition provided by the concept’s founder, Zhamak Dehghani, who coined the term in 2019 while working as a principal consultant at US tech consultancy Thoughtworks.
The ‘mother of mesh’ shared her definition with us during the conference’s keynote opener, two hours earlier.
“Data mesh is a decentralised sociotechnical approach to share, access and manage analytical data in complex and large-scale environments – within or across organisations.”
But what does this mean exactly? And taking a decentralised approach to data does seem disruptive when many firms have splashed out thousands on data lakes and data warehouses – it goes against the grain of how many data practitioners operate.
Indeed, the delegate I spoke with responded to my quoted definition cynically: “Isn’t that just the perfect excuse for organisations that still have most of their data siloed? Now they can just say that it’s a data mesh.”
And he had a point. Doesn’t a lot of data, especially if it’s not needed immediately, sit quite well in a centralised storage system?
Yet over the course of the day at BigData LDN, I learned that Dehghani’s concept could well be the solution to some of the fundamental problems that enterprises face as they become more data-driven.
The way data is managed now often requires moving it to a central location to be prepared for analytics. Not only does this slow data consumers’ ability to extract the value they need from the data quickly, but it also creates a host of inefficient activities for IT, such as building and managing data pipelines, managing data copies and layering it all with the appropriate governance.
Dehghani argues that this has led to an architectural Tower of Babel.
“That big, large, tall tower that tried to reach for heaven…but the problem was they didn’t all speak the same language which resulted in this confused architecture,” she said.
As a result, the data evangelist argued, existing approaches to managing data and getting value from it are plateauing – a plateau she measures in terms of the ratio of value to cost.
“The fact is that universal investment in data and AI is not reaping rewards,” Dehghani said during her keynote.
She quoted one survey which found that 62% of the top 600 companies are spending more than $50m on data/AI technologies but only 24% of them are seeing the true benefits.
“We’ve spent quite a lot of time and money optimising our data; addressing the diversity of data… but what we haven’t addressed is the scale of complexity of data in our organisations,” she told BigData LDN delegates.
Dehghani argued that it was time to implement a more agile, forward-thinking concept that allows companies to take better advantage of decentralised data: the data mesh.
This approach to analytics goes beyond the centralised data lake and data warehouse models, focusing instead on a distributed model of architecture and multi-plane data infrastructure to get the best out of data teams.
One of the key components in this, she added, was putting the domain experts (the business) in an ownership role over their data products.
“It’s about taking a decentralised, domain-orientated ownership approach to the data and architecture so that cross-functional teams – the teams that are building the digital systems, the teams that are delivering business outcomes – are responsible also for sharing data for analytical use,” she explained.
Splitting data ownership across teams, she added, required a change in mindset – one that viewed data “not as an asset to be collected but as a product to share.”
“We need to make it feasible for domain teams to work with data products – to work on them and to share them – by elevating what we call data infrastructure,” she continued.
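To make the ‘data as a product’ idea more concrete, here is a minimal sketch – in Python, with entirely hypothetical names and fields of my own, not anything from Dehghani’s specification – of the kind of descriptor a domain team might publish to a mesh catalogue, capturing domain ownership, defined output ports and a team-owned quality guarantee:

```python
from dataclasses import dataclass, field

@dataclass
class OutputPort:
    """An interface through which consumers read the data product."""
    name: str      # e.g. "daily_parquet"
    format: str    # access/serialisation format
    location: str  # address consumers resolve (a table, bucket path, etc.)

@dataclass
class DataProduct:
    """Hypothetical descriptor a domain team publishes to a mesh catalogue."""
    name: str                  # product identifier
    domain: str                # the business domain that owns the product
    owner: str                 # the cross-functional team accountable for it
    description: str
    output_ports: list = field(default_factory=list)
    freshness_slo_minutes: int = 60  # the owning team's own quality guarantee

# A fictional playlists domain shares listener engagement as a product,
# rather than handing raw tables to a central pipeline team to clean up.
playlist_engagement = DataProduct(
    name="playlist-engagement",
    domain="playlists",
    owner="playlist-experience-team",
    description="Aggregated listener engagement per playlist, for analytical use.",
    output_ports=[OutputPort("daily_parquet", "parquet",
                             "s3://mesh/playlists/engagement/")],
    freshness_slo_minutes=24 * 60,
)
```

The point of the sketch is the ownership model rather than the code: the domain team, not a central data function, decides what the product contains and what guarantees it carries.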
Having clearly defined business goals and developing cross-functional teams that are mission-orientated is the ideal, said Dehghani.
As an example, she cited a playlist team at Spotify that wanted to share hyper-specialised playlists for their audiences.
“They have some data to share, they have some applications also, but their key objective – no matter if they are working with data or applications – is to create this immersive, context-aware experience for their listeners,” she explained.
“And, if they need to make their playlist ‘smarter’ for a partner platform such as Peloton or some yoga studio, then they can work peer-to-peer with the partnership team, whose desired outcome is this integrated experience – no matter where the listeners are. So, each team is incentivised to provide data to improve the playlist experience.”
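Continuing the hypothetical sketch above, the peer-to-peer arrangement Dehghani describes would amount to the partner-facing team consuming the playlist product through its published port, rather than requesting an extract from a central warehouse team:

```python
# Continuing the hypothetical sketch: a partnerships team discovers the
# playlists domain's product and reads it via its published output port,
# peer to peer, with no central pipeline team brokering the exchange.
port = playlist_engagement.output_ports[0]
print(f"Consuming '{playlist_engagement.name}' as {port.format} from {port.location}")
# A real consumer would load the data here, e.g. with pyarrow or pandas.
```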
Pockets of industry are already experimenting with data mesh. According to Dehghani, brands warming to the concept include L’Oréal, Hello Fresh, Zalando and JP Morgan.
During this week’s data event two British institutions – the BBC and the RAF – also gave presentations on data mesh.
Public broadcaster BBC is currently examining how it can use the approach to help its stakeholders and its viewers retrieve data and make decisions more effectively.
According to the BBC’s director of product data, Jules Marshall, big, legacy organisations should focus on incremental implementations rather than a complete pivot.
During her presentation, Marshall – who has a history of building out data platforms for brands such as Just Eat and Nestle – said data mesh was a “big cultural shift” and a concept that needed “to be introduced over time.”
“The BBC is a hundred-year-old organisation and we’re very early on with our data mesh journey,” said Marshall.
“We’re approaching it by looking at our end users and examining the domain teams and their problems, and starting from that point before we even think about the technology – and then we’ll reverse-engineer this into how mesh can solve these problems,” she revealed.
A measure of success with data mesh, she added, would be the ease with which it enables people within the business to do their jobs.
“This includes reducing lead time from discovery to consumption of data – whether that’s iPlayer’s personalised recommendations or the news department, where editorial relies on data to make real-time decisions,” she explained.
The Royal Air Force is also undergoing a digital transformation programme – known internally as Project Wyvern – which is being led by data and digital consultant George Carter.
“The RAF is a big and complex organisation and data has never been strategically coordinated – although there are pockets of excellence,” noted Carter, who spoke in BigData LDN’s Data Mesh Theatre (and yes, the concept has now been given its own track).
According to Carter, one of the major problems with handing over responsibility for the data to business domains within the RAF is that they’re starting from a very low skills base, with many data operators joining the forces as generalist comms and technical workers.
“Given this situation, we’re creating data teams that support the functions of our organisation – but we need to equip them with the skills they need first. So, there’s this trade-off between the level of empowerment with the data that we give these ‘domain teams’ and the level of understanding they have,” he explained.
Carter revealed: “So the plan is to introduce the data mesh concept in incremental stages. First, we’ll allow them to choose their own products; then we’ll let them find and recruit their own teams.
“After this we will hand over the reins so that they are responsible for establishing their own buy-in and support from management and then finally, when we think they are ready, they will be able to choose their own tech platform,” Carter added.
Given that some big-name brands are already trialling data mesh, are we about to witness the fall of centralised data lakes and warehouses?
Not according to the mother of mesh, who stresses that the concept is still in its infancy. Ironically, one of the problems, according to Dehghani, is that its main flag wavers are the data analysts.
“That’s perhaps natural – but what it means is that it’s being owned and driven and budgeted by the chief data officers while the digital service teams are still fairly oblivious in terms of sharing and gaining ownership of the data,” she noted.
To hand data product ownership over to domain teams, she stresses, organisations need to restructure their teams in a way that will then shape the architecture.
In terms of the technology, the data guru added that there were still plenty of gaps, despite the best efforts of a handful of software and cloud vendors – including AWS, Google, Snowflake and Starburst – to make their products more compatible.
“Right now, it’s possible to have data mesh on a single cloud vendor but as soon as you move to on-prem or to hybrid it ends. Likewise, data mesh can operate on a single vendor platform, but you can’t share data cross-platform or hybrid – that’s a point of friction with some clients – they’ve started data mesh on one platform, but they want to take it across to others…” said Dehghani.
Asked to give his prediction on where data mesh will be in 2025, Starburst chairman and CEO Justin Borgman said that it would likely be a more mainstream concept by then: “One sign of that will be that traditional data warehouse vendors – the antithesis of data mesh – will start talking about data mesh”.
Borgman also agreed that other companies, which are simply disorganised, will falsely claim to have adopted the data mesh concept and jump on the decentralised bandwagon.
The delegate who made this observation earlier appeared vindicated by the speaker’s comment.
But in the long term, the true winners of data mesh are likely to be those who manage to solve their data access issues by establishing business use cases early, creating the right data products, securing buy-in from both management and the individual domain teams – and then shaping their architecture accordingly.