Meet the UK start-up that wants to make affordable digital twins for SMEs
Last year Slingshot Simulations, a start-up spun out of the University of Leeds, won a grant to build a digital twin across all 12,000 square miles of Yorkshire – which has been hailed as one of the world’s largest digital twin projects to date.
The UK government project, managed through the Cabinet Office’s Geospatial Commission, was a collaboration between public and private industry, with partners including engineering and design company Arup, the BT Group and the city councils of Leeds, York, and Hull.
According to Slingshot’s CEO David McKee – who sits on the board of the Digital Twin Consortium, a global trade body – the “short but high-paced project” was aimed at transforming how cities and countries make decisions.
From finding the best building layout to reduce Covid-19 infections to working out the most cost-effective route for goods from warehouse to delivery, McKee believes that digital twins have “a huge potential to improve lives, cut costs, improve public health, identify risks and reduce accidents”.
McKee also saw the project as an opportunity to extend the technology beyond its use in large corporations – making digital twins accessible to organisations of all sizes.
TechInformed caught up with McKee earlier this year at the IoT Solutions World Congress (IOTSWC) in Barcelona to discuss lessons learned in Yorkshire, realistic business models for SMEs and how digital twins are being used to help enterprises hit their ESG goals.
Why aren’t digital twins more widely used?
The concepts have been around for decades but there are three reasons why they don’t work for everyday businesses: cost, scale and complexity. Companies without billion-dollar budgets can’t invest the hundreds of thousands of pounds needed to hire skilled experts, or pay the $20K licences to access the software.
In terms of scale and complexity, if you want a digital twin of Leeds or Singapore you need some serious computational power and storage to be able to process that and the expertise to deal with that from a data science perspective.
And if we are modelling a city, is it the buildings, the utility networks or the transport you are focusing on? Traditional tools take months to model, run and analyse, which is not practical for small businesses.
Did any other firms participate in the Yorkshire project and what sort of problems did you help solve?
Firms joined at different stages – [data interoperability firm] Iotics and SatSense, a data company which specialises in analysing ground movements, came on board during the project.
In terms of applications, one enterprise participant was the civil engineering firm Mott MacDonald Bentley (MMB), which wanted to carry out simulations on its Leeds office to understand occupancy levels in line with government Covid-19 guidelines.
They wanted a way to make sense of that data – to turn the 2D information into a 3D model – which allowed us to carry out simulations. That way they could map the impact a return-to-work scenario would have in the event of a fire drill, for example [given the social-distancing requirements that were in place in 2021], at 50% capacity, at 70% and at 100%.
We also helped the council with planning and lowering pollution levels. Leeds is home to the third most polluted road in the UK. I drive through it every day on my way to work. And it’s not nice. It’s right at the bottom of a hill so the pollution sinks, and it’s right underneath a train station, and it’s got four lanes of traffic queuing in it.
The local authority wanted to look at how they could manage traffic, where the flow is and where they could put cameras – so that was all about building a smart city platform to work out where we needed to collect data from; how we use a digital twin to understand what the current status is, but also to understand the dark spots and the gaps. It’s a recursive process as you go through the journey of building the digital twin system. It can take years.
Towards the end of the project, a major Yorkshire-based food manufacturer also joined and now we’re following up on another project for them. They want to use the output of the public data from the Yorkshire project to feed into their commercial operations.
Are enterprises permitted to use this type of data?
Yes, it’s open data, accessible to anyone. We operate on two tiers of data and we tend to use the term ‘community data’ rather than ‘open data’ because the latter scares some of our private customers. Community data can be from local authorities or from satellites. As a company, we specialise in using geospatial analytics which involves collecting, combining and visualising various types of geospatial data: GPS, location sensors, social media, mobile devices and satellite imagery.
We believe that better and faster decisions come from knitting together location data with the universe of proprietary and public sources that track millions of entities.
This information can come from Covid-19 cases, household incomes, shipping routes and firm-specific data — to be harnessed in a way that reveals new insights. The MMB project was quite a nice example of how we used a mix of open and proprietary data and built a different prototype to inform safeguarding.
What was the most challenging aspect of the Yorkshire project?
The biggest initial challenge was the data, and access to the data. There are lots of vendors using different platforms and none of them are interoperable.
We found that, because of the way the procurement system works, the vendor who comes with the best price for the next generation of sensors wins. But it meant that we had four different versions of ANPR [surveillance] cameras from different vendors, with different data formats, so we ended up having to deal with four different APIs.
When we think about digital twins, and particularly the reference architecture work that we’re doing with the Digital Twin Consortium, [to create templates for enterprise] it’s all about interoperability. That should be interoperability over the web – preferably using RESTful APIs. We don’t necessarily care about the content. We can figure out how to deal with content, but we do all need to be talking in the same language.
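The adapter pattern McKee alludes to can be sketched in a few lines: each vendor’s REST payload is translated into one shared record type, so downstream twin components never see vendor-specific fields. This is a minimal illustration only – the vendor names, payload shapes and field names below are hypothetical, not taken from any real ANPR API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PlateReading:
    """Unified record every camera feed is normalised into (illustrative schema)."""
    camera_id: str
    plate: str          # stored without spaces for consistent matching
    observed_at: datetime

def from_vendor_a(payload: dict) -> PlateReading:
    # Hypothetical vendor A payload: {"cam": "A1", "reg": "YX12 ABC", "ts": 1718000000}
    return PlateReading(
        camera_id=payload["cam"],
        plate=payload["reg"].replace(" ", ""),
        observed_at=datetime.fromtimestamp(payload["ts"], tz=timezone.utc),
    )

def from_vendor_b(payload: dict) -> PlateReading:
    # Hypothetical vendor B payload uses ISO-8601 timestamps and camelCase keys.
    return PlateReading(
        camera_id=payload["cameraId"],
        plate=payload["plateText"].replace(" ", ""),
        observed_at=datetime.fromisoformat(payload["capturedAt"]),
    )

# One adapter per vendor API; adding a fifth vendor means adding one function.
ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

def normalise(vendor: str, payload: dict) -> PlateReading:
    """Route a raw REST payload through the right adapter."""
    return ADAPTERS[vendor](payload)
```

The point is the one McKee makes: the twin doesn’t need every vendor to agree on content, only for each feed to be mappable into a common language at the boundary.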
And then there’s the authentication required on both sides for enterprise integration – but that doesn’t scale. Not if you want to roll a digital twin model like Yorkshire out across every city. And we’re in the process of talking to dozens of cities – not just in the UK, but cities in the US [in March Slingshot opened a Boston office], Europe and the Middle East.
Who’s the main point of contact when you’re looking to set up a citywide digital twin – local governments or enterprises?
You need stakeholders from both. Local governments typically own the land – they’re the asset owners in most cases, they have the budget for infrastructure. But while they might have a billion pounds to spend on a new bridge – so Hull City Council are building a new highway and the bridge is £350mn – they have no budget for digital. However, Arup, the contractor, has the budget for digital to help manage that project. So, the engineering firm pays us, but we deliver services to both.
So, it becomes an ecosystem where what we typically sell is an enterprise solution for the architects and the engineers, and that is high-powered and high-cost. But we then sell an affordable, simplified version – one that is no-code and easy for a hundred or so people at the council to use – and we charge $10 a month, making it no more expensive than Microsoft Office. A digital twin then becomes an easy conversation for them – and it becomes a true digital twin, because the data from both parties feeds into the whole ecosystem.
At IOTSWC this year you talked a lot about creating sustainable digital twins…
Consultants at EY have calculated that 175 zettabytes of data have been created from IoT – that’s a lot of zeros – and only a fraction of that is ever used. If you view it from an IoT perspective, people will say about 50% of that data is used; McKinsey and IBM estimate it’s closer to 10% to 20%. And we’re getting to the point where the amount we generate over the next three years will increase fivefold – but the percentage we use will decrease to less than 5%.
The problem is that storing it all has a carbon impact. The data we’re not using (sometimes referred to as ‘dark data’) accounts for 6.4 million tonnes of carbon just sitting on hard drives – the same carbon impact as the smallest 80 countries combined.
As a business, we’ve identified two ways of offsetting that. The first is through creating a digital twin for sustainable projects – such as a micro wind farm or the big data for good work we’re doing with the Rainforest Trust.
The second is making digital twins that are sustainable in themselves – encouraging more responsible computing, such as using renewable energy to power the data centres. One piece of IP that we’ve developed is a method of automatically scanning every piece of data we touch, looking for links and context. If there are none – which is probably the case with about 50% to 60% of data sets – then we ask the question: do you want to delete it? Sometimes they’ll push back and say ‘no, no, we don’t delete.’ People don’t like deleting things. But it saves money, because you pay for storage. It also saves carbon, and it contributes to a firm’s ESG impact.
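The triage McKee describes – flag data sets with no links or context as candidates for deletion, then quantify the saving – can be sketched roughly as below. This is not Slingshot’s actual IP: the linkage check is reduced to a simple reference count, and the storage-emissions factor is an assumed placeholder figure.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    size_gb: float
    links: int  # references from other data sets or twin models

# Assumed storage-emissions factor (kg CO2 per GB-year); illustrative only.
CARBON_KG_PER_GB_YEAR = 0.02

def triage(datasets):
    """Split data sets into a keep pile and a 'dark data' review pile."""
    keep, dark = [], []
    for ds in datasets:
        (keep if ds.links > 0 else dark).append(ds)
    return keep, dark

def annual_carbon_saving_kg(dark):
    """Estimated yearly carbon saved if the dark pile is deleted."""
    return sum(ds.size_gb for ds in dark) * CARBON_KG_PER_GB_YEAR

datasets = [
    Dataset("traffic_counts_2021", size_gb=120.0, links=4),
    Dataset("legacy_sensor_dump", size_gb=800.0, links=0),
]
keep, dark = triage(datasets)
```

In practice the review pile would go back to the data owner for sign-off, which is where the “no, no, we don’t delete” push-back McKee mentions comes in.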
What’s next for Slingshot?
We’re in the process of launching our enterprise-level digital twin platform in partnership with Microsoft Azure and AWS. It’s set to launch in September during the Leeds Digital Festival and is aimed mostly at SMEs, with the aim of getting it to a price point of $20 a month, where they’re able to literally log in with their Microsoft Enterprise account – their business account – and just go. There’s no hand-holding… because that’s where we want to go.