Going nuclear: Digital Transformation at EDF’s Hinkley Point C
The UK’s first nuclear power station in 20 years is also the first to be designed with digital in mind – involving integration on a mass scale and a fusion of agile and waterfall delivery techniques
July 29, 2022
According to energy giant EDF’s digital architecture director Colin Inglis, power stations of the past were essentially built on paper.
“IT was different in the eighties and nineties when the last set of stations were built in the UK – it was all mainframe computing and large CAD systems with records stored on paper. If you go to stations that are still operating today, there are rooms upon rooms of binders containing records of construction,” he observes.
Over on the West Somerset coast, EDF is in the process of building the UK’s first nuclear power station in over 20 years, Hinkley Point C. And, according to Inglis, the plant is also the first to be designed with digital in mind.
“You already have PLM systems that manage all the data and processes at every step of a product’s lifecycle in oil & gas, aerospace, defence… but that’s not the case in nuclear. So we’ve been constructing a first-of-a-kind model and moving from a document-centric to a data-centric model,” Inglis explains.
The project is currently four years into a ten-year build and is set to generate roughly 3,200 megawatts of power, providing low-carbon electricity for around six million UK homes, which amounts to 7% of the country’s total electricity.
On the face of it, Inglis, who’s responsible for building the digital architecture on the project, has been charged with a relatively straightforward mission: converting the build’s paper records into data. But the scale of the task at hand and the architecture involved is complex – and about to get more so.
“Up until about now we’ve been in the Civil space which, to put it simply, is about digging big holes and pouring concrete. And that’s a relatively simple thing to document,” explains Inglis.
“But as we move into the mechanical, electrical, and HVAC phase, which is about installing equipment into something like 3,000 rooms across two power reactors, the activity – and therefore the amount of data and the complexity of the integrations involved – is going to go up,” he adds.
EDF project entering construction phase
The magnitude of this challenge is compounded, he says, by the future-proofing required for documents that are legally required to exist for at least sixty years.
“That’s at the forefront of everything that we do,” says Inglis, “because these plants can operate for a long time and inevitably questions will arise later about the viability of a particular part; or during an outage you have to be able to trace exactly what we did where and when or who welded a particular pipe…,” he adds.
Inglis estimates that each power station comprises, on average, about 500,000 parts and about eight million individual components.
“So we have to track those eight million components through the supply chain. Not every single one but even down to some of the bolts which are serialised. Because 30 years from now someone’s going to say ‘this bolt is failing’…” he says.
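To make that concrete, a traceability record for a single serialised component might look something like the sketch below – the field names are illustrative assumptions for this article, not EDF’s actual data model.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical traceability record for one serialised component; the
# field names are illustrative, not EDF's actual schema.
@dataclass
class ComponentRecord:
    serial_number: str              # e.g. a serialised bolt
    part_number: str                # the parent part it belongs to
    supplier: str
    installed_in_room: str          # one of ~3,000 rooms across two reactors
    installed_on: date
    installed_by: str               # who welded or fitted it
    inspections: list[str] = field(default_factory=list)

def find_by_serial(records: list[ComponentRecord], serial: str) -> ComponentRecord | None:
    # the question that must still be answerable 30 years from now
    return next((r for r in records if r.serial_number == serial), None)
```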
Iterate to automate
EDF, which is being supported in its delivery journey by consultants at Capgemini, knew that its initial work would be extracting data from operational transactional systems and integrating it into a data lake – a process Inglis describes as “nothing much more than ETL”.
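In outline, that early phase resembles a textbook extract-transform-load job. The sketch below assumes a relational source and a file-based lake; every table and field name is a placeholder, and the real pipeline runs through managed tooling rather than a hand-rolled script like this one.

```python
import json
import sqlite3

# Minimal ETL sketch: extract rows from a transactional source, normalise
# them, and land them in a data lake as newline-delimited JSON. All table
# and field names are placeholders invented for this example.

def extract(conn: sqlite3.Connection) -> list[tuple]:
    return conn.execute("SELECT id, doc_ref, status FROM work_orders").fetchall()

def transform(rows: list[tuple]) -> list[dict]:
    # document-centric records become data-centric: typed fields, consistent keys
    return [{"id": rec_id, "document": ref.strip().upper(), "status": status.lower()}
            for rec_id, ref, status in rows]

def load(lake_path: str, records: list[dict]) -> None:
    with open(lake_path, "w") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")
```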
However, the firm knew that further down the line it would need to automate certain processes, and to support this it selected the Salesforce-owned platform MuleSoft, which works as a unified solution for automation, integration and API management.
An early integration challenge facing the team was with legacy off-the-shelf software applications, which were not initially designed to work in the cloud.
“Each of those legacy COTS products has its own middleware layer embedded into the stack – so there was a complex web of adapters to build, with ESBs in the middle, so that the applications could interact,” Inglis explains.
According to Sean MacRae, a delivery director at Capgemini, the team also had to work on perfecting these adapters and building them into their solutions “to get that logic and to get the assets into the pipe for automation”.
He adds: “This is DevOps and everything that we do we’re learning as we go, as we automate certain things.”
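The classic adapter pattern gives a feel for what that wrapping involves: each legacy product keeps its proprietary interface, while a thin adapter exposes one common contract to the integration layer. The class names and the vendor SDK call below are illustrative assumptions, not the project’s actual code.

```python
from abc import ABC, abstractmethod

# Each legacy COTS product keeps its own interface; a thin adapter exposes
# one common contract to the integration bus. All names are illustrative.

class IntegrationEndpoint(ABC):
    @abstractmethod
    def fetch_changes(self, since: str) -> list[dict]: ...

class LegacyPlmAdapter(IntegrationEndpoint):
    """Wraps a hypothetical vendor SDK that carries its own middleware layer."""

    def __init__(self, plm_client):
        self.plm = plm_client

    def fetch_changes(self, since: str) -> list[dict]:
        # translate the vendor's proprietary export into the common shape
        raw = self.plm.export_changes(timestamp=since)   # hypothetical SDK call
        return [{"id": r["objectId"], "payload": r["data"]} for r in raw]
```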
Inglis admits that agile ways of working are new concepts in industries such as nuclear power, where project management is typically done sequentially, waterfall-style.
“We have to be really careful how we interact with the engineers and other business users who have a mindset that everything we deliver has to be perfect; what we’re trying to do is deliver enough capability and then to iterate – and that does sometimes cause friction,” he admits.
The solution for MacRae and Inglis was to find a methodology that worked for everyone and that was customised to the environment of a power plant.
“What we came up with was an agile sandwich. The front end tends to be more traditional in terms of requirement gathering.
“Then there’s Sean’s team who do their agile thing – the iterations and the sprints and the testing – and then at the end we have a very structured testing methodology, which again, you’d expect in this type of environment,” Inglis explains.
Rightshore for right now
While EDF’s development and production of systems and parts is still handled in the UK, MacRae says that for certain capabilities it was necessary to implement what he refers to as the “Rightshore model”.
He explains this model involves creating hybrid teams from inside and outside the organisation that possess a mix of expertise to optimise delivery.
“We started off with a presence in the UK because of the sensitive data involved; then created hybrid teams outside and nearshore – in India and elsewhere – so that we are able to flex – and I think that’s incredibly important when you’re executing this type of design,” says MacRae.
Inglis adds that in some cases, there were such severe tech skills shortages that the Rightshore model was the only way to go.
“At times it’s been difficult to find certain types of integration engineers, for instance. We started with a UK security-cleared team, but it was quite an expensive proposition and also inflexible, and it was difficult to get the level of security clearance you needed to do the tasks,” he says.
Inside a bunker as work progressed on the reactor buildings
“So we now have a model that we use in a lot of our digital delivery: where we leverage offshore teams and can outsource the technical expertise. In this case, in Hyderabad in India – but we also use teams in Portugal and elsewhere in India for digital delivery,” he adds.
Hybrid cloud
Another big step-change in the plant’s IT operations has been its cloud journey.
Hinkley Point C started off using multiple public cloud providers, but after it became apparent that the team would need to exchange data, build integrations, and process information between solutions that were all based in the firm’s internal cloud, Inglis decided to reassess this approach.
Says Inglis: “It didn’t make much sense to keep shifting stuff out into the public arena and back into our internal networks. So we implemented a hybrid solution with MuleSoft that enabled us to move sensitive data and secure data within our secure cloud without it needing to go into external solutions.”
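The effect Inglis describes can be pictured as a routing decision on data classification: sensitive payloads never leave the internal network. The sketch below is a simplified illustration; the endpoints and classification labels are assumptions, not EDF’s actual topology or MuleSoft configuration.

```python
# Classification-based routing in a hybrid cloud: sensitive payloads stay
# on the internal network. Endpoints and labels are assumptions.

INTERNAL_TARGET = "https://integration.internal.example/api"   # secure cloud
PUBLIC_TARGET = "https://integration.public.example/api"       # public cloud

def route(payload: dict) -> str:
    if payload.get("classification") in {"sensitive", "secret"}:
        return INTERNAL_TARGET
    return PUBLIC_TARGET

assert route({"classification": "sensitive"}) == INTERNAL_TARGET
assert route({"classification": "public"}) == PUBLIC_TARGET
```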
Given the nature of the nuclear industry, Inglis adds that there are various security artifacts within the MuleSoft environment that protect the data and ensure that it remains secure.
“The phrase we use internally is ‘secure by design’. So you have to design the security into whatever you’re doing.
“In each of our systems, we have something called a scheme of access, which describes those security controls: who can do what, or how the data is encrypted and how it gets moved from one place to another. And, as you can imagine, every device that we have is secured,” he adds.
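A “scheme of access” of the kind Inglis describes could be expressed, in miniature, as a mapping of roles to permitted actions plus encryption for data on the move. The role names below and the choice of Fernet symmetric encryption are assumptions made for the example, not EDF’s actual controls.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# A miniature "scheme of access": which roles may do what, plus symmetric
# encryption for data moving between systems. Roles and the use of Fernet
# are assumptions made for this example.

SCHEME_OF_ACCESS = {
    "construction_engineer": {"read", "write"},
    "inspector": {"read"},
}

def authorised(role: str, action: str) -> bool:
    return action in SCHEME_OF_ACCESS.get(role, set())

key = Fernet.generate_key()        # in practice, from a key-management service
cipher = Fernet(key)

if authorised("inspector", "read"):
    token = cipher.encrypt(b"weld record for pipe segment")
    assert cipher.decrypt(token) == b"weld record for pipe segment"
```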
In terms of a storage strategy, the power station still holds all documents relating to the build in their lowest-tech common denominator – paper and PDFs – but there are also now digital records held in standard cloud storage, “with all the appropriate backups and all the appropriate evergreen-ness that you get with cloud”, Inglis adds.
However, he admits that predicting the storage formats of the future can be a matter of hedging bets. “It’s something that we’ll need to consider over the next small number of years. And the problem with a lot of that is, what’s that technology going to be in 20 or 30 years’ time?…
“So at the moment, we’re trying to make sure that we’ve got the digital artifacts with different types of characteristics and then placing some bets on the likelihood of reusability, readability, and future technologies. But it’s a real conundrum.”
*This article is based on a presentation given at MuleSoft’s Connect event in London on 30th June.