Microsoft’s UK CTO details enterprise AI uses
While the advent of cloud and low-code/no-code development has been transformational for businesses, the barrier to entry for consumer natural-language technologies is so low that they will create use cases we can't yet imagine, according to Microsoft's UK CTO.
The tech giant has previously announced that it had integrated AI into its Microsoft Office products – with the launch of 365 Copilot back in March.
Now, speaking at the AI Summit held during this year's London Tech Week, Microsoft's UK CTO Glen Robinson said the company would continue to "bake AI in" to all its tech stacks.
He promised: "Every single interaction and application is going to be augmented with some kind of AI system – from enterprise collaboration in Microsoft 365 to business systems like CRM, ERP and [enterprise resource tool] Dynamics 365 – as well as baking Copilot into low-code/no-code tools to help build out automations."
According to Robinson, AI is already delivering efficiencies internally: he revealed that 46% of the code Microsoft now writes is generated using GitHub Copilot X with OpenAI's GPT-4, which also launched in March.
"It's writing all the scaffolding that our developers used to have to sit down and copy-paste, copy-paste, get wrong, fix, go back and so on.
"If you'd asked me last year whether we'd have a way to give developers a 50% productivity improvement, I'd probably have bitten your arm off!" he added.
Robinson said that, increasingly, firms would want to build their own Copilots using Microsoft's AI tech stack. He gave the example of an event planner who had taken a day off work and wanted to catch up on the Teams meetings she had missed.
"She wants to catch up on a project, so Copilot goes across everything she has access to, all the data, looks at the resources that have changed since yesterday and provides a summary.
“Contextual natural language means it will be able to understand what is meant by ‘yesterday’ – so it’s going to create that summary and a sentiment analysis identifying where there might be problems with lead times which can also be added to the summary,” Robinson explained.
Besides code generation and summarisation using sentiment analysis, Robinson added that other popular use cases would be in customer service and call-centre response generation.
This involves taking real-time transcripts from calls and running sentiment analysis, which can then automatically draft a customer follow-up response so that when the agent finishes the call, they can copy and paste the response into Outlook.
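The flow described here – transcript in, sentiment out, a draft response ready to paste – can be sketched in outline. This is a toy illustration of the pattern, not Microsoft's implementation: the word lists, scoring rule and email template below are all illustrative placeholders, and a real system would use an ML model rather than a lexicon.

```python
# Toy sketch of the call-centre flow: score the sentiment of a call
# transcript, then draft a follow-up email the agent can review and
# paste into Outlook. Word lists and template are placeholders.

POSITIVE = {"great", "thanks", "resolved", "happy", "helpful"}
NEGATIVE = {"delay", "broken", "frustrated", "refund", "complaint"}

def score_sentiment(transcript: str) -> float:
    """Crude lexicon score in [-1, 1]; stands in for a real sentiment model."""
    words = [w.strip(".,!?'") for w in transcript.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def draft_follow_up(customer: str, transcript: str) -> str:
    """Draft a follow-up email whose tone depends on the sentiment score."""
    apologetic = score_sentiment(transcript) < 0
    opening = (
        "Sorry for the trouble on today's call"
        if apologetic
        else "Thanks for a great call today"
    )
    return f"Dear {customer},\n\n{opening}. Here is a summary of what we agreed.\n"

email = draft_follow_up("Ms Jones", "I am frustrated about the delay and want a refund")
```

The key design point is that the model only drafts: the agent stays in the loop and decides whether to send.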
Robinson said that his own favourite feature was the ability to create semantic search – integrating GPT into Azure to carry out cognitive searches.
“If you are able to create an index of documents that already exist within an organisation and feed that into the system for additional context and specific domain responses that are relevant to your organisation, that is going to be super helpful,” he said.
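The pattern Robinson describes – index an organisation's existing documents, retrieve the relevant ones, and feed them to the model as additional context – is commonly known as retrieval-augmented generation. A minimal sketch of the prompt-assembly step follows; the naive keyword-overlap retriever is a stand-in for Azure's cognitive search, and the assembled prompt would then be sent to a GPT model (omitted here).

```python
# Minimal retrieval-augmented prompt assembly. A keyword-overlap
# retriever stands in for a real search index; the resulting prompt
# grounds the model in the organisation's own documents.

def retrieve(index: dict[str, str], query: str, k: int = 2) -> list[str]:
    """Return the k document bodies sharing the most words with the query."""
    q = set(query.lower().split())
    ranked = sorted(
        index.items(),
        key=lambda item: len(q & set(item[1].lower().split())),
        reverse=True,
    )
    return [body for _, body in ranked[:k]]

def build_prompt(index: dict[str, str], question: str) -> str:
    """Prepend retrieved organisational context to the user's question."""
    context = "\n".join(retrieve(index, question))
    return f"Context:\n{context}\n\nQuestion: {question}"

# Hypothetical document index for illustration.
docs = {
    "proj-a.txt": "Crane erection in the region saw a three-month delay last winter",
    "proj-b.txt": "Concrete supplier pricing for the 2021 tender",
}
prompt = build_prompt(docs, "What risks around crane delay have I not factored in?")
```

Because the retrieved passages are pasted into the prompt, the model's answer can cite domain-specific facts – such as a regional crane delay – that it would never produce from its general training data alone.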
The Microsoft exec was a little scant on company-specific examples – although he did say he had seen one healthcare use case that "blew his mind", and that the sector was seeing significant advances in applying AI to medical imagery to support decision-making.
One enterprise use case Robinson did mention was that of the 40-year-old Austrian construction firm Strabag International, which used Azure with OpenAI's semantic search capabilities to create a risk assessment for a customer proposal.
This enabled the firm to create an index of all the previous documentation it held on projects of this nature. It then used a prompt asking: "What risks have I not factored into this proposal?"
The system came back with the suggestion that in that particular region, at that time of year, there was a three-month delay in erecting cranes – and stated the impact this would have on time and cost, which the firm was then able to factor into the next draft.
“In the process of risk management that is quite game changing,” Robinson added.