Microsoft is banking on ‘bots’ becoming the new standard interface to interact with business processes and enterprise systems.
CEO Satya Nadella used an address to developers in Sydney today to highlight Azure for Bots, which will enable companies to build and deploy bots with “conversational and language understanding”.
Azure for Bots leverages Microsoft’s bot framework – released to Github earlier this year – as well as Azure Functions, Microsoft’s “serverless” compute platform which was made generally available today.
“Just like you built websites or mobile apps in the past, you’re going to now start building bots as new interfaces, new applications,” Nadella said.
“You’re going to take every business process and build a bot interface.
“But in order to do that you need to have natural language understanding, and that’s what the bot framework enables for every developer.”
While conversational bots have had some public stumbles this year, the enterprise potential of the technology was laid out last month when Woodside Energy provided a glimpse of its virtual avatar, Willow, which is built using rival technology from IBM.
Willow allows Woodside employees to effectively converse with a bot that then polls backend systems to find answers to questions.
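The pattern Willow illustrates, a conversational front end that matches a question to an intent and then polls a backend system for the answer, can be sketched in a few lines. Everything below (the keyword table, the stand-in backends, the function names) is hypothetical and for illustration only; a production bot would use a natural language understanding service rather than keyword matching.

```python
# Minimal sketch of a conversational bot that answers questions by
# polling backend systems. All names and data here are hypothetical.

# Fake "backend systems" standing in for HR, payroll, etc.
BACKENDS = {
    "leave_balance": lambda user: f"{user} has 12 days of leave remaining.",
    "payday": lambda user: "The next payday is the 15th.",
}

# Crude keyword-to-intent table; a real bot would use a natural
# language understanding service instead.
INTENTS = {
    "leave": "leave_balance",
    "holiday": "leave_balance",
    "pay": "payday",
    "salary": "payday",
}

def answer(user: str, message: str) -> str:
    """Match the message to an intent, then poll the matching backend."""
    for keyword, intent in INTENTS.items():
        if keyword in message.lower():
            return BACKENDS[intent](user)
    return "Sorry, I don't know how to help with that yet."

if __name__ == "__main__":
    print(answer("alice", "How much leave do I have left?"))
    print(answer("alice", "When is pay day?"))
```

The structure is the point: the conversation layer stays thin, and each intent maps to a call into an existing enterprise system.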
Microsoft’s AI and research group engineer Lili Cheng today said Azure for Bots would come with “out-of-the-box templates”, including a “basic bot, language understanding intelligent service bot, form bot, and proactive bot”.
While basic bots would be good for interfaces in websites, apps or communications channels, Cheng noted that additional work could enable more sophisticated use cases, such as what is being done at Woodside.
“Plug in Microsoft Cognitive Services to enable your bots to see, hear, interpret and interact in more human ways,” she said in a blog post.
“You can [also] enhance the capabilities of your bots with just a few lines of code using many other Azure services, too. For example, Azure Search makes it easy to add powerful and sophisticated search capabilities to your bots.”
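One common shape for such an enhancement is a search fallback: when no scripted intent matches, the bot ranks a document set against the user's question and offers the best hit. The sketch below uses a plain in-memory keyword index, not the actual Azure Search API, and all titles and documents are hypothetical.

```python
import re

# Sketch of a search fallback for a bot: rank a small document set by
# keyword overlap with the query. This is a plain in-memory index, not
# the actual Azure Search API; all names and documents are hypothetical.

DOCS = {
    "vpn-setup": "How to set up the corporate VPN on laptops and phones.",
    "expense-policy": "Submitting expenses: receipts, limits and approvals.",
    "onboarding": "First-week onboarding checklist for new starters.",
}

def tokens(text: str) -> set:
    """Lowercase a string and split it into alphabetic word tokens."""
    return set(re.findall(r"[a-z]+", text.lower()))

def search_fallback(query: str) -> str:
    """Return the document whose text shares the most words with the query."""
    query_words = tokens(query)
    best, best_score = None, 0
    for title, text in DOCS.items():
        score = len(query_words & tokens(text))
        if score > best_score:
            best, best_score = title, score
    if best is None:
        return "No matching documents found."
    return f"This may help: '{best}' — {DOCS[best]}"
```

A managed search service replaces the scoring loop here with relevance ranking over a real index, but the bot-side wiring stays this small: hand off the unmatched utterance, return the top result.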
OpenAI on Azure
Microsoft also revealed today that the Elon Musk-backed OpenAI would put its research workloads on Azure.
“OpenAI is an early adopter of Azure N-Series Virtual Machines, which will be generally available starting in December,” Nadella said in a blog post.
“These virtual machines are designed for the most intensive compute workloads, including deep learning, simulations, rendering and the training of neural networks.
“They also enable high-end visualisation capabilities to allow for workstation and streaming scenarios by utilizing the Nvidia Grid in Azure.”
OpenAI hopes to safely push the limits of AI while opening access to the results of all of its research, thus potentially preventing advances from benefiting only a select few with deep pockets.
In its own blog post, OpenAI said it would soon begin research using the Azure cloud, and provide feedback to Microsoft “so that Azure’s capabilities keep pace with our understanding of AI”.
“In the coming months we will use thousands to tens of thousands of these [Azure N-Series] machines to increase both the number of experiments we run and the size of the models we train,” OpenAI said.