Microsoft Launches Azure OpenAI Services at its Build Conference
Meta taps Azure as strategic cloud provider to advance AI innovation and deepen PyTorch collaboration.
There are so many tech conferences going on, it’s hard to keep track, and #MSBuild is no exception: there’s a lot going on. This is my take on a wrap-up of Microsoft Build from the perspective of artificial intelligence at the intersection of business, mostly relating to Azure.
The content around the event is a little bit all over the place! This is not my fault. But I will do my best to highlight stuff that I found interesting.
What the Heck Is Azure OpenAI Service?
So at the Microsoft Build conference on May 24th, 2022, Microsoft announced that Azure OpenAI Service is now available in a limited-access preview. Customers who want to use the service can apply for access. Introduced at Microsoft Ignite 2021 as a new product within the Azure Cognitive Services family, a part of Azure AI, Azure OpenAI Service was previously available by invitation only.
It’s no great wonder, since about a year ago Microsoft launched Azure OpenAI Service, a fully managed, enterprise-focused product designed to give businesses access to the AI lab OpenAI’s technologies with added governance features.
Initially invite-only as part of Azure Cognitive Services, the service provides access to OpenAI’s API through the Azure platform for applications like language translation and text autocompletion.
So this will sound like a press release, but I do think it’s notable for how the Cloud is integrating A.I. OpenAI is an incredible A.I. lab, and it has a deal with Microsoft.
This partnership has enabled Transformers to find more product-market fit in the larger context of the business world. Azure OpenAI Service might improve with the R&D going into this, which is somewhat exciting for A.I. enthusiasts like me.
Fully managed models
Built by OpenAI, the models powering Azure OpenAI Service, including GPT-3, can be customized to perform tasks ranging from translating natural language into code to generating answers to questions.
GPT-3 has been publicly available since mid-2020 through OpenAI’s API. But Azure OpenAI Service adds corporate-tailored layers on top of the models that the API doesn’t, including greater scaling capacity, private networking and access management.
Satya Nadella has brought A.I. to the Cloud, and how strategically he has revitalized Microsoft as Azure has grown up is very important to the future of business. If you are interested in hearing the keynote to Microsoft Build, it’s worth a look here.
I think what I like about Microsoft’s approach to AI on the Cloud is its viability and product-market fit.
Meta selects Azure as strategic cloud provider to advance AI innovation and deepen PyTorch collaboration
When a company as big as Meta partners with Azure to advance AI innovation it’s sort of a big deal as well.
On May 25th, 2022 Microsoft announced that Meta has selected Azure as a strategic cloud provider to help accelerate AI research and development.
Microsoft is Commercializing Bleeding Edge A.I. in the Cloud
Companies can now use Azure OpenAI Service to run models in particular geographic regions for compliance reasons or centrally manage API endpoints and use customer-supplied keys for encryption.
Azure OpenAI Service also ostensibly makes billing more convenient for existing Azure customers by charging for model usage under a single bill, versus separately under the OpenAI API.
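To make the "centrally managed API endpoints" point concrete, here is a minimal sketch, in Python with only the standard library, of how a completions request to an Azure OpenAI endpoint is typically shaped. The resource name, deployment name, key, and `api-version` value below are hypothetical placeholders; check the Azure OpenAI documentation for the values your subscription actually uses.

```python
import json

# Illustrative sketch only: builds (but does not send) an Azure OpenAI
# completions request. Resource/deployment names and api-version are
# placeholders, not values from a real subscription.
def build_completion_request(resource: str, deployment: str,
                             api_key: str, prompt: str,
                             api_version: str = "2022-12-01"):
    """Return (url, headers, body) for a completions call."""
    url = (f"https://{resource}.openai.azure.com/openai/deployments/"
           f"{deployment}/completions?api-version={api_version}")
    # Azure OpenAI authenticates with an "api-key" header tied to your
    # Azure resource, rather than an OpenAI account token.
    headers = {"api-key": api_key, "Content-Type": "application/json"}
    body = json.dumps({"prompt": prompt, "max_tokens": 50})
    return url, headers, body

url, headers, body = build_completion_request(
    "my-resource", "my-davinci-deployment", "SECRET",
    "Translate to French: hello")
print(url)
```

Because the endpoint lives under your own Azure resource (`my-resource.openai.azure.com` here), region placement, private networking and access management can be handled with the same Azure controls as any other service.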
Nvidia appears to be the underlying hardware provider behind a lot of what Meta will be doing with Azure in A.I. Meta will expand its use of Azure’s supercomputing power to accelerate AI research and development for its Meta AI group. Meta will utilize a dedicated Azure cluster of 5,400 GPUs using the latest virtual machine (VM) series in Azure (the NDm A100 v4 series, featuring NVIDIA A100 80GB Tensor Core GPUs) for some of its large-scale AI research workloads.
In 2021, Meta began using Microsoft Azure Virtual Machines (NVIDIA A100 80GB GPUs) for some of its large-scale AI research after experiencing Azure’s impressive performance and scale. With four times the GPU-to-GPU bandwidth between virtual machines compared to other public cloud offerings, the Azure platform enables faster distributed AI training.
Microsoft’s Pillars of AI Innovation
Models as Platforms
You can see 3-minute videos on each of them here. So why is this important? Microsoft is building an architecture for A.I. to evolve in the cloud, along with various use cases showing how it can help businesses. It’s building better A.I. into its apps and software, aimed at the problems and issues real customers are facing, to solve business problems.
Copilot Free for Students
Copilot will now be free for students. It’s a start. Copilot is powered by an AI model called Codex that’s trained on billions of lines of public code to suggest additional lines of code and functions given the context of existing code. Copilot will specifically be available free for students as well as “verified” open source contributors.
At its Build developer conference, Microsoft also announced yesterday what it calls its “Microsoft Intelligent Data Platform.” That’s not so much a new platform but an effort to bring the company’s existing database, analytics and governance services closer together. You can read about it here.
I like how Azure OpenAI is evolving.
Another new collection of models in Azure OpenAI Service, called embedding models, is tuned to perform well on tasks like text similarity, text search and code search. Text similarity captures the semantic similarity between pieces of text, while text and code search find information in files to satisfy a set of criteria.
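As a toy illustration of what "text similarity" means here: an embedding model maps each piece of text to a vector, and semantic similarity is then commonly scored with cosine similarity between those vectors. The vectors below are made up for demonstration; in practice an embedding model would produce much higher-dimensional ones.

```python
import math

# Hypothetical 3-dimensional "embeddings"; real embedding models output
# vectors with hundreds or thousands of dimensions.
def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

vec_cat = [0.9, 0.1, 0.2]       # made-up embedding of "cat"
vec_kitten = [0.85, 0.15, 0.25]  # made-up embedding of "kitten"
vec_invoice = [0.05, 0.9, 0.1]   # made-up embedding of "invoice"

# Semantically close texts should score higher than unrelated ones.
print(cosine_similarity(vec_cat, vec_kitten) >
      cosine_similarity(vec_cat, vec_invoice))  # True
```

Text search and code search work on the same principle: embed the query, embed the documents, and rank documents by similarity score.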
The GPT-3 models can understand and generate natural language. The service offers four model types with different levels of capabilities suitable for different tasks. Davinci is the most capable model, and Ada is the fastest.
I find this fairly interesting.
Davinci is the most capable engine and can perform any task the other models can perform and often with less instruction. For applications requiring deep understanding of the content, like summarization for a specific audience and creative content generation, Davinci is going to produce the best results. These increased capabilities require more compute resources, so Davinci costs more and isn't as fast as the other engines.
Another area where Davinci excels is in understanding the intent of text. Davinci is excellent at solving many kinds of logic problems and explaining the motives of characters. Davinci has been able to solve some of the most challenging AI problems involving cause and effect.
Use for: Complex intent, cause and effect, summarization for audience
Curie is extremely powerful, yet very fast. While Davinci is stronger when it comes to analyzing complicated text, Curie is quite capable for many nuanced tasks like sentiment classification and summarization. Curie is also quite good at answering questions and performing Q&A and as a general service chatbot.
Use for: Language translation, complex classification, text sentiment, summarization
Babbage can perform straightforward tasks like simple classification. It’s also quite capable when it comes to Semantic Search ranking how well documents match up with search queries.
Use for: Moderate classification, semantic search classification
Ada is usually the fastest model and can perform tasks like parsing text, address correction, and certain kinds of classification tasks that don’t require too much nuance. Ada’s performance can often be improved by providing more context.
Use for: Parsing text, simple classification, address correction, keywords
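The "Use for" guidance above amounts to a simple selection heuristic: pick the cheapest, fastest model that handles your task, and fall back to Davinci when in doubt. A minimal sketch of that heuristic (the task labels are my own shorthand, not an official Azure taxonomy):

```python
# Maps task shorthand to the model tier suggested by the "Use for" lists.
# Labels are illustrative, not an official Azure OpenAI taxonomy.
MODEL_FOR_TASK = {
    "complex intent": "davinci",
    "cause and effect": "davinci",
    "audience summarization": "davinci",
    "translation": "curie",
    "complex classification": "curie",
    "sentiment": "curie",
    "moderate classification": "babbage",
    "semantic search": "babbage",
    "parsing": "ada",
    "simple classification": "ada",
    "address correction": "ada",
    "keywords": "ada",
}

def pick_model(task: str) -> str:
    # Default to Davinci: the most capable model, at the cost of speed
    # and price, so it can handle anything the cheaper tiers can.
    return MODEL_FOR_TASK.get(task, "davinci")

print(pick_model("sentiment"))  # curie
```

The design choice mirrors the pricing trade-off described above: Ada and Babbage for high-volume, low-nuance work, Curie for most nuanced tasks, Davinci only where deep understanding is needed.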
Azure’s Commitment to Responsible AI
Lastly, Azure OpenAI Service has a new “responsible AI” system designed to filter out content related to sex, violence, hate and self-harm.
The system attempts to detect patterns of abuse or harm by a user of a model, which, when spotted, prompts a dedicated Microsoft team to work with customers to investigate and block the abuse if needed. The team is also responsible for updating the content filters as new forms of hate speech and slurs come into use.
“The new responsible AI system in Azure OpenAI Services includes automatic content filtering to deliver higher-quality content for customers. For APIs where generation of harmful content is a concern, it is on by default. Customers cannot opt out,” a Microsoft spokesperson clarified via email. “The content filtering is aligned with Microsoft’s content policy that governs use of Azure OpenAI Service.”
Meta and Microsoft Partner on PyTorch Adoption
In addition, Meta and Microsoft will collaborate to scale PyTorch adoption on Azure and accelerate developers' journey from experimentation to production.
Azure provides a comprehensive top-to-bottom stack for PyTorch users with best-in-class hardware (NDv4-series VMs and InfiniBand).
In the coming months, we are told, Microsoft will build new PyTorch development accelerators to facilitate rapid implementation of PyTorch-based solutions on Azure. Microsoft will also continue providing enterprise-grade support for PyTorch to enable customers and partners to deploy PyTorch models in production on both cloud and edge.
Why It Matters
By scaling Azure’s supercomputing power to train large AI models for the world’s leading research organizations, and by expanding tools and resources for open-source collaboration and experimentation, Microsoft may be able to help unlock new opportunities for developers and the broader tech community, and further its mission to empower every person and organization around the world.
Moreover, I think it’s a sound strategy. It’s a little bit on the dry side, but it’s nice to keep track of what’s going on in the Cloud with A.I. integration. Microsoft has a Microsoft Developers page on LinkedIn that I follow. This is not a sponsored post, just a wrap-up of the PR that I want to follow.
Microsoft teaming up with OpenAI, and now Meta, just makes sense for bringing AI-as-a-Service more horsepower on Azure, and deepening the PyTorch collaboration is a nice start.
How Microsoft is working with large AI Models is most interesting and illustrates how the Cloud is evolving.
While you guys rarely click on my links, you can watch the Azure AI keynote opening here. I can only try my best to provide educational content for your benefit.
Join 71 other paying subscribers to get access to all the A.I. articles I write. You get about 50% more content that deals with A.I. at the intersection of business, news and technology. This money goes to my basic needs as a Creator in the form of rent, food and giving money to my wife for our basic expenses.
I’m taking some care to provide coverage from major AI labs such as Microsoft Research, Meta AI, OpenAI, DeepMind, Google Brain and so forth.