Microsoft and Meta’s AI Partnership, Explained
News Briefing: It's Wednesday, July 19th.
Microsoft and Facebook parent Meta announced a new partnership on Tuesday. Meta is releasing an updated version of its artificial intelligence language model, called Llama 2, and the companies said the model will be free to developers building software on Microsoft's Azure cloud computing platform. It wasn't the only AI news Microsoft announced yesterday.
This partnership puts Meta's large language model, Llama 2, onto Microsoft's cloud computing platform, Azure. So, basically, if you are a customer of Azure or are interested in using Llama 2, you can now access that large language model on Azure. It's a way to distribute this large language model to developers through a large cloud computing platform. I do want to add that in the announcement, Meta is also making Llama 2 available on other services, including AWS. So it is not exclusive to Microsoft, but Meta is naming Microsoft's Azure as its preferred cloud partner.
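To make that concrete, here is a minimal sketch of what accessing a hosted Llama 2 model through a cloud platform might look like from the developer's side: sending a prompt to a model endpoint over HTTPS and reading back the generated text. The endpoint URL, API key, and request and response fields below are placeholders invented for illustration; the real values depend on how the deployment is set up in Azure.

```python
import requests

# Placeholder values for illustration only; a real hosted Llama 2 deployment
# would supply its own endpoint URL, key, and request schema.
ENDPOINT = "https://example-workspace.example-region.inference.ml.azure.com/score"
API_KEY = "<your-endpoint-key>"

def ask_llama(prompt: str) -> dict:
    """Send a prompt to a hosted Llama 2 endpoint and return the raw JSON reply."""
    response = requests.post(
        ENDPOINT,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        json={"prompt": prompt, "max_new_tokens": 200, "temperature": 0.7},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()  # the shape of the reply depends on the deployment

if __name__ == "__main__":
    print(ask_llama("In one sentence, what is a large language model?"))
```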
And just to simplify that a little bit for people who may not be AI developers or as familiar with cloud computing, what exactly does it mean to put the model onto these cloud computing services?
Basically, if you are a software developer, a startup, or any company that uses Azure's cloud computing platform, you now have the tools to access this large language model. So, if you wanted to build a tool that uses this technology to generate answers to questions, summarize emails, or go through all of your company's data and pull out relevant information, the way people have been using ChatGPT and other popular large language models, Llama 2 is now available to you through Azure.
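As one concrete example of the email-summarization use case, here is a short sketch of how a developer might format such a request using the [INST] and <<SYS>> chat template Meta published for the Llama 2 chat models. The sending step is left as a comment because the actual call depends on the client or endpoint in use; the helper name and the sample email are made up for illustration.

```python
# Sketch: wrapping an email in Llama 2's chat prompt format and asking for a summary.
SYSTEM_PROMPT = "You are an assistant that writes short, factual summaries."

def build_summary_prompt(email_body: str) -> str:
    """Wrap an email in the Llama 2 chat template and ask for a three-bullet summary."""
    return (
        "<s>[INST] <<SYS>>\n"
        f"{SYSTEM_PROMPT}\n"
        "<</SYS>>\n\n"
        "Summarize the following email in three bullet points:\n\n"
        f"{email_body} [/INST]"
    )

email = (
    "Hi team, the Q3 planning meeting is moving to Thursday at 2pm. "
    "Please send your project updates by Wednesday noon and flag any blockers."
)

prompt = build_summary_prompt(email)
# send_to_model(prompt)  # hypothetical: pass the prompt to your Llama 2 deployment
print(prompt)
```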
I want to talk a bit about why these two companies decided to partner.

Microsoft gets another large language model, or piece of technology, that it can offer to its customers on Azure. Until now, the biggest models it has offered customers have come from OpenAI, which, of course, makes a lot of sense: OpenAI's large language models, like GPT-4, which powers ChatGPT, are strategically important to Microsoft because Microsoft has invested billions of dollars in OpenAI. So having Llama 2 as an option on Azure just gives Microsoft's customers more options.
Okay, so Microsoft wants to offer more than just OpenAI, which it's invested in, on its Azure platform, and that's in keeping with what competitors like Amazon and Google are doing in the cloud space. But what about Meta? What do we know about its strategy?
For Meta, this is an interesting step, because the previous version of Llama wasn't really commercially available. It was a project they had worked on internally that happened to leak and became really popular with academics and people who were interested in toying with it, but who weren't necessarily building a business on top of it. With the announcement of Llama 2, Meta has shown that it wants to make the model commercially available.
Tom, this wasn't Microsoft's only announcement on Tuesday. It also released new pricing for using the AI tools on Microsoft 365. Can you tell us a bit about that?
Sure. So, Microsoft unveiled this new tool called Copilot a few months ago, and it's built on OpenAI's large language models. Copilot is a feature that lives on top of Microsoft 365, the software suite that includes Word, Excel, and PowerPoint. Basically, what Copilot can do is, through natural language prompts, carry out tasks like summarizing all your emails in Outlook or taking a Word document and turning it into a PowerPoint presentation. On Tuesday, Microsoft announced that it's going to charge businesses $30 per user to access Copilot, and that's not cheap, frankly. At $30 per user, that's more than twice the price of the cheapest tier Microsoft offers its 365 customers.
Okay, so we know the pricing: $30 per person, and that's per month. How is the market reacting to this news?
We got a sense of how much value Microsoft is placing on this technology and this product. $30 per person shows that Microsoft is attaching a real premium to it, and assuming there's a huge amount of demand, it could drive really significant revenue for them. And we saw after the announcement that Microsoft's stock went up five percent.

This seems like something that more everyday users might gravitate to, as opposed to just developers. When will it be available to them?

We don't know, and that's what's so interesting right now. They announced this product in March, and now we have the pricing, but they haven't actually announced a date for when they're going to make it generally available to business customers. It's worth paying very close attention to when Microsoft feels comfortable and confident enough in the Copilot technology to say, "Alright, we're releasing it widely," because Microsoft 365 is its marquee piece of software.



