TWIL: July 2, 2023

This week, learn about Microsoft’s new LLM – Orca – and how to get ready for Microsoft 365 Copilot. Check out how to build an enterprise architecture around Azure OpenAI Service, and the new models just released in the service. Read all about grounding LLMs, the new point-in-time restore features of Azure Cosmos DB, and Satya Nadella’s bet on AI. Finally, listen to the awesome conversation between Lex Fridman and Mark Zuckerberg, and start learning about Data Science, Machine Learning and AI with free Microsoft courses made available through GitHub. Enjoy!


Lex Fridman Podcast

Episode 383: Mark Zuckerberg: Future of AI at Meta, Facebook, Instagram, and WhatsApp
Lex Fridman, AI researcher and podcast host, interviews Mark Zuckerberg, CEO of Meta, about the future of AI at Meta, Facebook, Instagram, and WhatsApp. They discuss the open-source AI movement and why Mark supports it; Meta’s next AI model release and what to expect from it; how Meta plans to use AI for social good and innovation; how bots will change the way people communicate and interact; how Meta handles censorship and content moderation, ethically and legally; Meta’s new social network and how it aims to be more inclusive; Elon Musk and how he influences Mark’s vision and strategy; the layoffs and how Mark handled them; how Meta attracts and retains talent; the Meta Quest 3 and its more immersive, interactive experience; the Apple Vision Pro; and AI existential risk and how Mark thinks about and prepares for it.

Microsoft Orca

Microsoft’s Orca SHOCKS the entire industry – STUNNING GPT 4 competitor
Microsoft Orca is a stunning new development in the field of AI. The model is set to be open-sourced and could revolutionize the AI industry.

Orca: The Model Few Saw Coming
The first model set to be open-sourced that actually comes close to ChatGPT, at just 13B parameters (small enough to run on a laptop). The 51-page report from Microsoft was released just 48 hours ago, but I have gone through it all and bring in relevant insights from 5 other papers. By imitating the reasoning and explanations of GPT-4 (and using GPT-3.5 as an assistant), as well as by training on diverse tasks and an order of magnitude more examples, we get Orca. I showcase it on a dozen benchmarks, go through in detail how it works and why, and end on comments from Sam Altman and Ilya Sutskever on whether open source will catch up…

Microsoft 365 Copilot

How to get ready for Microsoft 365 Copilot
Get your information ready for search, put key prerequisites in place, and assign licenses to prepare for the next transformation in how we work with Microsoft 365 Copilot. Copilot leverages large language models that interact with your organization’s data through the Microsoft Graph to generate personalized, context-aware experiences, reducing the steps needed to find information and generate content. By design, Copilot respects user-specific permissions on any content or information it retrieves, and only generates responses based on information that users explicitly have permission to access.

Azure OpenAI Service

Enterprise Azure OpenAI
A repository detailing the deployment of an Enterprise Azure OpenAI reference architecture. It includes comprehensive logging of Azure OpenAI model execution tracked to source IP address, advanced usage and throttling controls, high availability of the model APIs, and secure use of the service.

Azure OpenAI Service models
Azure OpenAI is a service that provides access to many different models for natural language processing and other tasks. The models are grouped by family and capability, with names that indicate their relative power and cost. Users can request access to the models they need, update their deployments automatically or manually, and fine-tune some models for specific applications. The models differ in parameters, input types, and expiration dates, which users can check and manage through the Models List API and the REST API spec. Users can also migrate to the latest versions of the models, such as text-embedding-ada-002, which provides parity with the previous embedding model.
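The Models List API mentioned above is a plain REST GET against your resource endpoint. A minimal sketch of building that request follows — the endpoint, key, and api-version are placeholder assumptions (check the REST API spec for the version your deployment supports), and the request is constructed but not actually sent:

```python
from urllib import request

# Build (but don't send) a request to the Azure OpenAI Models List API.
# ENDPOINT, the key, and API_VERSION are hypothetical placeholders;
# consult the REST API spec for the values your deployment needs.
ENDPOINT = "https://my-resource.openai.azure.com"   # hypothetical resource
API_VERSION = "2023-05-15"                          # example version; verify
url = f"{ENDPOINT}/openai/models?api-version={API_VERSION}"

req = request.Request(url, headers={"api-key": "<YOUR-KEY>"})
# resp = request.urlopen(req)  # would return JSON describing available models
print(req.full_url)
```

Each entry in the returned list describes a model’s capabilities and lifecycle dates, which is where the expiration information mentioned above can be checked.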

Introducing new and updated models to Azure OpenAI Service
Azure OpenAI Service, which offers large language models (LLMs) for natural language processing, has introduced new and updated models for its customers. The updated models are priced 25% to 75% lower than the previous versions, depending on the model. Customers can choose whether to auto-update to the new versions or to pin their deployments to a particular version, and can update the model version in Azure AI Studio or via the API.

Generative AI

The economic potential of generative AI: The next productivity frontier
Generative AI is a step change in the evolution of artificial intelligence. As companies rush to adapt and implement it, understanding the technology’s potential to deliver value to the economy and society at large will help shape critical decisions. We have used two complementary lenses to determine where generative AI, with its current capabilities, could deliver the biggest value and how big that value could be.

Grounding LLMs
Grounding is the process of supplying an LLM with information that is not in its trained knowledge, such as search results, user instructions, or external data. This helps the LLM produce more accurate, relevant, and coherent responses. The article discusses two main techniques for grounding: retrieval-augmented generation (RAG) and fine-tuning. RAG fetches relevant content at query time and merges it into the prompt to augment the model’s input, while fine-tuning adds information to the model itself through further training.
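The RAG pattern described above can be sketched with a toy in-memory retriever — the corpus, the word-overlap scoring, and the prompt template below are all illustrative stand-ins; a real system would use a search index or vector store and a model API:

```python
# Toy illustration of retrieval-augmented generation (RAG):
# retrieve snippets relevant to a query, then merge them into an
# augmented prompt that grounds the LLM in that content.

CORPUS = [
    "Azure Cosmos DB supports continuous backup with point-in-time restore.",
    "Azure OpenAI Service provides access to GPT and embedding models.",
    "Microsoft 365 Copilot uses the Microsoft Graph to ground responses.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(
        CORPUS,
        key=lambda doc: len(q & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str) -> str:
    """Merge retrieved content into the prompt sent to the model."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How does Copilot ground responses"))
```

The grounded prompt, not the bare question, is what gets sent to the LLM — which is why RAG needs no retraining, in contrast to fine-tuning.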

Azure Cosmos DB

Introduction to provisioned throughput in Azure Cosmos DB
Azure Cosmos DB allows you to set provisioned throughput on your databases and containers. There are two types of provisioned throughput, standard (manual) or autoscale. This article gives an overview of how provisioned throughput works.
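One practical consequence of provisioned throughput is that requests exceeding the provisioned RU/s are rate-limited: Cosmos DB returns HTTP 429 with a retry-after hint, and the client waits and retries (the real azure-cosmos SDK does this automatically). A toy sketch of that loop, with a simulated container standing in for the service:

```python
import time

# Simulated rate limiting against provisioned RU/s. FakeContainer is a
# hypothetical stand-in for the service, not the real azure-cosmos SDK.

class FakeContainer:
    """Pretends the first `throttled` requests exceed provisioned RU/s."""
    def __init__(self, throttled: int):
        self.remaining_throttles = throttled

    def read_item(self) -> tuple[int, float]:
        if self.remaining_throttles > 0:
            self.remaining_throttles -= 1
            return 429, 0.01  # status code, retry-after in seconds
        return 200, 0.0

def read_with_retries(container: FakeContainer, max_retries: int = 5) -> int:
    for _ in range(max_retries):
        status, retry_after = container.read_item()
        if status != 429:
            return status
        time.sleep(retry_after)  # honor the retry-after hint before retrying
    raise RuntimeError("throttled too many times; consider more RU/s or autoscale")

print(read_with_retries(FakeContainer(throttled=2)))  # prints 200
```

Persistent 429s are the signal to raise manual throughput or switch the container to autoscale, which the article compares.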

Online backup and on-demand data restore in Azure Cosmos DB
Azure Cosmos DB automatically takes backups of your data at regular intervals. The automatic backups are taken without affecting the performance or availability of database operations, and all backups are stored separately in a storage service. They are helpful in scenarios where you accidentally delete or update your Azure Cosmos DB account, database, or container and later need to recover the data.

Continuous backup with point-in-time restore in Azure Cosmos DB
Azure Cosmos DB performs data backup in the background without consuming any extra provisioned throughput (RUs) or affecting the performance and availability of your database. Continuous backups are taken in every region where the account exists.

Restoring deleted databases/containers in the same account with continuous backup in Azure Cosmos DB (preview)
The same account restore capability of continuous backup in Azure Cosmos DB allows you to restore the deleted databases or containers within the same existing account. You can perform this restore operation using the Azure portal, Azure CLI, or Azure PowerShell. This feature helps in recovering the data from accidental deletions of databases or containers.


Free Microsoft Courses

Data Science for Beginners – A Curriculum
Azure Cloud Advocates at Microsoft are pleased to offer a 10-week, 20-lesson curriculum all about Data Science. Each lesson includes pre-lesson and post-lesson quizzes, written instructions to complete the lesson, a solution, and an assignment. Our project-based pedagogy allows you to learn while building, a proven way for new skills to ‘stick’.

Machine Learning for Beginners – A Curriculum
Azure Cloud Advocates at Microsoft are pleased to offer a 12-week, 26-lesson curriculum all about Machine Learning. In this curriculum, you will learn about what is sometimes called classic machine learning, using primarily Scikit-learn as a library and avoiding deep learning, which is covered in our forthcoming ‘AI for Beginners’ curriculum.

Artificial Intelligence for Beginners – A Curriculum
Azure Cloud Advocates at Microsoft are pleased to offer a 12-week, 24-lesson curriculum all about Artificial Intelligence.

Interesting Stuff

Microsoft’s Satya Nadella Is Betting Everything on AI
Satya Nadella, the CEO of Microsoft, is betting everything on AI and has made a series of bold moves to transform the company and the industry. He partnered with OpenAI, the research lab behind cutting-edge AI models, and integrated generative AI into Bing and other products, such as GitHub Copilot, a tool that assists with coding. Under his leadership, Microsoft also acquired GitHub, LinkedIn, and Minecraft maker Mojang, and invested $10 billion in OpenAI, becoming its biggest outside investor.

Have an awesome week!