TWIL: May 28, 2023
These past two weeks were crazy with new stuff to learn. Microsoft Build 2023 had more than 50 new announcements, including a major new product: Microsoft Fabric! I’m also highlighting a very interesting 4-hour episode of Lex Fridman’s podcast with Stephen Wolfram, a lot of news on Azure AI services, the announcement video for Power BI Copilot, a set of articles on developing applications for Large Language Models, and much more. Enjoy!
Episode 376: Stephen Wolfram: ChatGPT and the Nature of Truth, Reality & Computation
Stephen Wolfram is a computer scientist, mathematician, theoretical physicist, and the founder of Wolfram Research, the company behind Wolfram|Alpha, Wolfram Language, and the Wolfram Physics and Metamathematics projects.
Microsoft Build 2023
Microsoft Build 2023: Book of News
The goal with the Book of News is to provide you with a roadmap to all the announcements we’re making, with all the details you need. Our focus remains the same – to make it as easy as possible for you to navigate the latest news and offer critical details on the topics you’re most interested in exploring.
Azure AI Studio: Satya Nadella at Microsoft Build 2023
At Microsoft Build, Satya Nadella introduced Azure AI Studio, a full life cycle tool to build, train, evaluate, and deploy the latest next-generation models responsibly.
Introducing Microsoft Fabric: Data analytics for the era of AI
Today we are unveiling Microsoft Fabric—an end-to-end, unified analytics platform that brings together all the data and analytics tools that organizations need. Fabric integrates technologies like Azure Data Factory, Azure Synapse Analytics, and Power BI into a single unified product, empowering data and business professionals alike to unlock the potential of their data and lay the foundation for the era of AI.
Microsoft Fabric is an end-to-end analytics solution with full-service capabilities including data movement, data lakes, data engineering, data integration, data science, real-time analytics, and business intelligence—all backed by a shared platform providing robust data security, governance, and compliance. Your organization no longer needs to stitch together individual analytics services from multiple vendors. Instead, use a streamlined solution that’s easy to connect, onboard, and operate.
Webinar Series: Introduction to Microsoft Fabric
Microsoft Fabric delivers an integrated and simplified experience for all analytics workloads and users on an enterprise-grade data foundation. Watch the series to learn about the key experiences and benefits of this end-to-end analytics solution.
Can cross-cloud data analytics be easy? | Microsoft Fabric
Work with your data in place, wherever it resides, with Microsoft Fabric, our next generation data analytics service powered by one of the first true multi-cloud data lakes, called OneLake. Go from raw data to meaningful insights over data spread across your organization and in other clouds in seconds—without moving it. Microsoft Fabric provides a single integrated service that includes data integration capabilities, data engineering for shaping your data, data warehousing, the ability to build data science models, real time analytics, and business intelligence. Data from these different experiences is brought together by one unified data lake for your organization, OneLake, and is accessible regardless of the engine used. Justyna Lucznik, Principal Group PM for Microsoft Fabric, joins Jeremy Chapman to share how to make your organizational data more accessible.
Microsoft outlines framework for building AI apps and copilots; expands AI plugin ecosystem
Microsoft introduced the concept of a copilot nearly two years ago with GitHub Copilot, an AI pair programmer that assists developers with writing code. This year, Microsoft rolled out copilot experiences across its core products and services, from the AI-powered chat in Bing that’s changing how people search the internet to Microsoft 365 Copilot, GitHub Copilot X, Dynamics 365 Copilot, Copilot in Microsoft Viva and Microsoft Security Copilot.
Harness the power of Large Language Models with Azure Machine Learning prompt flow
Prompt flow is a powerful feature within Azure Machine Learning (AzureML) that streamlines the development, evaluation, and continuous integration and deployment (CI/CD) of prompt engineering projects. It empowers data scientists and LLM application developers with an interactive experience that combines natural language prompts, templating language, a list of built-in tools and Python code.
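To make the idea concrete, here is a toy sketch of the pattern prompt flow builds on: a templated natural-language prompt combined with a Python "tool" step. This is plain standard-library Python for illustration only, not the Azure ML prompt flow API; the template text and the tool are made up for the example.

```python
# Toy illustration of a prompt-engineering "flow": a natural-language
# prompt template plus a Python tool step. NOT the AzureML prompt flow API.
from string import Template

PROMPT = Template(
    "You are a support assistant.\n"
    "Summarize the following ticket in one sentence:\n"
    "$ticket"
)

def preprocess(ticket: str) -> str:
    """A 'tool' step: normalize whitespace before templating."""
    return " ".join(ticket.split())

def build_prompt(ticket: str) -> str:
    """Chain the tool output into the template, as a flow would."""
    return PROMPT.substitute(ticket=preprocess(ticket))

prompt = build_prompt("  Printer   jams\n on page 2  ")
print(prompt.splitlines()[-1])  # → Printer jams on page 2
```

In prompt flow itself, each of these steps becomes a node in a visual graph that you can run, evaluate against test data, and wire into CI/CD.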
Announcing Foundation Models in Azure Machine Learning
In Azure Machine Learning (AzureML), we are constantly looking for new ways to enable our customers to easily build, train and deploy ML models. Today, we are excited to announce the public preview of foundation models in Azure Machine Learning, which empowers users to discover, customize and operationalize large foundation models at scale through the model catalog. The model catalog is your starting point to explore collections of foundation models. It offers a collection of Open Source models curated by AzureML (described in detail in this blog post), a Hugging Face hub community partner collection, and a collection of Azure OpenAI Service models (available in Private Preview to AzureML Insiders). With this new capability, our customers can easily access the latest foundation models and accelerate the use of these models for fine-tuning, evaluation, deployment and operationalization in their own specific workloads.
Generative AI for Developers: Exploring New Tools and APIs in Azure OpenAI Service
At Microsoft Build 2023, we’re excited to unveil groundbreaking new features that will help you integrate your AI with your data and systems, allowing you to create never-before-seen innovations. You can now use your own data to run on these cutting-edge models, add plugins to simplify integrating external data sources with APIs, and reserve provisioned throughput to gain control over the configuration and performance of OpenAI’s large language models at scale. Plus, you can gain control over your quota and rate limits and create and configure content filters. Now let’s take a closer look at the announcements.
Synapse Espresso: Improve Copy Data performance with Copy Compute Scale!
Welcome to the 35th episode of our Synapse Espresso series! In this video, Stijn demonstrates the Copy Compute Scale feature, currently in preview, which reduces queue times when executing copy data tasks in a managed VNET environment. He shows you how to configure your Integration Runtime to use this feature and which settings are available. Finally, he demonstrates the impact and how much of a performance gain you can expect!
Azure Databricks: Disaster recovery
A clear disaster recovery pattern is critical for a cloud-native data analytics platform such as Azure Databricks. Your data teams need to be able to use the Azure Databricks platform even in the rare case of a regional, service-wide cloud-provider outage, whether caused by a regional disaster like a hurricane or earthquake, or by some other event.
Azure Language Service
Azure Language Service: Model lifecycle
Language service features utilize AI models. We update the language service with new model versions to improve accuracy, support, and quality. As models become older, they are retired. Use this article for information on that process, and what you can expect for your applications.
Microsoft Power BI – AI Copilot Announcement
Video announcement of Copilot for Microsoft Power BI.
Large Language Models
Introducing Semantic Kernel: Building AI-based Apps
The future of AI is finally here, and it’s a gamechanger for software developers. Explore the possibilities of Semantic Kernel (SK), the new face of AI-powered development, packaged in a lightweight, easy-to-use, multilayered software development kit. Get up and running quickly with SK, the latest addition to the Microsoft AI ecosystem that enables developers to integrate LLM AI capabilities easily into their apps.
Introducing LangChain Agents
Agents can be seen as applications powered by LLMs and integrated with a set of tools like search engines, databases, websites, and so on. Within an agent, the LLM is the reasoning engine that, based on the user input, is able to plan and execute a set of actions that are needed to fulfill the request. The concept of reasoning and acting is also the basis of a new prompt engineering technique called ReAct (Reason and Act), introduced by Yao et al.
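The reason-then-act loop described above can be sketched in a few lines of plain Python. Everything here is a hypothetical illustration, not LangChain’s API: the tool registry and the fake planner merely stand in for real tools and for the LLM’s reasoning step.

```python
# Minimal ReAct-style agent loop with a stubbed planner standing in for
# the LLM. Tool names and the fake planner are illustrative, not LangChain.
from typing import Callable, Dict, List, Tuple

TOOLS: Dict[str, Callable[[str], str]] = {
    "search": lambda q: "Paris" if "capital of France" in q else "unknown",
    "calculator": lambda expr: str(eval(expr)),  # toy only; never eval user input
}

def fake_llm_plan(question: str, observations: List[str]) -> Tuple[str, str]:
    """Stand-in for the LLM reasoning step: decide the next action."""
    if not observations:
        # Thought: I need to look this up -> Action: search
        return ("search", question)
    # Thought: I have what I need -> Action: finish
    return ("finish", observations[-1])

def run_agent(question: str, max_steps: int = 5) -> str:
    observations: List[str] = []
    for _ in range(max_steps):
        action, arg = fake_llm_plan(question, observations)  # Reason
        if action == "finish":
            return arg
        observations.append(TOOLS[action](arg))              # Act, observe
    return "gave up"

print(run_agent("What is the capital of France?"))  # → Paris
```

A real agent replaces `fake_llm_plan` with an LLM call whose prompt interleaves the Thought/Action/Observation trace, which is exactly the ReAct pattern from Yao et al.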
Building Interactive Enterprise Grade Applications with Open AI and Microsoft Azure
In this article we summarize our point of view and some early lessons learned when it comes to implementing Azure OpenAI-backed applications. You will be guided through the main components that bring intelligent interactive applications to life and make them feedback-driven. While it is easier to visualize these considerations in the case of a conversational application, the details provided here are not limited to those scenarios and can be applied to other cases as well. This is a new field, of course, so please do not treat this article as definitive or prescriptive guidance – apply common sense where possible, experiment and test … ask OpenAI if in doubt.
ChatGPT Plugin Quickstart using Python and FastAPI
This is a quickstart sample for creating a ChatGPT plugin using GitHub Codespaces, VS Code, and Azure. The sample includes templates to deploy the plugin to Azure Container Apps using the Azure Developer CLI.
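At the heart of any such plugin is the `ai-plugin.json` manifest that the host serves at `/.well-known/ai-plugin.json`. The sketch below builds one in plain Python; the field names follow the plugin spec as publicly documented in 2023, and all names, URLs, and emails are placeholders, not values from the linked sample.

```python
# Sketch of the ai-plugin.json manifest a ChatGPT plugin serves at
# /.well-known/ai-plugin.json. Field names per the 2023 plugin spec;
# every value below is a placeholder for illustration.
import json

manifest = {
    "schema_version": "v1",
    "name_for_human": "TODO Plugin",
    "name_for_model": "todo",
    "description_for_human": "Manage a to-do list.",
    "description_for_model": "Plugin for managing a user's to-do list.",
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        "url": "https://example.com/openapi.yaml",  # placeholder host
    },
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}

print(json.dumps(manifest, indent=2))
```

The `api.url` points at an OpenAPI description of your endpoints, which is what ChatGPT reads to learn how to call the plugin; in the quickstart, FastAPI generates that OpenAPI document for you.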
Anthropic’s new 100K context window model is insane!
Anthropic released a new LLM with a 100K-token context window. In this video, I explain what this means and walk through a demo.
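For a rough sense of scale, here is a back-of-envelope estimate of what 100K tokens holds. It assumes the common rule of thumb of about 0.75 English words per token and roughly 500 words per printed page; both figures are approximations, not Anthropic’s numbers.

```python
# Back-of-envelope: what does a 100K-token context window hold?
# Assumes ~0.75 English words per token and ~500 words per page,
# both rough rules of thumb.
context_tokens = 100_000
words = int(context_tokens * 0.75)   # ≈ 75,000 words
pages = words // 500                 # ≈ 150 pages

print(f"~{words:,} words, ~{pages} pages")  # → ~75,000 words, ~150 pages
```

That is roughly an entire novel fitting into a single prompt, which is why a 100K window changes what kinds of documents you can feed a model in one shot.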
Have an awesome week!