TWIL: June 25, 2023

Last week there was no TWIL, so I’m reporting two weeks of learning today. I want to recommend two episodes of Lex Fridman’s podcast; if you only have time for one, listen to the conversation with Matthew McConaughey, because it’s amazing. I’m also highlighting the new Power BI Desktop developer mode, which is awesome, a playlist of introductory Microsoft Fabric videos, a set of articles on the Azure OpenAI Service, including the new “chat with your own data” feature, an introduction to the Azure AI Content Safety service, and a set of interesting articles on development with LLMs. Have fun!


Podcasts

Lex Fridman Podcast

Episode 381 – Chris Lattner: Future of Programming and AI
Chris Lattner is a legendary software and hardware engineer, leading projects at Apple, Tesla, Google, SiFive, and Modular AI, including the development of Swift, LLVM, Clang, MLIR, CIRCT, TPUs, and Mojo. This great conversation spans multiple topics, mostly around programming languages and AI.

Episode 384 – Matthew McConaughey: Freedom, Truth, Family, Hardship, and Love
Matthew McConaughey is an Oscar-winning actor and author of Greenlights. I can’t recommend this enough. A truly deep conversation with a very interesting and inspiring person. It goes through some hard topics but keeps an optimistic tone, and includes a lot of awesome details about some of the greatest movies Matthew has been a part of.


Power BI

Power BI Desktop projects
Power BI Desktop introduces a new way to author, collaborate, and save your projects. You can now save your work as a Power BI Project (PBIP). As a project, report and dataset artifact definitions are saved as individual plain text files in a simple, intuitive folder structure.

Power Query: Azure Cost Management
This article describes how to set up the Azure Cost Management connector for Power Query to retrieve cost data from Azure and present it in a Power BI report.


Microsoft Fabric

YouTube Playlist: Learn Microsoft Fabric
Microsoft Fabric is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. With Fabric, you don’t need to piece together different services from multiple vendors. Instead, you can enjoy a highly integrated, end-to-end, and easy-to-use product that is designed to simplify your analytics needs.

Integrating Microsoft Fabric with Databricks
Microsoft unveiled Microsoft Fabric last week, and one of the questions I keep getting is: how does Databricks integrate with Microsoft Fabric? So here’s my take on how these two platforms can be combined to build a super platform that leverages the best aspects of both. Databricks brings the power of Spark and Photon to build efficient data pipelines and complex AI/ML models, while Microsoft Fabric brings the ease of building BI analytics with the help of an AI-based copilot.


Azure OpenAI Service

Manage Azure OpenAI Service quota
Quota provides the flexibility to actively manage the allocation of rate limits across the deployments within your subscription. This article walks through the process of managing your Azure OpenAI quota.

Boost up 4x Request per minute for your AOAI Resources
Explanation of how to fully utilize Azure OpenAI Service quotas and limits, using an architecture with an Application Gateway and Azure Functions that routes requests to multiple resource instances, thereby increasing the number of requests per minute your application can make.
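The core of that architecture is simply spreading calls across several Azure OpenAI resources so each one’s rate limit is consumed independently. A minimal stdlib-only sketch of the routing step (the endpoint names are placeholders, and the real article does this inside Application Gateway and Azure Functions rather than in client code):

```python
import itertools

# Hypothetical pool of Azure OpenAI resource instances; each has its own
# rate limit, so distributing requests across four of them yields roughly
# 4x the aggregate requests per minute.
ENDPOINTS = [
    "https://aoai-eastus.openai.azure.com",
    "https://aoai-westeurope.openai.azure.com",
    "https://aoai-swedencentral.openai.azure.com",
    "https://aoai-uksouth.openai.azure.com",
]

class RoundRobinRouter:
    """Cycle through the endpoint pool so load is spread evenly,
    mimicking what the gateway/function layer does in the article."""

    def __init__(self, endpoints):
        self._pool = itertools.cycle(endpoints)

    def next_endpoint(self):
        return next(self._pool)

router = RoundRobinRouter(ENDPOINTS)
# Eight consecutive requests visit each of the four resources twice.
targets = [router.next_endpoint() for _ in range(8)]
```

In production the same idea is usually combined with retry-on-429 logic, so a throttled instance is skipped instead of blocking the caller.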

Implement logging and monitoring for Azure OpenAI models
This solution provides comprehensive logging and monitoring and enhanced security for enterprise deployments of the Azure OpenAI Service API. The solution enables advanced logging capabilities for tracking API usage and performance and robust security measures to help protect sensitive data and help prevent malicious activity.

Introducing Azure OpenAI Service On Your Data in Public Preview
We are excited to announce the launch of Azure OpenAI Service on your data in public preview, a groundbreaking new feature that allows you to harness the power of OpenAI models, such as ChatGPT and GPT-4, with your own data. This new and highly requested customer capability revolutionizes the way you interact with and analyze your data, providing greater accuracy, speed, and valuable insights. Let’s explore the features, use cases, data sources, and next steps for leveraging Azure OpenAI Service on your data.

Azure OpenAI on your data (preview)
Azure OpenAI on your data enables you to run supported chat models such as ChatGPT and GPT-4 on your data without needing to train or fine-tune models. Running models on your data enables you to chat on top of your data and analyze it with greater accuracy and speed. By doing so, you can unlock valuable insights that can help you make better business decisions, identify trends and patterns, and optimize your operations. One of the key benefits of Azure OpenAI on your data is its ability to tailor the content of conversational AI.


Azure AI Content Safety

What is Azure AI Content Safety?
Azure AI Content Safety detects harmful user-generated and AI-generated content in applications and services. Content Safety includes text and image APIs that allow you to detect material that is harmful. We also have an interactive Content Safety Studio that allows you to view, explore and try out sample code for detecting harmful content across different modalities.
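The text API returns a severity score per harm category, and it is up to application code to decide what to block. A small sketch of that thresholding step, using an illustrative response shape (the actual JSON field names and severity scale in the service may differ from this sketch):

```python
# Illustrative response from a text-analysis call; field names and the
# severity scale are assumptions for this sketch, not the exact API shape.
sample_response = {
    "hateResult": {"category": "Hate", "severity": 0},
    "selfHarmResult": {"category": "SelfHarm", "severity": 0},
    "sexualResult": {"category": "Sexual", "severity": 2},
    "violenceResult": {"category": "Violence", "severity": 4},
}

def flagged_categories(response, threshold=2):
    """Return the harm categories whose severity meets the threshold."""
    return [
        result["category"]
        for result in response.values()
        if result["severity"] >= threshold
    ]

blocked = flagged_categories(sample_response)  # → ['Sexual', 'Violence']
```

Choosing the threshold per category is the interesting design decision: a gaming forum and a children’s app would draw the line at very different severities.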


Development with LLMs

WhyLabs Language Toolkit
LangKit is an open-source text metrics toolkit for monitoring language models. It offers an array of methods for extracting relevant signals from the input and/or output text, which are compatible with the open-source data logging library whylogs.

Guardrails
Guardrails is a Python package that lets a user add structure, type and quality guarantees to the outputs of large language models (LLMs). Guardrails does pydantic-style validation of LLM outputs (including semantic validation such as checking for bias in generated text, checking for bugs in generated code, etc.), takes corrective actions (e.g. reasking LLM) when validation fails, and enforces structure and type guarantees (e.g. JSON).
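The validate-and-correct loop that Guardrails automates can be illustrated with a stdlib-only sketch: check that the model’s output parses as JSON with the expected fields, and re-ask when validation fails. The `ask` callable standing in for the LLM here is hypothetical; Guardrails itself adds much richer pydantic-style and semantic validators on top of this pattern.

```python
import json

def validate(output, required_fields):
    """Return the parsed JSON if the output is valid, otherwise None."""
    try:
        parsed = json.loads(output)
    except json.JSONDecodeError:
        return None
    if not all(field in parsed for field in required_fields):
        return None
    return parsed

def guarded_call(ask, prompt, required_fields, max_retries=2):
    """Call the (hypothetical) LLM, re-asking when validation fails --
    the corrective-action pattern that Guardrails implements for you."""
    for _ in range(max_retries + 1):
        candidate = ask(prompt)
        parsed = validate(candidate, required_fields)
        if parsed is not None:
            return parsed
        prompt = f"Your previous answer was not valid JSON. {prompt}"
    raise ValueError("LLM never produced valid structured output")

# Simulated model: fails once, then returns valid JSON on the re-ask.
answers = iter(['not json', '{"name": "Ada", "age": 36}'])
result = guarded_call(lambda p: next(answers),
                      "Return name and age as JSON.", ["name", "age"])
```

The re-ask prompt mutation is deliberately simple here; Guardrails builds a structured correction prompt that tells the model exactly which constraint it violated.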

Introducing LangChain toolkits for Azure Cognitive Services
LangChain offers a set of tools to integrate your agents with external services like Bing Search, local file systems, YouTube Search, and so on. Some integrations, however, like CSV, Pandas DataFrame, and Gmail, need a particular set of tools to function properly. Those sets of tools are called toolkits, and today we are going to explore the Azure Cognitive Services toolkit to extend LLMs with multimodal capabilities.

LangChain: Using Azure Cognitive Search as Vector Store
LangChain documentation on how to use Azure Cognitive Search as a Vector Store in Python.

Do Foundation Model Providers Comply with the EU AI Act?
In this post, we evaluate whether major foundation model providers currently comply with the draft requirements and find that they largely do not. Foundation model providers rarely disclose adequate information regarding the data, compute, and deployment of their models as well as the key characteristics of the models themselves. In particular, foundation model providers generally do not comply with draft requirements to describe the use of copyrighted training data, the hardware used and emissions produced in training, and how they evaluate and test models. As a result, we recommend that policymakers prioritize transparency, informed by the AI Act’s requirements.


Cool Stuff

ElevenLabs – Prime AI text to speech
We’re a voice technology research company, developing the most compelling AI speech software for publishers and creators. Our goal is to instantly convert spoken audio between languages.


Have a brilliant week!