TWIL: November 20, 2022
This week I focused on some Data Architecture topics such as Data Mesh, Data Sharing, and Disaster Recovery. There are also some articles on Power BI performance and security, as well as Databricks pricing optimization. I hope you find them useful.
Episode 1819: Hacking APIs with Dana Epp
Are your APIs vulnerable to hacking? Carl and Richard talk to Dana Epp about how APIs have become the focus of black hats today. Dana talks about tooling you can use to look at your APIs the same way the hackers do, and find potential exploit paths for impersonating users, stealing data, and otherwise exploiting your system. There’s an OWASP list specifically for API security – spend some time with it!
What is a data mesh?
Data mesh is an architectural pattern for implementing enterprise data platforms in large, complex organizations. It helps scale analytics adoption beyond a single platform and a single implementation team.
DR for Azure Data Platform
This series of articles provides an illustrative example of how an organization could design a disaster recovery (DR) strategy, describe the process for recovering service for an enterprise Azure Data platform in the event of a disaster, and test that DR process.
Lightweight Implementation of Self-Service Data Sharing Platform on Azure
In a vast number of companies, obtaining access to a dataset requires dozens of emails, meetings, and a great deal of ineffective communication. Bureaucracy, system complexity, and inefficient processes hinder our ability to work effectively. The main factor affecting our ability to comply with the requirements of our businesses is the complexity and inefficiency of legacy data governance processes.
Best practice rules to improve your model’s performance
There are a plethora of articles, blog posts, and videos which share recommendations for best practices for Power BI and tabular modeling. It is of course essential to read these in order to learn the proper development approach. In this post, we are sharing a set of rules which you can add to your instance of Tabular Editor. Within seconds it scans your entire model against each of the rules and provides a list of all the objects which satisfy the condition in each rule.
DirectQuery model guidance in Power BI Desktop
This article targets data modelers developing Power BI DirectQuery models, using either Power BI Desktop or the Power BI service. It describes DirectQuery use cases, limitations, and guidance. Specifically, the guidance is designed to help you determine whether DirectQuery is the appropriate mode for your model, and to improve the performance of your reports based on DirectQuery models. This article applies to DirectQuery models hosted in the Power BI service or Power BI Report Server.
Row-level security (RLS) with Power BI
Row-level security (RLS) with Power BI can be used to restrict data access for given users. Filters restrict data access at the row level, and you can define filters within roles. In the Power BI service, members of a workspace have access to datasets in the workspace. RLS doesn’t restrict this data access.
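In Power BI, RLS filters are DAX expressions defined on roles, but the underlying idea is easy to illustrate outside the product. A conceptual sketch in Python (the role names and data below are invented for illustration, not part of any Power BI API):

```python
# Conceptual illustration of row-level security: each role carries a
# row filter, and a user only ever sees rows that their role's filter
# admits. In Power BI, these filters would be DAX expressions such as
# [region] = "East" defined on a role.

rows = [
    {"region": "East", "sales": 100},
    {"region": "West", "sales": 250},
]

# Hypothetical roles, each mapping to a row-level filter predicate.
roles = {
    "east_analyst": lambda row: row["region"] == "East",
    "all_regions": lambda row: True,
}

def visible_rows(role: str) -> list:
    """Return only the rows the given role is allowed to see."""
    return [r for r in rows if roles[role](r)]

print(visible_rows("east_analyst"))  # [{'region': 'East', 'sales': 100}]
```

Note the last point above: this filtering applies when data is read through the role, while workspace members who can access the dataset directly are not restricted by it.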
Optimize Azure Databricks costs with a pre-purchase
You can save on your Azure Databricks unit (DBU) costs when you pre-purchase Azure Databricks commit units (DBCU) for one or three years. Unlike VMs, the pre-purchased units don’t expire on an hourly basis; you can use them at any time during the term of the purchase. Any Azure Databricks usage is deducted from the pre-purchased DBCUs automatically. You don’t need to redeploy or assign a pre-purchase plan to your Azure Databricks workspaces for the DBU usage to get the pre-purchase discounts.
How Azure Databricks pre-purchase discount is applied
Databricks pre-purchase applies to all Databricks workloads and tiers. You can think of the pre-purchase as a pool of pre-paid Databricks commit units. Usage is deducted from the pool, regardless of the workload or tier.
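The pool mechanics above come down to simple arithmetic: every workload draws from the same pool, just at a different rate. A minimal sketch in Python, where the DBCU-per-DBU rates and usage figures are hypothetical illustrations, not published Azure pricing:

```python
# Sketch of deducting Azure Databricks usage from a pre-purchased DBCU
# pool. All rates and usage numbers here are hypothetical.

# Hypothetical DBCU cost per DBU for each (workload, tier) combination.
DBCU_PER_DBU = {
    ("jobs_compute", "standard"): 0.15,
    ("jobs_compute", "premium"): 0.30,
    ("all_purpose_compute", "premium"): 0.55,
}

def deduct(pool: float, workload: str, tier: str, dbus_consumed: float) -> float:
    """Deduct usage from the pool; any workload/tier draws from the same pool."""
    return pool - DBCU_PER_DBU[(workload, tier)] * dbus_consumed

pool = 10_000.0  # pre-purchased DBCUs
pool = deduct(pool, "jobs_compute", "premium", 2_000)       # 600 DBCUs used
pool = deduct(pool, "all_purpose_compute", "premium", 500)  # 275 DBCUs used
print(f"Remaining DBCUs: {pool}")  # 10000 - 600 - 275 = 9125.0
```

The point of the single pool is that you don't have to forecast your workload mix up front; only the total commitment matters.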
Azure Databricks Pricing
No up-front costs. Pay only for the compute resources you use, at per-second granularity, with simple pay-as-you-go pricing or committed-use discounts.
Microsoft Bot Framework
Moving from LUIS to CLU in Bot Composer Projects
As LUIS (Language Understanding) enters end-of-life, I’ve been getting some requests for guidance on how to smoothly migrate to the new CLU (Conversational Language Understanding), the spiritual successor to LUIS and part of the new Azure Cognitive Service for Language product. While this might seem daunting at first, it doesn’t have to be a major headache if you know a few tips and tricks to make the migration smoother.
Have an awesome week!