Azure Every Day

Testing Row-Level Security in the Power BI Service

In my last post, I showed how to test row-level security in Power BI Desktop. You can check out that post here. In this follow-up, you’ll learn how to test row-level security in the Power BI Service once you’ve published your file there. Also, please check out my demo at the end of this post.

  • After navigating to the workspace where I published my file, I need to find the dataset and select Security from its options menu, since row-level security is managed at the dataset level.
  • On the next screen, you should see the role that was created in Power BI Desktop, along with a field to enter the role’s members, in other words, the people or groups who belong to this role.
  • The option to test row-level security appears only when you hover the mouse over the role and the ellipsis appears. Click the ellipsis and choose ‘Test as role’.
  • This takes us to the report page, where row-level security is applied immediately. I can tell because Power BI takes my login credentials and applies them to the report, which matches what I saw when testing in Power BI Desktop.
  • If I want to change the user I’m testing as, I can enter a new email by clicking the menu at the top.
  • One important thing to clarify:
    • Once a Power BI file is published to the Service, other considerations can affect whether row-level security works as expected, such as workspace membership roles, dataset permissions, and dataset security role memberships.
    • For example, if we look at our workspace membership, we see that we have 3 report users with 3 different roles. Row-level security only applies to users in the Viewer role, not to Members, Admins, or Contributors.
    • In my example, I am an Admin of the report, so if I return to the report page and type in my email, I have access to everything and the unfiltered result is returned. The same applies to Contributors: if I type in a Contributor’s email, they get the same result, with access to all the data, just as I did as an Admin.
    • If I type in the email of a Viewer, I expect to see that user’s information filtered on the page, but instead I get a bunch of errors. This is because row-level security does apply to Viewers, and since this Viewer has no role membership, Power BI won’t let them see any data.
    • To give a Viewer access, go back to the row-level security page and, with the role selected, enter the Viewer’s email in the Members field, then click Add and Save.
    • A best practice here is to use security groups when creating role memberships; it’s easier than entering names explicitly and is much easier to maintain.
    • To test whether this worked, I simply click ‘Test as role’, go back to my report page, and enter the name of the Viewer I just added; the page should filter to what that Viewer can now see.
  • What happens when role testing doesn’t work correctly? Check out my graphic below, which gives 3 examples of errors returned and the reasons why role testing may fail.
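The workspace-role rules above (row-level security filters Viewers who belong to a role, errors out Viewers who don’t, and is bypassed by Admins, Members, and Contributors) can be sketched in a few lines of Python. This is a purely illustrative, hypothetical helper, not a Power BI API; it just encodes the behavior described in the bullets:

```python
def rls_view(workspace_role: str, is_role_member: bool) -> str:
    """Return what a user sees on a report with row-level security defined.

    workspace_role: 'Admin', 'Member', 'Contributor', or 'Viewer'
    is_role_member: whether the user was added to an RLS role's Members list
    """
    if workspace_role in ("Admin", "Member", "Contributor"):
        # Workspace Admins, Members, and Contributors bypass RLS entirely
        # and always see the full dataset.
        return "all data"
    if is_role_member:
        # Viewers who belong to a security role see only the rows
        # that the role's filter allows.
        return "filtered data"
    # Viewers with no role membership are blocked and see errors.
    return "error"


print(rls_view("Admin", False))   # workspace role bypasses RLS
print(rls_view("Viewer", True))   # role member gets filtered rows
print(rls_view("Viewer", False))  # no role membership, so access is denied
```

This is why typing an Admin’s or Contributor’s email into ‘Test as role’ returns everything, while a Viewer who hasn’t been added to the role’s Members list gets errors.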
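If you’d rather exercise row-level security programmatically than click ‘Test as role’ in the Service, the Power BI REST API’s GenerateToken endpoint accepts an effective identity: a username plus the RLS roles to apply when the resulting embed token is used. Below is a hedged Python sketch that only builds the request body; the email, role name, and dataset ID are placeholders, and you would still need an Azure AD bearer token to actually call the endpoint:

```python
import json


def build_effective_identity_payload(username: str, role: str, dataset_id: str) -> str:
    """Build the JSON body for a GenerateToken request that evaluates
    the report as `username` under the given RLS role."""
    payload = {
        "accessLevel": "View",
        "identities": [
            {
                "username": username,
                "roles": [role],
                "datasets": [dataset_id],
            }
        ],
    }
    return json.dumps(payload)


# Placeholder values for illustration only.
body = build_effective_identity_payload(
    "viewer@contoso.com", "SalesRep", "00000000-0000-0000-0000-000000000000"
)

# POST this body (with an Authorization: Bearer header) to:
# https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/GenerateToken
print(body)
```

Because the effective identity is evaluated server-side, this reproduces the same filtering (or the same errors) you see in the ‘Test as role’ view.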

I hope this post on testing row-level security in the Power BI Service was helpful. If you have questions or want to know more about row-level security in Power BI, about Power BI in general or any Azure product or service, let us know – our team of experts is here to help.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Steve Wise

How to Upload and Query a CSV File in Databricks

Welcome to another post in our Azure Every Day mini-series covering Databricks. Are you just starting out with Databricks and need to learn how to upload a CSV? In this post I’ll show you how to upload and query a file in Databricks. For a more detailed, step-by-step view, check out my video at the end of the post. Let’s get started!

Andie Letourneau

How to Merge Data Using Change Data Capture in Databricks

My post today in our Azure Every Day Databricks mini-series is about Databricks Change Data Capture (CDC). A common use case for Change Data Capture is for customers looking to perform CDC from one or many sources into a set of Databricks Delta tables. The goal here is to merge these changes into Databricks Delta.

Jon Bloom

Databricks and Azure Key Vault

In our ongoing Azure Databricks series within Azure Every Day, I’d like to discuss connecting Databricks to Azure Key Vault. If you’re unfamiliar, Azure Key Vault allows you to maintain and manage secrets, keys, certificates, and other sensitive information, all stored within the Azure infrastructure.

Jon Bloom

Custom Libraries in Databricks

This week’s Databricks post in our mini-series is focused on adding custom code libraries in Databricks. Databricks comes with many curated libraries added into the runtime, so you don’t have to pull them in yourself. There are pre-installed libraries for Python, R, Java, and Scala, which you can find in the release notes under the System Environment section of the Databricks documentation.

Jeff Burns

How to Integrate Azure DevOps within Azure Databricks

In this post in our Databricks mini-series, I’d like to talk about integrating Azure DevOps within Azure Databricks. Databricks connects easily with DevOps and requires two primary things. The first is a Git repository, which is where we store our notebooks so we can look back and see how things have changed. The second is the DevOps pipeline, which allows you to deploy notebooks to different environments.

Jon Bloom

How to Create an Azure Key Vault in Databricks

Welcome to another edition of our Azure Every Day mini-series on Databricks. In this post, I’ll walk you through creating a key vault and setting it up to work with Databricks. I’ve created a video demo where I will show you how to: set up a Key Vault, create a notebook, connect to a database, and run a query.

Leslie Andrews