Azure Every Day

An Intro to Azure Purview

Have you heard about one of the newest additions to Azure services, Azure Purview? Currently in Preview, Purview is a unified data governance tool that lets you easily create a map of your data landscape. I’m here to tell you more about Purview and share one of my favorite features.

In my video included in this post, I’ll walk you through a demo of how to utilize Azure Purview.

  • I’ll start with a view of a fictional data map. On that map, I have data centers and can group my data centers together.
  • I can also add many different data sources such as SQL Servers, Teradata, Hive Metastore, SAP, Azure Data Lake Storage Gen2, Azure SQL Databases, Power BI, as well as others, and the list is growing every day.
  • It allows for classification of sensitive data. There are built-in system classifications, and you can also create custom classifications. So, if you store bank routing numbers, for example, Purview already knows what that is and how it should be formatted. You can then apply this classification to that field in the data map.
  • An example of a custom classification might be in a police department, where you may want a classification for computer-aided dispatch and case numbers. And almost every organization has a custom employee ID. You can create those classifications and find them in multiple systems by using the automated data discovery process.
  • Purview creates a catalog that is easily searchable by your business users. They simply enter a term and the interface will show matches and suggestions for different assets and the places they can find them.
  • In my opinion, the coolest feature is the data lineage. Here’s how it works:
    • In my demo I have an example of a Power BI Campaign Analytics dashboard. If we start at the dashboard and work our way back, we’ll see there are two visualizations, campaign revenue and digital campaigns.
    • We can see that those come from a dataset that is embedded in the Power BI workspace. It also shows that both datasets come from some final data that had a prep and transform step associated with it, along with showing the five different data sources.
    • The benefit here is being able to show our users where the data comes from, what happens to it, where it ends up, and how it gets used. This can help them understand the complexity of the data.

I am a strong proponent of data governance, but few organizations practice it because it seems like a giant undertaking. I feel it’s one of the most foundational activities an organization can take on to increase its data literacy and IQ. That’s why Purview is so exciting! It allows businesses to easily start cataloging their data and shows immediate value to the stakeholders in the organization.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].


Leslie Andrews

Daylight Savings Time Changes in Power Query

In this post, I’ll show how to accommodate Daylight Savings Time changes in a Power BI file. The video I’ve included shows the formulas I use and how they adapt from Daylight Savings Time to Standard Time. There are many ways to do this, and it’s best to find one that fits what you need to do, but I’ll walk through how I handle it for my needs within Power Query.

  • The first thing I do in Power Query is identify the start and end date of Daylight Savings Time. In the US, the start date is the second Sunday in March and the end date is the first Sunday in November, and I’ve added the time of 2:00am.
  • You could look these up and make a list of them for the next 8 or 10 years, but to do it formulaically, I use DateTime.LocalNow and Date.Year to pull out whatever year we are in.
  • In the Applied Steps area in my Query Settings, I create a small temporary calendar table using that year, starting in March and ending in November. I create a list of those dates under Source and then I convert that list into actual dates.
  • Now I have a calendar table with Column1 listing all dates from the start of March through November. Using the Added Custom steps in Applied Steps, I add a column for the month number and one for the day-of-week number (0-6).
  • Since I’m only interested in Sundays in March and November, I filter this table to show me only Sundays (day of week 0) in those months. More specifically, I need to know the second Sunday in March and the first Sunday in November. To do that, I want to group rows by their month number and then add an index column.
  • If I click on one of the columns in my table, I can see a Sunday index counting in order (1st, 2nd, 3rd Sunday). If I remove the other columns, keep just the one with the index, and expand it, I have all the information pieces I need (month number and Sunday index columns).
  • Next, I’ll add a little helper column (GetDays) that will combine the month number and Sunday index columns together. I filter that helper column for the 2nd Sunday in March and the 1st Sunday in November.
  • So, I have all I need except a time, so I’ll put a time component in and combine it with the date to get a DateTime column.
  • It’s important to note that when we do comparisons in Power Query between dates, date times, or date time zones, you want to be consistent with your format. If I’m going to compare date time zone values, I need to convert my date times into date time zones. I also need to ensure that the time zone is the same, so I’m going to apply a consistent zone switch. I’ll add a time zone to my start and end dates.
  • I’ve added two queries (StartTime and EndTime) in which I’ll reference that DateTime list. In my formulas, I’ll write in the DateTime in March as the min and the end DateTime in November as the max. I’ll also need to add the time zone to each and make sure I’m consistent. Check out my video for more detail of the formulas.
  • The logic here is an If/Then statement: if the switched time zone, which I’ve set to -4 (Eastern Time during Daylight Savings), is greater than the start time and less than the end time, then we are in Daylight Savings and the time is correct.
  • If it’s not in between those two dates, then we’re not in Daylight Savings and we need to subtract an hour from it because now we’re going to fall back. In other words, we’re going to be 5 hours behind instead of four. If I refresh it again, it should match the time that’s on my computer. In my SwitchTime Zone column, I see the time that it was in the summer and in my DateTime with DST Correction column I see the correct time now which matches my computer, so I know that this is working correctly.
  • But how do I know it’s working to the exact second? I’ll just duplicate my three queries and create a testing query. In this testing query I will arbitrarily set the start and end time to be a specific day and time to check back on my table to see if it matches. This testing helps ensure that it’s working perfectly.
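Putting those steps together, here is a condensed Power Query (M) sketch of the idea. This is an illustration rather than the exact code from the video: it collapses the temporary calendar table into list operations, and the step names are my own.

```m
let
    // Current year, derived formulaically rather than hard-coded
    CurrentYear = Date.Year(DateTime.LocalNow()),
    // Every date from March 1 through November 30 of the current year
    StartOfRange = #date(CurrentYear, 3, 1),
    EndOfRange = #date(CurrentYear, 11, 30),
    Dates = List.Dates(StartOfRange, Duration.Days(EndOfRange - StartOfRange) + 1, #duration(1, 0, 0, 0)),
    // Keep only Sundays (day-of-week 0 when the week starts on Sunday)
    Sundays = List.Select(Dates, each Date.DayOfWeek(_, Day.Sunday) = 0),
    // Second Sunday in March and first Sunday in November, at 2:00 AM
    DstStart = List.Select(Sundays, each Date.Month(_) = 3){1} & #time(2, 0, 0),
    DstEnd = List.Select(Sundays, each Date.Month(_) = 11){0} & #time(2, 0, 0)
in
    [DstStart = DstStart, DstEnd = DstEnd]
```

These two values play the same role as the StartTime and EndTime queries described above: anything between them is in Daylight Savings, anything outside them is not.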

My video below goes into more detail and may make this written demo clearer. I hope my peek into how I can accommodate for Daylight Savings within Power Query is helpful. This is certainly not the only way to do this, but it works well for me, so I thought it was worth sharing.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected]

Steve Wise

Azure Data Factory 101

Are you just starting out with Azure Data Factory? In this post, I’ll give you an introduction to Azure Data Factory, covering what it is, how it works and how to set it up. Within the video included in this post is a short demo of how to create and access the platform.

What is Data Factory?

  • Here is a clear definition that I found from Cloud Academy: “Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation.”

How Does Data Factory Work?

  • The key components of Data Factory are pipelines, activities, and datasets.
  • Pipelines are made up of activities. There are 3 types of activities:
    • Movement – the copy activity
    • Transformation – including Azure Functions, HD Insight, Stored Procedures, Spark and Databricks
    • Control – ForEachLoops, If Condition Branching, Wait Times, and Validations
  • Datasets represent the inputs and the outputs of the activities.
  • Linked Services – these are the connection strings and authentication for all types of sources for the data sets.
  • Data Flows – these are where you apply logic to your datasets and transform the data.
    • Mapping Data Flows are graphical with drag and drop functionality.
    • Wrangling Data Flows are more like using Power Query or M.
  • Integration Runtime – allows you to do data integration across different network environments. There are three types of runtimes: Azure, Self-hosted, and Azure-SSIS. Where the data you need to copy resides determines which of these is appropriate for the use case.
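To see how these components fit together, here is a hedged sketch of a pipeline definition in JSON with a single copy activity. The pipeline and dataset names are placeholders I made up for illustration, not names from the demo.

```json
{
  "name": "CopySalesDataPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyBlobToSql",
        "type": "Copy",
        "inputs": [ { "referenceName": "BlobSalesDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SqlSalesDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

Each dataset referenced here would in turn point at a linked service that holds the connection string and authentication for the underlying store.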

In the video below, I provide a brief walk through of how to access and create in Azure Data Factory. Please check it out, as I think it is a good resource for those just starting out.

Our Azure Every Day series is another great resource. 3Cloud consultants post weekly blogs and videos around all things Azure. You’ll find all current and past blogs on our website or by clicking here.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Leslie Andrews

A Few Tips When Creating Power BI Mobile Reports

Power BI is the best BI and reporting tool out there, and so is Power BI Mobile. Microsoft has made mobile access a first-class experience in the Power BI platform. All the heavy lifting, like authentication, data security and network connectivity, is delivered automatically out of the box.

Just download the app and sign in once, then sit back and enjoy access to all your reports and dashboards. You’ll also get integration with your mobile device’s virtual assistant, so you can ask Siri for the sales report and she’ll pull it up for you to view.

Awesome stuff, right? But as a developer, you may not know how to deliver an optimized view of your analysis for mobile consumption. Here, I’ll walk you through creating a mobile version of your report. I’ll point out the major features, as well as some valuable tips and tricks.

  • In the Power BI Desktop view, you can look at the design for the mobile view by clicking on View and then Mobile Layout. This gives me a blank canvas, and on the right side is a visualizations palette, which at this point only lists the visualizations from my desktop report.
  • I can pull over one of my desktop visuals and add my report title and company logo, but honestly, it’s not that great. I want to change the way it looks or even create a totally different visual just for my mobile layout, but unfortunately, I’m stuck with the visuals that I have on my desktop.
  • To get around this, here’s my first big tip for a workaround: you’ll need to create visuals, put them on your desktop version, and then hide them. Let’s walk through this:
    • The title bar ‘Sales Analysis’ on my mobile view is huge, so I want to make it smaller. I go back to my desktop, make a copy of my title bar, and create a version with a smaller font size.
    • Next, I go over to my mobile layout, get rid of the large one and add the smaller one.
    • Here’s the problem: when I go back to the desktop, I want to get rid of that smaller-font title text box, but when I delete it, it disappears from the mobile view as well.
    • If I try to hide it in the desktop view, it will also hide it in my mobile view.
    • So, my trick is to hide that text box with the smaller font behind a larger text box in my desktop view. All I need to do is line up that mobile-only text box on top of the other one, go to Format and click Send to Back. You can also do this in the Selection pane on the right side by dragging the text box to the bottom of the list, which puts it behind everything higher on the list.
  • Next, I want to create a visual that is more meaningful in my mobile view. In my demo, I create a copy of my bar chart visual and I filter it to show only the most recent 3 months of my data.
  • I’ll shrink that down a bit and hide it behind the larger bar chart on my desktop. That new 3-month bar chart will appear in my mobile visualizations palette, and I can pull it over onto the canvas.
  • When I save this and publish it, I’ve created one version: users who view it from the desktop get the desktop view, while users who view it from their phone see the desktop view in horizontal mode and the mobile view in vertical mode.

When I’m on my phone and open the Power BI app, I see two different icons. The mobile version I created and published has a different icon with a phone on it. This way end users can quickly identify the version that is intended for the mobile use. I can rotate my phone horizontally and see the desktop version, but the goal of creating a mobile view of reports is to just have distinct data (in my case, the most recent 3 months) available for a quick update.

I can also add a quick Siri shortcut by simply telling Siri that when I say, ‘open mobile sales report’, my phone will go right to that mobile report. These tips should be super helpful when creating Power BI mobile reports.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Chris Silvernail

Avoiding Issues When Generating Time/Date in Power BI

Have you ever run into hurdles when trying to generate time and date in Power BI? I know I have, so I’m here to help and to demonstrate the differences between generating time and date in Power BI Desktop vs in the Power BI Service.

A few key points before I dig in:

When we generate time and date in Power BI Desktop using the native Power BI tools, it pulls that information from our local computer settings.

In the Power BI Service, the Service generally runs in a different server time zone and date than your local machine. Therefore, there will be some differences between the times as they appear in the Desktop file and how they appear in the Service.

Currently, there is no native functionality to handle Daylight Savings Time changes. Keep this in mind when you start to generate your own time and dates – you’ll need to create tools that take this into account.

What I’ve done will be much easier to see by watching my video instead of trying to explain in text, as it could be confusing. Here’s a brief overview of my video.

  • In my video, you’ll see that I went into Power Query and copied the different queries that I used to generate time and date.
  • The first one generates time and date in Power Query using DateTimeZone.LocalNow to pull the local time. Remember, when operating in Power BI Desktop, this pulls information from my local machine.
  • What I’m going to do is reuse some of the formulas and look at how we can switch the time and what those changes look like as I go through the queries I’ve highlighted.
  • In my demo, you’ll see that rather than publishing this file and jumping in the Power BI Service to look at it, I’ve taken the queries that I’ve used to generate my times, put them in the Service as a data flow, and then imported that dataflow into a Power BI Desktop file.
  • So, I have 2 pages; one shows the information from the data flow and the other is the information generated locally. This way I can show side by side what the changes are going to look like.
  • I’ll walk you through the steps of these scenarios and the queries I used:
    • Manually switching time using #duration
    • Modifying date/time to reflect Eastern Daylight Time (UTC-4) using SwitchZone
    • Modifying time/date in Power Query using SwitchZone and FixedLocalNow (time at start of query execution)
    • Adjusting time -4 or -5 based on daylight saving calendar
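As a rough illustration of the SwitchZone approach, here is a minimal Power Query (M) sketch. The step names are mine, not the ones from the video, and the -4 offset assumes Eastern Daylight Time.

```m
let
    // UTC "now" as a datetimezone value; in the Service this is server time
    UtcNow = DateTimeZone.UtcNow(),
    // Shift the zone to Eastern Daylight Time (UTC-4)
    EasternNow = DateTimeZone.SwitchZone(UtcNow, -4),
    // FixedLocalNow is evaluated once, at the start of query execution
    FixedNow = DateTimeZone.SwitchZone(DateTimeZone.FixedLocalNow(), -4)
in
    EasternNow
```

Because SwitchZone works on a datetimezone value rather than the machine’s local settings, the same query produces consistent results in both the Desktop and the Service.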

In summary, this post covers the changes to date/time as we move from Power BI Desktop to the Service and which formulas are best when we are generating our own time. The key takeaway is that SwitchZone is the best way to adjust between the Desktop and the Service.

I think you’ll find this helpful when running into date/time hurdles in Power BI.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Steve Wise

What is Delta Lake in Databricks?

If you’re not familiar with Delta Lake in Databricks, I’ll cover what you need to know here. Delta Lake is a technology developed by the same developers as Apache Spark. It’s designed to bring reliability to your data lakes: it provides ACID transactions, handles metadata at scale, and unifies streaming and batch data processing.

Let’s begin with some of the challenges of data lakes:

  • Data lakes are notoriously messy as everything gets dumped there. Sometimes, we may not have a rhyme or reason for dumping data there; we may be thinking we’ll need it at some later date.
  • Much of this mess is because your data lake will have a lot of small files and different data types. Because there are many small files that are not compacted, trying to read them in any shape or form is difficult, if not impossible.
  • Data lakes often contain bad data or corrupted data files so you can’t analyze them unless you go back and pretty much start over again.

This is where Delta Lake comes to the rescue! It delivers an open-source storage layer that brings ACID transactions to Apache Spark big data workloads. So, instead of the mess I described above, you get a reliability layer over your data lake from Delta Lake. Delta Lake provides ACID transactions through a log associated with each Delta table created in your data lake. This log records the history of everything that was ever done to that table or data set, so you gain a high level of reliability and stability in your data lake.

Key Features of Delta Lake are:

  • ACID Transactions (Atomicity, Consistency, Isolation, Durability) – With Delta you don’t need to write any code; transactions are automatically written to the log. This transaction log is the key, and it represents a single source of truth.
  • Scalable Metadata Handling – Handles terabytes or even petabytes of data with ease. Metadata is stored just like data, and you can display it using a syntax feature called DESCRIBE DETAIL, which describes all the metadata associated with the table. This puts the full force of Spark behind your metadata.
  • Unified Batch & Streaming – No longer a need to have separate architectures for reading a stream of data versus a batch of data, so it overcomes limitations of streaming and batch systems. Delta Lake Table is a batch and streaming source and sink. You can do concurrent streaming or batch writes to your table and it all gets logged, so it’s safe and sound in your Delta table.
  • Schema Enforcement – this is what makes Delta strong in this space, as it enforces your schemas. If you put a schema on a Delta table and try to write data that does not conform to that schema, Delta will raise an error and refuse the write, protecting you from bad writes. The enforcement methodology reads the schema as part of the metadata; it looks at every column, data type, etc. and ensures that what you’re writing to the Delta table matches the table’s schema, so there’s no need to worry about writing bad data to your table.
  • Time Travel (Data Versioning) – you can query an older snapshot of your data, giving you data versioning and the ability to roll back or audit changes.
  • Upserts and Deletes – these operations are typically hard to do without something like Delta. Delta allows you to do upserts or merges very easily. Merges are like SQL merges into your Delta table and you can merge data from another data frame into your table and do updates, inserts, and deletes. You can also do a regular update or delete of data with a predicate on a table – something that was almost unheard of before Delta.
  • 100% Compatible with Apache Spark
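To make a few of these features concrete, here is a short Spark SQL sketch with hypothetical table names showing schema definition, an upsert via merge, time travel, and metadata inspection on a Delta table:

```sql
-- Create a Delta table with an enforced schema
CREATE TABLE sales (id INT, amount DOUBLE, sale_date DATE) USING DELTA;

-- Upsert: merge new or changed rows from another table or view
MERGE INTO sales AS target
USING sales_updates AS source
  ON target.id = source.id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

-- Time travel: query an older snapshot of the table by version
SELECT * FROM sales VERSION AS OF 3;

-- Inspect the metadata associated with the table
DESCRIBE DETAIL sales;
```

Every one of these statements is recorded in the Delta transaction log, which is what makes the versioned queries and safe concurrent writes possible.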

Delta Lake is really a game changer and I hope you educate yourself more and start using it in your organization. You’ll find a great training resource from the Databricks community at:

Or reach out to us at 3Cloud. Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].


Brian Custer

Azure Synapse Analytics Now in GA and the Public Preview of Azure Purview

I’m here with some exciting news from Microsoft! Last week at a digital conference, Satya Nadella announced the general availability of Azure Synapse Analytics and the preview of Azure Purview, a unified data governance service. Azure Synapse Analytics has been gaining traction while in preview, and adding Azure Purview gives businesses the ability to get the most out of their data and analytics.

Let’s talk about Azure Purview. This is a comprehensive data governance service that helps organizations discover all data across the organization. Demos at the digital conference showcased different ways you can use Purview for governance. Some key capabilities are the ability to go multi-cloud, covering not only Azure but other clouds as well. You can also connect it to your on-prem environment and your Azure data assets.

For quite some time, those of us in the data disciplines have worked to inventory all the different aspects of data, like column, database and table names, etc., and put all those pieces into a common repository, often referred to as a data dictionary. Microsoft has been working for years to create a product that would be comprehensive enough to help most people with their governance and compliance needs. We’ve now got this with Azure Purview.

Some key highlights pointed out are:

  • A business glossary – no need to manually build a data dictionary.
  • Automated data classification – lets you know things like data type (a Social Security number, for instance). You also have custom options and can schedule future scanning and classification on a routine basis. This way you’re getting continual updates, as opposed to a data dictionary, where you get a snapshot in time unless you manually update it.
  • Cloud-based search facility – gives you the ability to find things quickly and easily across a broad series of data assets.
  • Data lineage and reporting – supports the end-to-end data lifecycle.
  • Power BI facilities

I feel Azure Purview is a very strong offering. Without it, I would have had to either create my own versions of these pieces or use something like Embarcadero, which I used years ago. Another thing to note is that the experience is very similar to the canvas workspace experience in Azure Synapse Analytics, so if you’ve been working with that, it will feel very familiar.

The next part of Microsoft’s announcement is that Azure Synapse Analytics is now generally available. Azure Synapse Analytics is a limitless analytics service which brings together traditional data warehouse and big data analytics in one offering. It brings these together for a unified experience to ingest, prepare, manage, and serve data for immediate machine learning and BI applications. I, and many of our customers, have been using this great product a lot, so this going GA is surely exciting news.

Some noteworthy things with Azure Synapse Analytics are:

  • A new native cloud distributed SQL engine
  • Deep integration with Spark
  • Flexible query options such as serverless and dedicated
  • Integration with Power BI and machine learning
  • TPC-H benchmark at petabyte scale
  • Native Row Level Security (this is not possible with Amazon Redshift or Google BigQuery)
  • Native ML integration for the citizen data scientist
  • Code management – by that, they’re talking about Azure DevOps as another piece that plays well with it.
  • Power BI integration to Teams which I found to be kind of cool

Again, great announcements with both the general availability of Azure Synapse Analytics and the public preview of Azure Purview. These two products combined empower teams to remove data silos and leverage all data for analytics and data governance.

Need further help with these or any Azure product or service? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Rowland Gosling

Setting Variables Using DAX in Power BI

Welcome to another Azure Every Day focused on one of my favorite topics, using DAX in Power BI. In a previous post, I covered a couple of my favorite DAX formulas, CALCULATE and FILTER. Here, I’ll discuss variables and how to work those into CALCULATE/FILTER to expand on that and make it even more powerful. You may want to check out my previous post here, as I’ll be continuing using that demo.

  • I started by building in a CALCULATE/FILTER function in a table to calculate the beginning balance for 2017 for all my assets.
  • My code (see my video demo for code detail) tells it to calculate the sum of the beginning balance and I filtered the table where fiscal year equals 2017, and finance type equals assets.
  • Now, let’s say we want to know the assets for every year, not just 2017. To do this, we need to set the year into a variable and then it will calculate that asset for each individual year.
  • You use the VAR keyword to set the variable in the code and you’ll need to give it a name. In my case I’ll use YR for year and set it equal to the fiscal year. It’s important to note that any time you set variables, you must follow them with the RETURN keyword at the end.
  • Next, I’m going to update my FILTER. My code is calculating the sum of the beginning balance and I’m filtering the table where fiscal year equals 2017, but now I want to take that out and change it to fiscal year equals year.
  • How this works is when this goes through each row it will calculate for each year by using the variable instead of hard coding the year into there.
  • So, where previously we had only one outcome, for 2017, now when we submit this we’ll see four outcomes, since we have four years’ worth of data: a beginning balance calculation for each year.
  • We can even step this up a bit if we wanted not only the beginning balance for each year, but also for each finance type. Maybe we don’t just want the assets but other values like equity, expense or liability.
  • All we need to do is set another variable that I’ll call FT for finance type. And instead of doing this for where finance type equals asset, we’ll say where finance type equals our variable.
  • Now we’ll have the calculation for every year for every individual finance type.
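As a sketch of what the finished calculated column might look like, here is the pattern in DAX. The table and column names are assumptions based on the demo, so check the video for the exact code.

```dax
// Sketch only: "Balances" and its column names are placeholders
Beginning Balance by Year and Type =
VAR YR = Balances[Fiscal Year]
VAR FT = Balances[Finance Type]
RETURN
    CALCULATE (
        SUM ( Balances[Beginning Balance] ),
        FILTER (
            ALL ( Balances ),
            Balances[Fiscal Year] = YR
                && Balances[Finance Type] = FT
        )
    )
```

Because the variables capture each row’s own year and finance type, the same expression returns the right beginning balance on every row instead of a single hard-coded 2017 result.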


I hope this quick example helps you to start using these DAX formulas in your reports. The CALCULATE/FILTER functions and using variables are something I use all the time in Power BI.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Alex Beech

DAX CALCULATE and FILTER Functions in Power BI

We all know Power BI is a powerful analytics tool and it’s easy to create a new Power BI Desktop file, import some data into it and create reports that show valuable insight. Adding Data Analysis Expressions, better known as DAX, enables you to get the most out of your data.
If you’re new to Power BI or DAX, DAX is a collection of formulas, or expressions, that are used to calculate and return one or more values. DAX helps you to use data already in your model and create new information from it that can solve real business problems.
When I first started using DAX functions, it brought my Power BI skills to the next level. I was able to tackle some analytical needs that I had struggled with in the past. I’m here to share a couple of favorite formulas that I use all the time: the CALCULATE function and the FILTER function. Please be sure to watch my video included in this post as I walk through using these DAX formulas.
• In my demo, I’m working with a data set to find the beginning balance for 2017 for our assets.
• To do that I need to sum a column in my table called beginning balance when fiscal year equals 2017 and when financial type equals asset.
• I’ll do this by using a combination of the CALCULATE function and the FILTER function. The CALCULATE function allows you to evaluate an expression across the entire table.
• In my code I’m going to CALCULATE the sum on our beginning balance. This would calculate the sum for the entire table.
• But we only want to calculate the sum for 2017 and just the asset finance type. For this, once we have calculated the table, we need to filter that table. Think of the FILTER function as making a virtual table in the background.
• We need to FILTER it where fiscal year equals 2017 and where finance type equals asset. In my code, I’ll add FILTER for the function, and we need to tell it what table we are going to be filtering, in my case it’s the balance table. Then add where fiscal year equals 2017 and where finance type equals asset.
• Using these DAX functions, our result will show the beginning balance for our assets for 2017.
• My video shows you exactly how to write the code I used here, so be sure to check it out.
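The measure described above would look something like this in DAX. My table and column names here are placeholders; the video shows the exact code.

```dax
// Sketch only: "Balances" and its column names are placeholders
Assets Beginning Balance 2017 =
CALCULATE (
    SUM ( Balances[Beginning Balance] ),
    FILTER (
        Balances,
        Balances[Fiscal Year] = 2017
            && Balances[Finance Type] = "Asset"
    )
)
```

CALCULATE evaluates the sum, and FILTER supplies the virtual table restricted to 2017 asset rows.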

As you can see, this is super simple, and the formula allows you flexibility in how you write it. You can FILTER tables in many ways and use different functions within CALCULATE. I hope you enjoyed this simple use case of these powerful DAX functions in Power BI.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected]

Alex Beech

Create a Custom Visual in Power BI with React

Welcome to another edition of Azure Every Day! I’m an App Development Consultant at 3Cloud and in this post I’ll show you how to create a custom visual in Power BI with React.
When creating a custom visual in Power BI, you use the TypeScript language. You’ll be fine with TypeScript if you are familiar with JavaScript. Once you’ve created a custom visual from scratch, you can test and debug your visuals online using live reports, as well as package and deploy your visual to a production report. You can check out the Microsoft documentation about custom visuals to learn more.
I’ll walk you through the process here, but also be sure to watch my video included at the end of this post.
• To get started, be sure you have Node.js installed on your machine.
• Navigate to your project directory and run the npm install command for the Power BI visual tools. Once the Power BI visual tools are installed, run pbiviz new followed by your project name.
• Next, run pbiviz start which will start your custom visualization in the local host.
• To debug your visualization, go to the Power BI Service and create a new report (or use an existing one), then add the developer visual to the report.
• In order to add React elements to your custom visual, you’ll need to install React and ReactDOM. Import React and ReactDOM into the visual.ts file or whatever file you will be using to render the HTML elements to the DOM.
• Create a React element within visual.ts. Also create the DOM element that this React element will be rendered into, and add it to the DOM.
• Next, create the React component using React.createElement of the component name that you’ve imported, then pass any props you would like in the second parameter of the React.createElement function.
• Finally, you can add the React element to the DOM by using ReactDOM.render and reference the React component that you built and the HTML element on the page that you would like to add the React element to.
• When you’re finished building and debugging your project, you can set the project and author details in the pbiviz.json file.
• Run the pbiviz package command to generate a pbiviz file to import into your Power BI report.
• Now you can go into your report and import a visual from file. Select the visual and add the data you would like to add to your visual and configure the settings you had previously.
• Please check out my video for code detail on all the above steps.
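For reference, the React wiring described above looks roughly like this inside visual.ts. This is a hedged sketch: the component name, its props, and the file layout are my assumptions, and the video has the exact code.

```typescript
// visual.ts – sketch only; MyCardComponent and its props are hypothetical
import * as React from "react";
import * as ReactDOM from "react-dom";
import powerbi from "powerbi-visuals-api";
import IVisual = powerbi.extensibility.visual.IVisual;
import VisualConstructorOptions = powerbi.extensibility.visual.VisualConstructorOptions;
import VisualUpdateOptions = powerbi.extensibility.visual.VisualUpdateOptions;
import { MyCardComponent } from "./myCardComponent";

export class Visual implements IVisual {
    // The DOM element Power BI hands the visual to render into
    private target: HTMLElement;

    constructor(options: VisualConstructorOptions) {
        this.target = options.element;
    }

    public update(options: VisualUpdateOptions) {
        // Create the React element, passing report data in as props
        const element = React.createElement(MyCardComponent, {
            value: options.dataViews[0]?.single?.value,
        });
        // Render the React element into the visual's DOM element
        ReactDOM.render(element, this.target);
    }
}
```

ReactDOM.render is called on every update, so the component re-renders whenever Power BI pushes new data to the visual.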


This post walked you through the steps to create a custom visual in Power BI using React. I hope you’ll give it a try.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected]

Tom Ward