Azure Every Day

Azure Data Factory 101

Are you just starting out with Azure Data Factory? In this post, I’ll give you an introduction to Azure Data Factory, covering what it is, how it works, and how to set it up. The video included in this post features a short demo of how to create and access the platform.

What is Data Factory?

  • Here is a clear definition I found from Cloud Academy: Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation.

How Does Data Factory Work?

  • The key components of Data Factory are pipelines, activities, and datasets.
  • Pipelines are made up of activities. There are three types of activities:
    • Movement – the Copy activity (see the pipeline sketch after this list)
    • Transformation – including Azure Functions, HDInsight, Stored Procedures, Spark, and Databricks
    • Control – ForEach loops, If Condition branching, Wait, and Validation activities
  • Datasets represent the inputs and the outputs of the activities.
  • Linked Services – the connection strings and authentication details for all the types of sources behind the datasets.
  • Data Flows – work against datasets, letting you apply logic and transform the data.
    • Mapping Data Flows are graphical with drag and drop functionality.
    • Wrangling Data Flows are more like using Power Query or M.
  • Integration Runtime – allows you to do data integration across different network environments. There are three types of runtime: Azure, Self-hosted, and Azure-SSIS. Where the data you need to copy resides determines which of these is appropriate for the use case.
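
To make these pieces concrete, below is a minimal sketch of a pipeline definition in JSON, showing how a pipeline wraps a Copy activity that reads one dataset and writes another. The names (CopyBlobToSqlPipeline, InputBlobDataset, and so on) are illustrative, not from an actual factory:

    {
      "name": "CopyBlobToSqlPipeline",
      "properties": {
        "activities": [
          {
            "name": "CopyBlobToSql",
            "type": "Copy",
            "inputs": [ { "referenceName": "InputBlobDataset", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "OutputSqlDataset", "type": "DatasetReference" } ],
            "typeProperties": {
              "source": { "type": "BlobSource" },
              "sink": { "type": "SqlSink" }
            }
          }
        ]
      }
    }

Each dataset, in turn, points at a linked service that holds the actual connection information.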

In the video below, I provide a brief walkthrough of how to access and create in Azure Data Factory. Please check it out, as I think it is a good resource for those just starting out.

Our Azure Every Day series is another great resource. 3Cloud consultants post weekly blogs and videos around all things Azure. You’ll find all current and past blogs on our website or by clicking here.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Leslie Andrews

A Few Tips When Creating Power BI Mobile Reports

Power BI is the best BI and reporting tool out there, and so is Power BI Mobile. Microsoft has made mobile access a first-class experience in the Power BI platform. The heavy lifting, like authentication, data security, and network connectivity, is delivered automatically out of the box.

Just download the app and sign in once, then sit back and enjoy access to all your reports and dashboards. You’ll also get integration with your mobile device’s virtual assistant, so you can ask Siri for the sales report and she’ll pull it up for you to view.

Awesome stuff, right? But as a developer, you may not know how to deliver an optimized view of your analysis for mobile consumption. Here, I’ll walk you through creating a mobile version of your report. I’ll point out the major features, as well as some valuable tips and tricks.

  • In Power BI Desktop, you can see the mobile design by clicking on View and then Mobile Layout. This gives me a blank canvas, and on the right side is a visualizations palette, which at this point only lists the visualizations from my desktop report.
  • I can pull over one of my desktop visuals and add my report title and company logo, but honestly, it’s not that great. I want to change the way it looks or even create a totally different visual just for my mobile layout, but unfortunately, I’m stuck with the visuals that I have on my desktop.
  • Here’s my first big tip for a workaround: you’ll need to create visuals, put them on your desktop version, and then hide them. Let’s walk through this:
    • The title bar ‘Sales Analysis’ on my mobile view is huge, so I want to make it smaller. I go back to my desktop and make a copy of my title bar with a smaller font size.
    • Next, I go over to my mobile layout, get rid of the large one and add the smaller one.
    • Here’s the problem: when I go back to the desktop, I want to get rid of that smaller-font title text box, but when I delete it, it disappears from the mobile view as well.
    • If I try to hide it in the desktop view, it will also hide it in my mobile view.
    • So, my trick is to hide that text box with the smaller font behind a larger text box in my desktop view. All I need to do is line up that mobile-only text box on top of the other one, go to Format and click Send to Back. You can also do this in the Selection pane on the right side by dragging the text box to the bottom of the list, which puts it behind everything else higher on the list.
  • Next, I want to create a visual that is more meaningful in my mobile view. In my demo, I create a copy of my bar chart visual and I filter it to show only the most recent 3 months of my data.
  • I’ll shrink that down a bit and hide it behind the larger bar chart on my desktop. That new 3-month bar chart will appear in my mobile visualizations palette, and I can pull it over onto the canvas.
  • When I save and publish this, I’ve created one report with two experiences: users who view it from the desktop get the desktop view, while users who view it from their phone see the desktop view in horizontal mode and the mobile view in vertical mode.

When I’m on my phone and open the Power BI app, I see two different icons. The mobile version I created and published has a different icon with a phone on it. This way end users can quickly identify the version that is intended for mobile use. I can rotate my phone horizontally and see the desktop version, but the goal of creating a mobile view of reports is to have distinct data (in my case, the most recent 3 months) available for a quick update.

I can also add a quick Siri shortcut by simply telling Siri that when I say ‘open mobile sales report’, my phone should go right to that mobile report. These tips should be super helpful when creating Power BI mobile reports.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Chris Silvernail

Avoiding Issues When Generating Time/Date in Power BI

Have you ever run into hurdles when trying to generate time and date in Power BI? I know I have, so I’m here to help and to demonstrate the differences between generating time and date in Power BI Desktop vs in the Power BI Service.

A few key points before I dig in:

When we generate time and date in Power BI Desktop using the native Power BI tools, it pulls that information from our local computer settings.

In the Power BI Service, the Service generally runs in a different server time zone than your local machine. Therefore, there will be some differences between the times as they appear in the Desktop file and how they appear in the Service.

Currently, there is no native functionality to handle Daylight Saving Time changes. Keep this in mind when you start to generate your own times and dates – you’ll need to create tools that take this into account.

What I’ve done is much easier to see by watching my video than to explain in text. Here’s a brief overview of the video.

  • In my video, you’ll see that I went into Power Query and copied the different queries that I used to generate time and date.
  • The first one generates time and date in Power Query using DateTimeZone.LocalNow to pull the local time. Remember, when operating in Power BI Desktop, this pulls information from my local machine.
  • What I’m going to do is reuse some of the formulas and look at how we can switch the time and what those changes look like as I go through the queries I’ve highlighted.
  • In my demo, you’ll see that rather than publishing this file and jumping into the Power BI Service to look at it, I’ve taken the queries that I used to generate my times, put them in the Service as a dataflow, and then imported that dataflow into a Power BI Desktop file.
  • So, I have two pages: one shows the information from the dataflow and the other shows the information generated locally. This way I can show side by side what the changes will look like.
  • I’ll walk you through the steps of these scenarios and the queries I used (a consolidated sketch follows this list):
    • Manually switching time using #duration
    • Modifying date/time to reflect Eastern Daylight Time (UTC-4) using SwitchZone
    • Modifying time/date in Power Query using SwitchZone and DateTimeZone.FixedLocalNow (the time at the start of query execution)
    • Adjusting time -4 or -5 based on daylight saving calendar
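
Here is a minimal Power Query (M) sketch of those approaches, assuming Eastern time. The step names are illustrative, and the daylight saving check is deliberately simplified to whole months; a production version should key off the actual DST transition dates:

    let
        // 1. Local time where the query evaluates (your PC in Desktop, a server in the Service)
        LocalNow = DateTimeZone.LocalNow(),
        // 2. Manually shift the time with #duration(days, hours, minutes, seconds)
        ShiftedManually = LocalNow - #duration(0, 4, 0, 0),
        // 3. Normalize to a fixed offset regardless of where the query runs;
        //    FixedLocalNow is frozen at the start of query execution
        EasternFixed = DateTimeZone.SwitchZone(DateTimeZone.FixedLocalNow(), -4),
        // 4. Choose -4 or -5 from a simplified daylight saving rule,
        //    then switch the UTC time to that offset
        UtcNow = DateTimeZone.UtcNow(),
        Offset = if Date.Month(UtcNow) >= 4 and Date.Month(UtcNow) <= 10 then -4 else -5,
        EasternAdjusted = DateTimeZone.SwitchZone(UtcNow, Offset)
    in
        EasternAdjusted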

In summary, this post covers the changes to date/time as we move from Power BI Desktop to the Service and which formulas are best to use when generating our own times. The key takeaway: SwitchZone is the best way to adjust between the Desktop and the Service.

I think you’ll find this helpful when running into date/time hurdles in Power BI.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Steve Wise

What is Delta Lake in Databricks?

If you’re not familiar with Delta Lake in Databricks, I’ll cover what you need to know here. Delta Lake is a technology developed by the same team that created Apache Spark. It’s designed to bring reliability to your data lakes, providing ACID transactions and scalable metadata handling and unifying streaming and batch data processing.

Let’s begin with some of the challenges of data lakes:

  • Data lakes are notoriously messy as everything gets dumped there. Sometimes, we may not have a rhyme or reason for dumping data there; we may be thinking we’ll need it at some later date.
  • Much of this mess is because your data lake will have a lot of small files and different data types. Because there are many small files that are not compacted, trying to read them in any shape or form is difficult, if not impossible.
  • Data lakes often contain bad data or corrupted data files so you can’t analyze them unless you go back and pretty much start over again.

This is where Delta Lake comes to the rescue! It delivers an open-source storage layer that brings ACID transactions to Apache Spark big data workloads. So, instead of the mess I described above, you have a layer from Delta Lake over your data lake. Delta Lake provides ACID transactions through a log that is associated with each Delta table created in your data lake. This log records the history of everything that was ever done to that table or dataset, giving you high levels of reliability and stability in your data lake.

Key Features of Delta Lake are:

  • ACID Transactions (Atomicity, Consistency, Isolation, Durability) – With Delta you don’t need to write any code; transactions are written to the log automatically. This transaction log is the key, and it represents a single source of truth.
  • Scalable Metadata Handling – Handles terabytes or even petabytes of data with ease. Metadata is stored just like data, and you can display it using a syntax feature called Describe Detail, which describes all the metadata associated with the table. This puts the full force of Spark behind your metadata.
  • Unified Batch & Streaming – There is no longer a need for separate architectures for reading a stream of data versus a batch of data, so Delta overcomes the limitations of streaming and batch systems. A Delta Lake table is both a batch and streaming source and sink. You can do concurrent streaming or batch writes to your table and it all gets logged, so it’s safe and sound in your Delta table.
  • Schema Enforcement – this is what makes Delta strong in this space. If you put a schema on a Delta table and try to write data that does not conform to that schema, Delta gives you an error and refuses the write, preventing bad writes. The enforcement methodology reads the schema as part of the metadata; it looks at every column, data type, etc. and ensures that what you’re writing to the Delta table matches what the schema says the table should contain – no need to worry about writing bad data to your table.
  • Time Travel (Data Versioning) – you can query an older snapshot of your data, giving you data versioning, rollbacks, and audits.
  • Upserts and Deletes – these operations are typically hard to do without something like Delta. Delta lets you do upserts, or merges, very easily. Merges work like SQL MERGE statements against your Delta table: you can merge data from another data frame into your table and do updates, inserts, and deletes. You can also do a regular update or delete with a predicate on a table – something that was almost unheard of before Delta. (A short PySpark sketch of several of these features follows this list.)
  • 100% Compatible with Apache Spark
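
As a rough illustration, here is a minimal PySpark sketch touching several of these features. It assumes a Databricks notebook, where spark is predefined; the path, table contents, and column names are made up:

    from delta.tables import DeltaTable

    path = "/mnt/lake/accounts"  # hypothetical Delta table location

    # Write a DataFrame as a Delta table; the transaction log is created automatically
    df = spark.createDataFrame([(1, "open"), (2, "closed")], ["id", "status"])
    df.write.format("delta").mode("overwrite").save(path)

    # Scalable metadata handling: Describe Detail shows the table's metadata
    spark.sql(f"DESCRIBE DETAIL delta.`{path}`").show()

    # Time travel: read an older snapshot of the table by version number
    v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)

    # Upserts: merge another DataFrame into the table
    updates = spark.createDataFrame([(2, "reopened"), (3, "open")], ["id", "status"])
    tbl = DeltaTable.forPath(spark, path)
    (tbl.alias("t")
        .merge(updates.alias("u"), "t.id = u.id")
        .whenMatchedUpdateAll()       # update rows whose id already exists
        .whenNotMatchedInsertAll()    # insert rows that are new
        .execute())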

Delta Lake is really a game changer, and I hope you’ll educate yourself more and start using it in your organization. You’ll find a great training resource from the Databricks community at: https://academy.databricks.com/category/self-paced

Or reach out to us at 3Cloud. Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected].

Brian Custer

Azure Synapse Analytics Now in GA and the Public Preview of Azure Purview

I’m here with some exciting news from Microsoft! Last week at a digital conference, Satya Nadella announced the general availability of Azure Synapse Analytics and the preview of Azure Purview, a unified data governance service. Azure Synapse Analytics has been gaining traction while in preview, and adding Azure Purview gives businesses the ability to get the most out of their data and analytics.


Let’s talk about Azure Purview. This is a comprehensive data governance service that helps organizations discover all data across the organization. Demos at the digital conference showcased different ways you can use Purview for governance. Some key capabilities are the ability to go multi-cloud (not only Azure, but other clouds as well) and to connect to both your on-prem environment and your Azure data assets.

For quite some time, those of us in the data disciplines have worked to inventory all the different aspects of data, like column, database and table names, etc., and put all those pieces into a common repository, often referred to as a data dictionary. Microsoft has been working for years to create a product that would be comprehensive enough to help most people with their governance and compliance needs. We’ve now got this with Azure Purview.

Some key highlights pointed out are:

  • A business glossary – no need to manually build a data dictionary.
  • Automated data classification – lets you know things like data type (a Social Security number, for instance). You also have custom options and can schedule future scanning and classification on a routine basis. This way you’re getting continual updates, as opposed to a data dictionary, where you get a snapshot in time unless you manually update it.
  • Cloud-based search facility – gives you the ability to find things quickly and easily across a broad series of data assets.
  • Data lineage and reporting – supports the end-to-end data lifecycle.
  • Power BI facilities

I feel Azure Purview is a very strong offering. Without it, I would have had to either create my own versions of these pieces or use something like Embarcadero, which I used years ago. Another thing to note is that the experience is very similar to the canvas workspace experience in Azure Synapse Analytics, so if you’ve been working with that, it will feel very familiar.

The next part of Microsoft’s announcement is that Azure Synapse Analytics is now generally available. Azure Synapse Analytics is a limitless analytics service that brings together traditional data warehousing and big data analytics in one offering. It provides a unified experience to ingest, prepare, manage, and serve data for immediate machine learning and BI applications. I, and many of our customers, have been using this great product a lot, so its going GA is surely exciting news.

Some noteworthy things with Azure Synapse Analytics are:

  • A new native cloud distributed SQL engine
  • Deep integration with Spark
  • Flexible query options such as serverless and dedicated
  • Integration with Power BI and machine learning
  • TPC-H benchmark at petabyte scale
  • Native Row Level Security (this is not possible with Amazon Redshift or Google BigQuery)
  • Native ML integration for the citizen data scientist
  • Code management – by that, they’re talking about Azure DevOps as another piece that plays well with it.
  • Power BI integration with Teams, which I found to be kind of cool

Again, great announcements with both the general availability of Azure Synapse Analytics and the public preview of Azure Purview. These two products combined empower teams to remove data silos and leverage all their data for analytics and data governance.

Need further help with these or any Azure product or service? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Rowland Gosling

Setting Variables Using DAX in Power BI

Welcome to another Azure Every Day focused on one of my favorite topics, using DAX in Power BI. In a previous post, I covered a couple of my favorite DAX formulas, CALCULATE and FILTER. Here, I’ll discuss variables and how to work them into CALCULATE/FILTER to expand on that and make it even more powerful. You may want to check out my previous post here, as I’ll be continuing with that demo.

  • I started by building a CALCULATE/FILTER expression in a table to calculate the beginning balance for 2017 for all my assets.
  • My code (see my video demo for code detail) calculates the sum of the beginning balance, filtering the table where fiscal year equals 2017 and finance type equals assets.
  • Now, let’s say we want to know the assets for every year, not just 2017. To do this, we need to set the year into a variable and then it will calculate that asset for each individual year.
  • You use the VAR keyword to set the variable in the code, and you’ll need to give it a name. In my case I’ll use YR for year and set it equal to the fiscal year. It’s important to note that after you set your variables, you need a RETURN before the expression that uses them.
  • Next, I’m going to update my FILTER. My code calculates the sum of the beginning balance and filters the table where fiscal year equals 2017, but now I want to take that out and change it to fiscal year equals my YR variable.
  • How this works: as this goes through each row, it calculates for each year by using the variable instead of hard-coding the year.
  • So previously we had only one outcome, for 2017; now when we submit this, we’ll see four outcomes, as we have four years’ worth of data, so we’re getting a beginning-balance calculation for each year.
  • We can even step this up a bit if we wanted not only the beginning balance for each year, but also for each finance type. Maybe we don’t just want the assets but other values like equity, expense or liability.
  • All we need to do is set another variable that I’ll call FT for finance type. And instead of doing this where finance type equals asset, we’ll say where finance type equals our variable (see the sketch after this list).
  • Now we’ll have the calculation for every year for every individual finance type.
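
Here is a minimal DAX sketch of the finished calculated column, assuming a table named Balances with Fiscal Year, Finance Type, and Beginning Balance columns; the names are illustrative, and the exact code is in the video:

    Beginning Balance by Year and Type =
    VAR YR = Balances[Fiscal Year]       -- the current row's fiscal year
    VAR FT = Balances[Finance Type]      -- the current row's finance type
    RETURN
        CALCULATE (
            SUM ( Balances[Beginning Balance] ),
            FILTER (
                ALL ( Balances ),
                Balances[Fiscal Year] = YR
                    && Balances[Finance Type] = FT
            )
        )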

I hope this quick example helps you to start using these DAX formulas in your reports. The CALCULATE/FILTER functions and using variables are something I use all the time in Power BI.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Alex Beech

DAX CALCULATE and FILTER Functions in Power BI

We all know Power BI is a powerful analytics tool, and it’s easy to create a new Power BI Desktop file, import some data into it, and create reports that show valuable insight. Adding Data Analysis Expressions, better known as DAX, enables you to get the most out of your data.

If you’re new to Power BI or DAX, DAX is a collection of formulas, or expressions, that are used to calculate and return one or more values. DAX helps you use data already in your model and create new information from it that can solve real business problems.

When I first started using DAX functions, it brought my Power BI skills to the next level. I was able to tackle some analytical needs that I had struggled with in the past. I’m here to share a couple of favorite formulas that I use all the time: the CALCULATE function and the FILTER function. Please be sure to watch the video included in this post as I walk through using these DAX formulas.
  • In my demo, I’m working with a data set to find the beginning balance for 2017 for our assets.
  • To do that, I need to sum a column in my table called beginning balance when fiscal year equals 2017 and when finance type equals asset.
  • I’ll do this by using a combination of the CALCULATE function and the FILTER function. The CALCULATE function lets you evaluate an expression, such as a sum, over a table.
  • In my code, I’m going to CALCULATE the sum of our beginning balance. On its own, this would calculate the sum for the entire table.
  • But we only want the sum for 2017 and for the asset finance type. For this, once we have the calculation, we need to filter the table. Think of the FILTER function as building a virtual table in the background.
  • We need to FILTER it where fiscal year equals 2017 and where finance type equals asset. In my code, I add FILTER, tell it which table we are filtering (in my case, the balance table), and then add where fiscal year equals 2017 and where finance type equals asset.
  • Using these DAX functions, our result shows the beginning balance for our assets for 2017.
  • My video shows you exactly how to write the code I used here, so be sure to check it out; a sketch of the expression follows this list.
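
As a rough sketch, assuming a table named Balances with Fiscal Year, Finance Type, and Beginning Balance columns (the names are illustrative; the video shows the exact code), the expression looks something like this:

    Assets Beginning Balance 2017 =
    CALCULATE (
        SUM ( Balances[Beginning Balance] ),      -- the expression to evaluate
        FILTER (
            Balances,                             -- the table being filtered
            Balances[Fiscal Year] = 2017
                && Balances[Finance Type] = "Asset"
        )
    )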

As you can see, this is super simple, and the formula gives you flexibility in how you write it. You can FILTER tables in many ways and use different functions within CALCULATE. I hope you enjoyed this simple use case of these powerful DAX functions in Power BI.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected]

Alex Beech

Create a Custom Visual in Power BI with React

Welcome to another edition of Azure Every Day! I’m an App Development Consultant at 3Cloud and in this post I’ll show you how to create a custom visual in Power BI with React.
When creating a custom visual in Power BI, you use the TypeScript language. You’ll be fine with TypeScript if you are familiar with JavaScript. Once you’ve created a custom visual from scratch, you can test and debug your visuals online using live reports, as well as package and deploy your visual to a production report. You can check out the Microsoft documentation about custom visuals at https://powerbi.microsoft.com/en-us/developers/custom-visualization/ to learn more.
I’ll walk you through the process here, but also be sure to watch my video included at the end of this post.
  • To get started, be sure you have Node.js installed on your machine.
  • Navigate to your project directory and run npm install for the Power BI visual tools package (powerbi-visuals-tools). Once the tools are installed, run pbiviz new followed by your project name.
  • Next, run pbiviz start, which serves your custom visualization from localhost.
  • To debug your visualization, go to app.powerbi.com, create a new report (or use an existing one), and add the developer visual to the report.
  • In order to add React elements to your custom visual, you’ll need to install React and ReactDOM. Import React and ReactDOM into the visual.ts file, or whatever file you will be using to render the HTML elements to the DOM.
  • Create a React element within visual.ts. Also create the DOM element that the React element will be rendered into and add it to the DOM.
  • Next, create the React component using React.createElement with the name of the component you’ve imported, then pass any props you would like in the second parameter of the React.createElement function.
  • Finally, you can add the React element to the DOM by using ReactDOM.render, referencing the React component that you built and the HTML element on the page that you would like to add the React element to.
  • When you’re finished building and debugging your project, you can set the project and author details in the pbiviz.json file.
  • Run the pbiviz package command to generate a pbiviz file to import into your Power BI report.
  • Now you can go into your report and import a visual from file. Select the visual, add the data you would like in your visual, and configure the settings you had previously.
  • Please check out my video for code detail on all the above steps; a brief sketch follows this list.
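
As a quick reference, the CLI steps look like this (MyReactVisual is just an example project name):

    npm install -g powerbi-visuals-tools
    pbiviz new MyReactVisual
    cd MyReactVisual
    pbiviz start

And here is a minimal TypeScript sketch of rendering a React component from the visual, assuming the standard pbiviz project scaffolding; the MyCard component and its props are hypothetical:

    import powerbi from "powerbi-visuals-api";
    import * as React from "react";
    import * as ReactDOM from "react-dom";
    import IVisual = powerbi.extensibility.visual.IVisual;
    import VisualConstructorOptions = powerbi.extensibility.visual.VisualConstructorOptions;
    import VisualUpdateOptions = powerbi.extensibility.visual.VisualUpdateOptions;
    import { MyCard } from "./myCard"; // hypothetical React component

    export class Visual implements IVisual {
        private target: HTMLElement;

        constructor(options: VisualConstructorOptions) {
            // The container element Power BI hands to the visual
            this.target = options.element;
        }

        public update(options: VisualUpdateOptions) {
            // Build the React element, passing props in the second parameter
            const card = React.createElement(MyCard, {
                width: options.viewport.width,
                height: options.viewport.height
            });
            // Render the React element into the visual's container
            ReactDOM.render(card, this.target);
        }
    }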

This post walked you through the steps to create a custom visual in Power BI using React. I hope you’ll give it a try.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected]

Tom Ward

How to Import Power BI Datasets

I’m here to talk about Power BI datasets. Currently, when you connect to a Power BI dataset in the Power BI Service, it will be in live connect mode, which limits what you can do with that data model. I’d like to tell you about leveraging existing Power BI datasets in a way that is not supported out of the box at this time.

In that live connect mode, you’ll be able to add additional measures, but you can’t add calculated columns or external data sources like when using a traditional Analysis Services model. Below is a workaround you can use to meet ad hoc reporting needs, letting you leverage existing data models for users who do not have edit permissions on the underlying dataset itself or on the data source.

  • The first step in leveraging an existing Power BI dataset is to pull it into Excel. This allows us to flatten out the data, de-normalize it within an Excel Pivot Table and use that table as a data source for Power BI.
  • If we want to pull that data back into Power BI and add additional data, we start with Analyze in Excel, which downloads a file for us.
  • Next, we can open our data connected Excel file. As I open this, it will be live connected to our data model. I can then generate a Pivot Table to flatten out the data that I’m interested in. In my demo, I bring in the total sales and a couple dimensions from the sales territory like sales territory country and sales territory group.
  • I also want to restrict this to a period of time, so I’ll bring in the calendar year. For this dataset, the last year of data is 2008, but that is only a partial year. I’ll bring in all of 2007, and I need to be sure I bring in enough granularity to see that 2008 is only a partial year, so I also bring in the month.
  • At this point I have a dataset that is de-normalized but not Power BI friendly yet. I will need to flatten it out into a simple table.
  • To flatten it, I’ll go into the Pivot Table design and turn off my subtotals first and then my grand totals. Under Report Layout, I click Show in Tabular Form and then Repeat All Item Labels.
  • Now, I have a table that is Power BI friendly. This way I don’t have to do a lot of manipulation to work with it.
  • The final step back in Power BI Desktop is to connect and import the Pivot Table that I created by using the Excel Connector. Then by pointing to the file that I just saved, I’m able to access the Pivot Table that is connected to our Power BI dataset.
  • The way that I formatted the Pivot Table allows us to see the data in a nice, clean table. But keep in mind that as I’m loading this data, it is not pulling a fresh copy of information from the Power BI Service directly. Instead, it’s pulling the static copy that exists in the Excel Pivot Table.
  • So, I can’t refresh the data from here. If I want to get an updated set of data from the Power BI Service, I will have to go back to the Excel Pivot Table that is connected to the Power BI dataset and refresh it from there. Once I hit refresh, it will bring in an updated copy of that information via the Excel Pivot Table.
  • However, the main reason I’ve done this is to do some ad hoc data modeling and reporting off that existing Power BI dataset. I can also import additional data elements from SQL Server or bring in additional Excel Pivot Tables and join that data to this ad hoc model to meet my immediate reporting needs.

This simple workaround gives you some flexibility in leveraging your existing Power BI datasets. In the future, I believe this type of workaround will be unnecessary, as we’ll be able to import existing Power BI datasets and add other elements to them natively out of the box. But until then, this workaround will allow you to get more out of what you have already built.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Jeremy Black

A Success Story Using Auto ML in Power BI Premium

Azure Machine Learning and Power BI are two of the most influential tools out there. I’d like to share a story about how a colleague and I recently completed a project for a public school district in Georgia using these powerful tools.

The purpose of this project was to integrate machine learning and predictive analytics: to feed in inputs that existed in their student data, such as classes taken, attendance, and grades, and use this information to predict the likelihood of high school graduation. The district could then use this to proactively guide students and give them the help they needed.

We used the Auto ML feature in Power BI Premium. My colleague, Brian Custer, oversaw the machine learning model design. Auto ML is a straightforward process in Power BI. All that was needed was to create a dataset from the information we wanted to look at that might have an impact on graduation, and then put it through the Auto ML wizard. Once we did that, out came a model that was trained and ready to go for prediction.

Machine learning models exist in several different forms and leverage the capabilities that are built into Azure – we have Azure Machine Learning, Python and R, integrations with SQL Server, as well as Auto ML and other features within Power BI.

So, what exactly is Auto ML in Power BI? It’s a wizard-driven machine learning interface that works off dataflows. You start with a dataflow, which must contain your training and prediction entities. This is passed to an Auto ML model that you choose, and then it automatically trains the model for you and delivers predictions.

This is a very simple pipeline to set up. In our case we simply hooked up to the data that was in Azure Data Lake Gen2 storage and brought it into some SQL tables.

The school district we created this for couldn’t have been happier with the results. They are currently using this machine learning model to predict the probability of high school graduation and ensure that students at risk of not graduating get the help they need to succeed.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or  [email protected].

Paul Turley