Azure

Azure Data Factory 101

Are you just starting out with Azure Data Factory? In this post, I’ll give you an introduction to Azure Data Factory, covering what it is, how it works and how to set it up. The video included in this post offers a short demo of how to create and access the platform.

What is Data Factory?

  • Here is a clear definition from Cloud Academy: Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation.

How does Data Factory work?

  • The key components of Data Factory are pipelines, activities, and datasets.
  • Pipelines are made up of activities. There are 3 types of activities:
    • Movement – the copy activity
    • Transformation – including Azure Functions, HDInsight, Stored Procedures, Spark and Databricks
    • Control – ForEachLoops, If Condition Branching, Wait Times, and Validations
  • Datasets represent the inputs and the outputs of the activities.
  • Linked Services – these are the connection strings and authentication for all types of sources for the data sets.
  • Data Flows – are the results of the datasets where you can apply logic and transform the data.
    • Mapping Data Flows are graphical with drag and drop functionality.
    • Wrangling Data Flows are more like using Power Query or M.
  • Integration Runtime – allows you to do data integration across different network environments. There are three types of runtimes: Azure, Self-hosted, and Azure-SSIS. Where the data you need to copy resides determines which of these is appropriate for your use case.
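To make these components concrete, here is a hedged, minimal sketch of what a pipeline definition with a single copy activity looks like in Data Factory’s JSON format (the pipeline and dataset names are hypothetical; each dataset would in turn reference a linked service):

```json
{
  "name": "CopySalesDataPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopySalesData",
        "type": "Copy",
        "inputs": [ { "referenceName": "BlobSalesDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SqlSalesDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

Notice how the pieces line up with the list above: the pipeline contains a movement activity (Copy), and the activity’s inputs and outputs are dataset references.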

In the video below, I provide a brief walkthrough of how to access and create in Azure Data Factory. Please check it out, as I think it is a good resource for those just starting out.

Our Azure Every Day series is another great resource. 3Cloud consultants post weekly blogs and videos around all things Azure. You’ll find all current and past blogs on our website or by clicking here.

Need further help? Our expert team and solution offerings can help your business with any Azure product or service, including Managed Services offerings. Contact us at 888-8AZURE or [email protected].

Leslie Andrews

Tabular Modeling & Calculation Techniques Beyond the Basics in Power BI

Do you want to learn how to address real-world issues with Power BI and DAX? In a recent webinar, Principal Consultant Paul Turley teaches you what you need to know. Paul covers many-to-many relationships, using disconnected tables, and takes a look at using composite models.

This presentation takes a deep dive into Tabular modeling and calculation and includes demos. Paul’s agenda covers:

  • Dimensional modeling basics (just a quick run through as this is a more advanced presentation)
  • Filter propagation in Tabular models
  • Effects of single and bi-directional relationships as you are creating relationships in your data model
  • Scenarios where data model relationships can or cannot be used to meet requirements or solve business problems
  • Dynamic measures built on a dimensional data model
  • A demo on DirectQuery in a Gen 1 composite model with Import mode
  • A first look at the new Gen 2 DirectQuery over Power BI datasets, the next generation of composite model capabilities, now in public preview

All these topics and demos are grounded in a business requirement or business problem: Paul sets up a business problem and works through how it can be solved with dimensional modeling.

So, if you’re looking to learn about using Power BI and DAX to address real-world business issues, then this webinar is for you. You can watch the complete webinar below.

Be sure to join us each week for our free webinars that happen every Tuesday at 11 AM ET. Our webinars cover a wide range of Azure topics and are presented by industry experts. Check out what webinars are coming up on our events calendar.


3Cloud

A Few Tips When Creating Power BI Mobile Reports

Power BI is the best BI and reporting tool out there, and so is Power BI Mobile. Microsoft has made mobile access a first-class experience in the Power BI platform. Heavy lifting like authentication, data security and network connectivity is delivered automatically out of the box.

Just download the app and sign in once, then sit back and enjoy access to all your reports and dashboards. You’ll also get integration with your mobile device’s virtual assistant, so you can ask Siri for the sales report and she’ll pull it up for you to view.

Awesome stuff, right? But as a developer, you may not know how to deliver an optimized view of your analysis for mobile consumption. Here, I’ll walk you through creating a mobile version of your report. I’ll point out the major features, as well as some valuable tips and tricks.

  • In the Power BI Desktop view, you can look at the design for the mobile view by clicking View and then Mobile Layout. This gives me a blank canvas, and on the right side is a visualizations palette, which at this point only lists the visualizations from my desktop.
  • I can pull over one of my desktop visuals and add my report title and company logo, but honestly, it’s not that great. I want to change the way it looks or even create a totally different visual just for my mobile layout, but unfortunately, I’m stuck with the visuals that I have on my desktop.
  • To do this, here’s my first big tip for a workaround: you’ll need to create visuals, put them on your desktop version, and then hide them. Let’s walk through this:
    • The title bar ‘Sales Analysis’ on my mobile view is huge, so I want to make it smaller. I go back to my desktop and make a copy of my title bar with a smaller font size.
    • Next, I go over to my mobile layout, get rid of the large one and add the smaller one.
    • Here’s the problem: when I go back to the desktop, I want to get rid of that smaller-font title text box, but if I delete it, it disappears from the mobile view as well.
    • If I try to hide it in the desktop view, it will also hide it in my mobile view.
    • So, my trick is to hide that smaller-font text box behind a larger text box in my desktop view. All I need to do is line up that mobile-only text box on top of the other one, go to Format, and click Send to back. You can also do this in the Selection pane on the right side by dragging the text box to the bottom of the list, which puts it behind everything higher on the list.
  • Next, I want to create a visual that is more meaningful in my mobile view. In my demo, I create a copy of my bar chart visual and I filter it to show only the most recent 3 months of my data.
  • I’ll shrink that down a bit and hide it behind the larger bar chart on my desktop. That new 3-month bar chart will appear in my mobile visualizations palette, and I can pull it over onto the canvas.
  • When I save and publish this, I’ve created one report: users who view it from the desktop get the desktop view. Users who view it from their phone see the desktop view in horizontal mode but get the mobile view in vertical mode.

When I’m on my phone and open the Power BI app, I see two different icons. The mobile version I created and published has a different icon with a phone on it. This way end users can quickly identify the version that is intended for mobile use. I can rotate my phone horizontally and see the desktop version, but the goal of creating a mobile view of reports is to have distinct data (in my case, the most recent 3 months) available for a quick update.

I can also add a quick Siri shortcut: I simply tell Siri that when I say ‘open mobile sales report’, my phone should go right to that mobile report. These tips should be super helpful when creating Power BI mobile reports.


Chris Silvernail

Introduction to Azure Automation

Do you want to learn how to automate administrative tasks in your Azure environment? In a recent webinar, we discussed how to automate various processes in Azure using PowerShell, Runbooks, and Automation Modules. When you know these techniques, you’ll be able to do things such as automate the starting and stopping of services, even processing Analysis Services Models!

This demo-heavy presentation will explain and help you understand Azure Automation and Runbooks. It will show you how to create a PowerShell Runbook and discuss options for setting up a Runbook. The demo will take you through the steps, including testing and publishing sample Runbook code and running a Runbook job.

So, grab 40 minutes and watch this demo if you want to learn how to integrate automation within your Azure environment, so you can save time on routine administrative tasks and focus on more important things. You can watch the complete webinar below.

Our free webinars happen every Tuesday at 11 am EST and cover a variety of Azure topics that are presented by industry experts and consultants. Be sure to check out our monthly calendar for upcoming webinars.


3Cloud

Avoiding Issues When Generating Time/Date in Power BI

Have you ever run into hurdles when trying to generate time and date in Power BI? I know I have, so I’m here to help and to demonstrate the differences between generating time and date in Power BI Desktop vs in the Power BI Service.

A few key points before I dig in:

When we generate time and date in Power BI Desktop using the native Power BI tools, it pulls that information from our local computer settings.

In the Power BI Service, the Service generally exists in a different server time and date than your local machine. Therefore, there will be some differences between the times as they appear in the desktop file and how they appear in the Service.

Currently, there is no native functionality to handle Daylight Saving Time changes. Keep this in mind when you start to generate your own times and dates – you’ll need to create tools that take this into account.

What I’ve done is much easier to see by watching my video than to explain in text. Here’s a brief overview of my video.

  • In my video, you’ll see that I went into Power Query and copied the different queries that I used to generate time and date.
  • The first one generates time and date in Power Query using DateTimeZone.LocalNow to pull the local time. Remember, when operating in Power BI Desktop, this pulls information from my local machine.
  • What I’m going to do is reuse some of the formulas and look at how we can switch the time and what those changes look like as I go through the queries I’ve highlighted.
  • In my demo, you’ll see that rather than publishing this file and jumping into the Power BI Service to look at it, I’ve taken the queries I used to generate my times, put them in the Service as a dataflow, and then imported that dataflow into a Power BI Desktop file.
  • So, I have 2 pages: one shows the information from the dataflow and the other shows the information generated locally. This way I can show side by side what the changes are going to look like.
  • I’ll walk you through the steps of these scenarios and the queries I used:
    • Manually switching time using #duration
    • Modifying date/time to reflect Eastern Time (UTC-4) using SwitchZone
    • Modifying time/date in Power Query using SwitchZone and FixedLocalNow (time at start of query execution)
    • Adjusting time -4 or -5 based on daylight saving calendar
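The difference between a fixed offset and a daylight-saving-aware adjustment is the heart of these scenarios. As an illustrative analogue (this is Python with the standard library, not the Power Query M from the demo), a named time zone flips between -4 and -5 automatically, while a fixed shift does not:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

utc_summer = datetime(2021, 7, 1, 12, 0, tzinfo=timezone.utc)
utc_winter = datetime(2021, 1, 1, 12, 0, tzinfo=timezone.utc)

# Fixed shift (like adding a #duration of -4 hours in M): the clock moves,
# but the offset is wrong for half the year because DST is ignored.
winter_fixed = utc_winter + timedelta(hours=-4)

# Named zone (what SwitchZone plus a daylight saving calendar achieves):
# the offset flips between UTC-4 and UTC-5 automatically.
eastern = ZoneInfo("America/New_York")
summer_local = utc_summer.astimezone(eastern)
winter_local = utc_winter.astimezone(eastern)

print(summer_local.utcoffset() == timedelta(hours=-4))  # True (EDT)
print(winter_local.utcoffset() == timedelta(hours=-5))  # True (EST)
```

The fixed shift gives the same clock reading year-round, which is exactly why the post builds a daylight saving calendar for the -4/-5 adjustment.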

In summary, this post covers the changes to date/time as we move from Power BI Desktop to the Service and which formulas are best to use when generating our own time. The key takeaway: use SwitchZone when adjusting between the Desktop and the Service.

I think you’ll find this helpful when running into date/time hurdles in Power BI.


Steve Wise

3Cloud Earns Microsoft’s Azure Advanced Specializations in Modernization of Web Applications, Windows Virtual Desktop & Windows Server and SQL Server Migration

3Cloud is proud to announce it has earned Microsoft Azure Advanced Specializations in Modernization of Web Applications, Windows Virtual Desktop and Windows Server and SQL Server Migration to Azure. By earning these advanced specializations, we are able to further differentiate ourselves and validate our deep knowledge, extensive experience, and expertise within these areas of Microsoft Azure. We are thrilled to have earned advanced specializations in the following areas to best meet the needs of our customers:

Modernization of Web Applications: Only partners that meet stringent criteria around customer success and staff skilling, as well as pass a third-party audit of their web workload deployment and management practices, including their ability to implement Azure App Service, are able to earn the Modernization of Web Applications in Microsoft Azure advanced specialization.

Windows Virtual Desktop: This is Microsoft’s newest advanced specialization. Windows Virtual Desktop has increased in demand as organizations across the globe are working remotely and many are transitioning to desktop-as-a-service to enable their employees. Partners must demonstrate extensive knowledge and experience with deploying, optimizing, and securing virtual desktop infrastructure on Azure.

Windows Server and SQL Server Migration to Microsoft Azure: This advanced specialization validates a solution partner’s experience, knowledge, and expertise in migrating Windows Server and SQL Server-based workloads to Azure. Partners must meet strict criteria around staff skilling and customer success, and pass a third-party audit of their migration practices.

As companies look to modernize and take full advantage of the benefits of the cloud, they are looking for a partner with advanced skills to migrate, optimize, and manage their environments. 3Cloud continually strives to be that partner in guiding clients on their Azure adoption journey.

Jim Hughes, VP, Solution Architecture at 3Cloud stated, “As we continue our mission in helping our customers achieve The Ultimate Azure Experience, we are excited to announce that 3Cloud has been awarded the Microsoft Windows Virtual Desktop Advanced Specialization. With this award, we became the first US Microsoft Partner to achieve this goal. In addition, 3Cloud has also earned Advanced Specializations in Modernization of Web Applications and Windows Server and SQL Server Migration. I’m excited and proud of these accomplishments and to be part of an incredible team that provides our clients with top-notch guidance in their journey to Azure.”

3Cloud guides clients on their journey to the cloud by providing the Ultimate Azure Experience. Our experience, reputation for excellence, and on-going support define us as a leader in the Azure ecosystem. Please reach out if you have questions about any of our solution offerings or would like to discuss how we help your business at every stage of your Azure journey at [email protected] or call (888) 88-AZURE. Learn more about Microsoft Azure Advanced Specializations in the link below.

https://aka.ms/aaspartners

 

3Cloud

What is Delta Lake in Databricks?

If you’re not familiar with Delta Lake in Databricks, I’ll cover what you need to know here. Delta Lake is a technology developed by the same developers as Apache Spark. It’s designed to bring reliability to your data lakes, providing ACID transactions and scalable metadata handling and unifying streaming and batch data processing.

Let’s begin with some of the challenges of data lakes:

  • Data lakes are notoriously messy as everything gets dumped there. Sometimes, we may not have a rhyme or reason for dumping data there; we may be thinking we’ll need it at some later date.
  • Much of this mess comes from your data lake holding a lot of small files of different data types. Because those many small files are not compacted, trying to read them in any shape or form is difficult, if not impossible.
  • Data lakes often contain bad data or corrupted data files so you can’t analyze them unless you go back and pretty much start over again.

This is where Delta Lake comes to the rescue! It delivers an open-source storage layer that brings ACID transactions to Apache Spark big data workloads. So, instead of the mess I described above, Delta Lake gives you a layer over your data lake. Delta Lake provides ACID transactions through a log associated with each Delta table created in your data lake. This log records the history of everything ever done to that table or data set, so you gain high levels of reliability and stability in your data lake.

Key Features of Delta Lake are:

  • ACID Transactions (Atomicity, Consistency, Isolation, Durability) – With Delta you don’t need to write any code; transactions are written to the log automatically. This transaction log is the key, and it represents a single source of truth.
  • Scalable Metadata Handling – Handles terabytes or even petabytes of data with ease. Metadata is stored just like data, and you can display it using DESCRIBE DETAIL, which describes all the metadata associated with the table. This puts the full force of Spark against your metadata.
  • Unified Batch & Streaming – No longer a need for separate architectures for reading a stream of data versus a batch of data, so it overcomes the limitations of streaming and batch systems. A Delta Lake table is both a batch and a streaming source and sink. You can do concurrent streaming or batch writes to your table, and it all gets logged, so it’s safe and sound in your Delta table.
  • Schema Enforcement – this is what makes Delta strong in this space. If you put a schema on a Delta table and try to write data that does not conform to it, Delta raises an error and refuses the write, protecting you from bad writes. The enforcement reads the schema as part of the metadata; it checks every column, data type, etc. and ensures what you’re writing matches the table’s schema, so there’s no need to worry about writing bad data to your table.
  • Time Travel (Data Versioning) – you can query an older snapshot of your data, giving you data versioning for rollback and auditing.
  • Upserts and Deletes – these operations are typically hard to do without something like Delta. Delta allows you to do upserts or merges very easily. Merges are like SQL merges into your Delta table and you can merge data from another data frame into your table and do updates, inserts, and deletes. You can also do a regular update or delete of data with a predicate on a table – something that was almost unheard of before Delta.
  • 100% Compatible with Apache Spark
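To build intuition for the transaction log and time travel, here is a toy pure-Python sketch of the idea (this is not Delta Lake’s actual implementation, which stores JSON commit files alongside Parquet data, but the replay-the-log principle is the same):

```python
import json

class ToyDeltaTable:
    """Toy versioned table: every write appends a commit to an ordered
    log, and any historical version can be rebuilt by replaying the log
    (roughly the role of Delta's _delta_log of JSON commit files)."""

    def __init__(self):
        self.log = []  # one JSON commit entry per transaction

    def commit(self, op, rows):
        entry = {"version": len(self.log), "op": op, "rows": rows}
        self.log.append(json.dumps(entry))

    def snapshot(self, version=None):
        """Replay commits up to `version` -- this is time travel."""
        data = []
        for raw in self.log:
            c = json.loads(raw)
            if version is not None and c["version"] > version:
                break
            if c["op"] == "append":
                data.extend(c["rows"])
            elif c["op"] == "delete":
                data = [r for r in data if r not in c["rows"]]
        return data

table = ToyDeltaTable()
table.commit("append", [1, 2, 3])   # version 0
table.commit("append", [4])         # version 1
table.commit("delete", [2])         # version 2
print(table.snapshot())             # latest: [1, 3, 4]
print(table.snapshot(version=1))    # time travel: [1, 2, 3, 4]
```

In real Delta Lake, the same idea is exposed through options such as `versionAsOf` when reading a Delta table with Spark.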

Delta Lake is really a game changer and I hope you educate yourself more and start using it in your organization. You’ll find a great training resource from the Databricks community at: https://academy.databricks.com/category/self-paced

Or reach out to us at 3Cloud.

 

Brian Custer

Azure Synapse Analytics Now in GA and the Public Preview of Azure Purview

I’m here with some exciting news from Microsoft! Last week at a digital conference, Satya Nadella announced the general availability of Azure Synapse Analytics and the preview of Azure Purview, a unified data governance service. Azure Synapse Analytics has been gaining traction while in preview, and adding Azure Purview gives businesses the ability to get the most out of their data and analytics.


Let’s talk about Azure Purview. This is a comprehensive data governance service that helps organizations discover all data across the organization. Demos at the digital conference showcased different ways you can use Purview for governance. Some key capabilities include the ability to go multi-cloud: not only Azure, but other clouds as well. You can also connect your on-prem environment and your Azure data assets.

For quite some time, those of us in the data disciplines have worked to inventory all the different aspects of data, like column, database and table names, etc., and put all those pieces into a common repository, often referred to as a data dictionary. Microsoft has been working for years to create a product that would be comprehensive enough to help most people with their governance and compliance needs. We’ve now got this with Azure Purview.

Some key highlights pointed out are:

  • A business glossary – no need to manually build a data dictionary.
  • Automated data classification – allows you to know things like data type (Social Security number, for instance). You also have custom options and can schedule future scanning and classification on a routine basis. This way you’re getting continual updates, as opposed to a data dictionary, where you get a snapshot in time unless you manually update it.
  • Cloud-based search facility – gives you the ability to find things quickly and easily across a broad series of data assets.
  • Data lineage and reporting – supports the end-to-end data lifecycle.
  • Power BI facilities

I feel Azure Purview is a very strong offering. Without it, I would have had to either create my own versions of these pieces or use something like Embarcadero, which I used years ago. Another thing to note is that the experience is very similar to the canvas workspace experience in Azure Synapse Analytics, so if you’ve been working with that, it will feel very familiar.

The next part of Microsoft’s announcement is that Azure Synapse Analytics is now generally available. Azure Synapse Analytics is a limitless analytics service that brings together traditional data warehousing and big data analytics in one offering, providing a unified experience to ingest, prepare, manage, and serve data for immediate machine learning and BI applications. I, and many of our customers, have been using this great product a lot, so seeing it go GA is surely exciting news.

Some noteworthy things with Azure Synapse Analytics are:

  • A new native cloud distributed SQL engine
  • Deep integration with Spark
  • Flexible query options such as serverless and dedicated
  • Integration with Power BI and machine learning
  • TPC-H benchmark at petabyte scale
  • Native Row Level Security (this is not possible with Amazon Redshift or Google BigQuery)
  • Native ML integration for the citizen data scientist
  • Code management – by that they’re talking about Azure DevOps, another piece that plays well with it.
  • Power BI integration with Teams, which I found to be kind of cool

Again, great announcements with both the general availability of Azure Synapse Analytics and the public preview of Azure Purview. These two products combined empower teams to remove data silos and leverage all data for analytics and data governance.


Rowland Gosling

Getting Started with Custom Power BI Report Themes

Do you want to learn how to create visual consistency across your reports and eliminate the monotonous task of setting common visual properties? Power BI report themes are more than just a color palette for your reports. With report themes, you can apply design changes to your entire report. All the visuals in your report will use the colors and formatting of the theme you’ve selected as a default. You can make design changes like using corporate colors, changing icon sets or applying new default visual formatting.

In a recent webinar, 3Cloud Senior Consultant, Jeremy Black, takes you on a deep dive into Power BI report themes. Jeremy covers the two types of report themes: built-in and custom. With custom themes you can customize via Power BI Desktop and/or by manually manipulating the JSON theme file used by Power BI Desktop. He’ll also cover the benefits of creating custom report themes, as well as their limitations.

Lastly, the presentation delves into the Power BI report theme building blocks, including data and structural colors, text, and visual styles. Most of this webinar is spent on demos showing how to begin customizing your report themes.
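As a hedged illustration of those building blocks, a minimal custom theme JSON file might look like this (the color values are made-up examples; `dataColors` drives the palette applied to data, while `foreground`, `background`, and `tableAccent` are structural colors):

```json
{
  "name": "Corporate Theme",
  "dataColors": ["#0078D4", "#50E6FF", "#243A5E", "#6B2064"],
  "background": "#FFFFFF",
  "foreground": "#252423",
  "tableAccent": "#0078D4"
}
```

Saving a file like this and importing it via Power BI Desktop’s theme options applies the colors as defaults across every visual in the report.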

You can watch the complete webinar below.


So, if you’re looking to create visual consistency in your reports without spending tons of time setting common visual properties for your reports, this webinar is for you. Join us for our free weekly webinars covering a variety of Azure topics every Tuesday at 11 a.m. ET. Check out our events calendar for upcoming topics.


3Cloud

Azure Synapse 101

 

Do you want to learn how to bring together enterprise data warehousing and big data analytics? In a recent webinar, 3Cloud consultant Mike Donnelly explores the components included with an Azure Synapse Workspace. This presentation goes through all the various pieces of Synapse, what it is and what it can do, as well as why it will see more and more use as people build modern data warehouses.

When you search for Azure Synapse Analytics, you’ll see there is Azure Synapse Analytics (formerly known as Azure SQL DW), which has been around for a bit as a data warehouse solution. But you will also find Azure Synapse Analytics Workspaces Preview. This preview resource is the focus here, and many of the pieces it includes will be covered (and demoed), including:

  • SQL Pools (formerly known as SQL Data Warehouse)
  • Built-in SQL Pool (aka SQL On-Demand/Serverless)
  • Azure Data Lake Storage (Gen2)
  • Spark Compute Pools (not Azure Databricks)
  • Integrate (Azure Data Factory)

If you’re interested in learning more about Azure Synapse Analytics and why it will become the tool of choice for building modern data warehouses, this webinar is for you. Watch the complete webinar below.

Be sure to watch for our free weekly webinars that happen every Tuesday at 11 AM EST. Check out our events calendar on our website https://www.3cloudsolutions.com/resources/events-calendar/


3Cloud