By migrating the customer from Azure Data Lake Storage Gen1 to Azure Data Lake Storage Gen2, 3Cloud, formerly Pragmatic Works, helped them cut monthly storage costs by 40%, a savings of over $28,000 per month.
If you have been in the BI industry as long as I have, then you know of Donald Farmer. He has been one of my heroes as a leader in the business intelligence and analytics field for the past two decades.
Many people ask the question: which is right for my company, an on-premises or cloud modern data warehouse? In a recent webinar, David Hay teaches you what you should look at to make the best decision for your organization between a cloud and an on-premises modern data warehouse, as well as what the implementation process entails to stay within time and budget.
I have been on a long-term Azure for BI implementation and took some time today for a “what went right”, “what went wrong” and “what I’ll do differently” introspection. Giving the outcome a bit of a positive twist, I’ll try not to repeat what I’ve already shared in Data Architecture for BI Programs and Transitioning from Traditional to Azure Data Architectures. I actually thought I’d write about parameterized shared ADF datasets, or using Python to auto-generate transform views and load stored procedures. Instead, it appears that my fingers have reverted to tool-agnostic critical success points. (At least I know what I want to write about next!) My tool-agnostic thoughts:
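The “using Python to auto-generate transform views” idea mentioned above can be sketched roughly like this. Everything here is hypothetical (schema names, table names, and columns are illustrative, not from any real project), but it shows the basic pattern of emitting view DDL from a small metadata dictionary:

```python
# Sketch: auto-generate simple transform-view DDL from table metadata.
# All schema, table, and column names below are hypothetical examples.

def make_transform_view(schema: str, table: str, columns: list) -> str:
    """Emit a CREATE VIEW statement that projects the listed columns
    from a staging table into a transform-layer view."""
    col_list = ",\n    ".join(columns)
    return (
        f"CREATE VIEW tfm.vw_{table} AS\n"
        f"SELECT\n    {col_list}\n"
        f"FROM {schema}.{table};"
    )

# Example metadata a generator might read from a config file
# or from INFORMATION_SCHEMA at runtime.
tables = {
    "Customer": ["CustomerKey", "CustomerName", "Region"],
    "Sales":    ["SalesKey", "CustomerKey", "SaleAmount", "SaleDate"],
}

if __name__ == "__main__":
    for tbl, cols in tables.items():
        print(make_transform_view("stg", tbl, cols))
        print()
```

In practice you would drive the `tables` dictionary from source metadata rather than hard-coding it, so adding a new source table means adding metadata, not writing new DDL by hand.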
Go Get Your First Customer! Here is a movie quote for you: “Build it and they will come”. This thought process should be sent straight to Hades. Data architecture and analytics is not a Field of Dreams. Among other things, having an actual customer will result in a finished product that at least somebody wants. The best advertisement is word of mouth, and it is easier to sell a product that has at least one happy customer. It amazes me to see how many companies make this mistake. If you do not have one already, go get your first customer and put them on your design team. Now, how easy was that?!
Associate Each Data Architectural Decision with a Business Value In my most recent project, my colleague, Joshuha Owen, and I put together a slide deck affiliating business value with data architectural decisions. There were several slides that looked similar to this one for business users:
Data Governance – a hot topic on everyone’s mind. Working on a BI project in New York’s financial district has me thinking about data governance and how it’s something we struggle with in the BI world.
A Bit of Intro If I recall correctly, I completed the first version of this data architecture diagram in 2012, when we used terms like “road map” and “blueprint”. Back then, along with different terms, we were also using traditional SSIS, SSAS-MultiD and SSRS tools. Now we live in the world of cloud everything, although we are still driving from SRC-to-DST (source to destination). I’m up for whatever terminology you want to use, but can we agree that we are surely on a different highway? For my classical BI Blueprint, click here, but to see an Azure road map for BI, please take a look below.
Disclaimer: I create a different diagram for every engagement, so think of this as a suggestion, not a mold.
Unless you’ve been living under a rock, you’ve heard of Power BI. In case you don’t know, Power BI is a business analytics solution from Microsoft that lets you visualize your data and share insights across your organization or embed them in your app or website. You can connect to hundreds of data sources and bring your data to life with live dashboards and reports.
Confession: I put a lot of subtext in this blog post in an attempt to catch how people may be describing their move from SSIS to ADF, from SQL DBs to SQL DWs, or from scheduled to event-based data ingestion. The purpose of this post is to give you a visual picture of how our well-loved “traditional” tools of on-prem SQL Databases, SSIS, SSAS and SSRS are being replaced by the Azure tool stack. If you are moving from “Traditional Microsoft” to “Azure Microsoft” and need a road map, this post is for you.
Summary of the Matter: If you only read one thing, please read this: transitioning to Azure is absolutely “doable”, but do not let anyone sell you “lift and shift”. Azure data architecture is a new way of thinking. Decide to think differently.
First Determine Added Value: Below are snippets from a slide deck I shared during Pragmatic Works’ 2018 Azure Data Week. (You can still sign up for the minimal cost of $29 and watch all 40 recorded sessions, just click here.) However, before we begin, let’s have a little chat. Why in the world would anyone take on an Azure migration if their on-prem SQL database(s) and SSIS packages are humming along with optimum efficiency? The first five reasons given below are my personal favorites.
Cost (scale up, scale down)
Event Based File Ingestion
File based history (SCD2 equivalent but in your Azure Data Lake)
Support for Near Real Time Requirements
Support for Unstructured Data
Large Data Volumes
Offset Limited Local IT Resources
Data Science Capabilities
Development Time to Production
Support for large audiences
Each of the reasons given above is a minimum one-hour working session on its own, but I’m sharing my thoughts in brief in an effort to help you get started compiling your own list. Please also look at the following diagram (Figure 1) and note two things: a.) the coinciding “traditional” components and b.) the value add boxed in red.
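To make one of the list items above concrete: “file based history (SCD2 equivalent)” usually means landing each extract in a date-partitioned folder path so every version of a record is preserved as its own immutable file. A minimal sketch of the path-building idea (the zone, source, and entity names are hypothetical, not a prescribed lake layout):

```python
from datetime import date

def history_path(zone: str, source: str, entity: str, load_date: date) -> str:
    """Build a date-partitioned data-lake path so each daily extract is
    kept as its own immutable file, giving SCD2-like history for free."""
    return (
        f"{zone}/{source}/{entity}/"
        f"{load_date:%Y}/{load_date:%m}/{load_date:%d}/"
        f"{entity}_{load_date:%Y%m%d}.parquet"
    )

# Example: where an October 15, 2018 customer extract would land
print(history_path("raw", "crm", "customer", date(2018, 10, 15)))
# -> raw/crm/customer/2018/10/15/customer_20181015.parquet
```

Because yesterday’s file is never overwritten, reconstructing “what did this record look like on date X” becomes a matter of reading the right folder rather than maintaining effective-date columns in a warehouse table.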