Sometimes I get so involved in my repeatable processes and project management that I forget to look up. Such was the case with the December 2018 release of the ability to parameterize linked services. I could not roll back and rework all the ADF components this impacted that had already gone to production, but hooray! Moving forward, we can have one linked service per type, one dataset per linked service, and one pipeline per ingestion pattern. How sweet is that? Au revoir to the days of one SSIS package per table destination.
This is blog post 3 of 3 on using parameters in Azure Data Factory (ADF). Blog post #1 was about parameterizing dates and incremental loads. Blog post #2 was about parameterizing table names and using a single pipeline to stage all tables in a source. Today I am talking about parameterizing linked services.
Disclaimer: Not all linked service types can be parameterized at this time. This feature only applies to eight data stores:
Azure SQL Database
Azure SQL Data Warehouse
Azure Database for MySQL
Concept Explained: The concept is pretty straightforward. If our goal is to have one linked service per type, a parameterized Azure SQL Database linked service might look like this:
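As a minimal sketch (the linked service name and parameter names here are illustrative, not prescriptive), the JSON definition declares the parameters and then references them in the connection string with the `@{linkedService().ParameterName}` expression syntax:

```json
{
    "name": "LS_AzureSqlDb_Generic",
    "properties": {
        "type": "AzureSqlDatabase",
        "parameters": {
            "ServerName": { "type": "String" },
            "DatabaseName": { "type": "String" }
        },
        "typeProperties": {
            "connectionString": "Server=tcp:@{linkedService().ServerName}.database.windows.net,1433;Database=@{linkedService().DatabaseName};"
        }
    }
}
```

Any dataset that uses this linked service then supplies values for `ServerName` and `DatabaseName`, so one linked service can point at any Azure SQL Database in your environment.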