Overview

Implement a Data Analytics Solution with Azure Databricks is a 1-day intermediate course designed for data professionals who want to strengthen their skills in distributed data processing with Spark and Databricks on Azure. By the end of this learning path, you'll have built solid intermediate-to-advanced skills in both Databricks and Spark on Azure. You'll be able to ingest, transform, and analyse large-scale datasets using Spark DataFrames, Spark SQL, and PySpark, giving you confidence in working with distributed data processing. Within Databricks, you'll know how to navigate the workspace, manage clusters, and build and maintain Delta tables.

You'll also be capable of designing and running ETL pipelines, optimising Delta tables, managing schema changes, and applying data quality rules. In addition, you'll learn how to orchestrate workloads with Lakeflow Jobs and pipelines, enabling you to move from exploration to automated workflows. Finally, you'll gain familiarity with governance and security features, including Unity Catalog, Purview integration, and access management, preparing you to operate effectively in production-ready data environments.

Prerequisites

Participants should have:

  • Working knowledge of the fundamentals and syntax of Python and SQL, including Python scripting and SQL filter, aggregate, and join queries
  • A basic understanding of common file formats such as JSON, CSV, and Parquet
  • Familiarity with the Azure portal and foundational storage services
  • A general awareness of data concepts such as batch versus streaming processing and structured versus unstructured data

Target audience

This course is designed for professionals who are interested in working with the Databricks platform. It is well suited to aspiring or practising data analysts who have prior experience managing data but limited exposure to Databricks.

Delegates will learn how to

By the end of this course, learners will be able to:

  • Ingest, transform, and analyse large-scale datasets using Spark DataFrames, Spark SQL, and PySpark.
  • Navigate the Databricks workspace, manage clusters, and build and maintain Delta tables.
  • Design and run ETL pipelines, manage schema changes, apply data quality rules, and optimise Delta tables.
  • Orchestrate automated workflows using Lakeflow Jobs and pipelines.
  • Gain familiarity with Unity Catalog, Purview integration, and access management to work confidently in production-ready data environments.

Outline

Module 1: Explore Azure Databricks

Azure Databricks is a cloud service that provides a scalable platform for data analytics using Apache Spark.

  • Introduction
  • Get started with Azure Databricks
  • Identify Azure Databricks workloads
  • Understand key concepts
  • Data governance using Unity Catalog and Microsoft Purview
  • Exercise - Explore Azure Databricks
  • Module assessment
  • Summary
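
The Unity Catalog governance topics above centre on a three-level namespace (catalog.schema.table) and SQL-based access control. A minimal sketch of both ideas, using hypothetical names (`main`, `sales`, `orders`, and the group `analysts` are all illustrative):

```sql
-- Query a table through Unity Catalog's three-level namespace
SELECT * FROM main.sales.orders LIMIT 10;

-- Grant a group read access to the table (names are illustrative)
GRANT SELECT ON TABLE main.sales.orders TO `analysts`;
```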

Module 2: Perform Data Analysis with Azure Databricks

Learn how to perform data analysis using Azure Databricks. Explore various data ingestion methods and how to integrate data from sources like Azure Data Lake and Azure SQL Database. This module guides you through using collaborative notebooks to perform exploratory data analysis (EDA), so you can visualize, manipulate, and examine data to uncover patterns, anomalies, and correlations.

  • Introduction
  • Ingest data with Azure Databricks
  • Data exploration tools in Azure Databricks
  • Data analysis using DataFrame APIs
  • Exercise - Explore data with Azure Databricks
  • Module assessment
  • Summary
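
The exploratory analysis this module covers can be expressed with the DataFrame API or, equivalently, with Spark SQL. A hedged sketch of a typical EDA aggregation, assuming a hypothetical `orders` table with `region` and `amount` columns:

```sql
-- Summarise order volume and value by region to spot patterns and outliers
SELECT region,
       COUNT(*)              AS order_count,
       ROUND(AVG(amount), 2) AS avg_amount
FROM orders
GROUP BY region
ORDER BY order_count DESC;
```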

Module 3: Use Apache Spark in Azure Databricks

Azure Databricks is built on Apache Spark and enables data engineers and analysts to run Spark jobs to transform, analyse, and visualise data at scale.

  • Introduction
  • Get to know Spark
  • Create a Spark cluster
  • Use Spark in notebooks
  • Use Spark to work with data files
  • Visualize data
  • Exercise - Use Spark in Azure Databricks
  • Module assessment
  • Summary
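
Working with data files, as covered above, is often a one-liner in Spark SQL, which can query supported file formats in place. A sketch assuming a hypothetical CSV file path:

```sql
-- Spark SQL can query a file directly, without creating a table first
SELECT * FROM csv.`/mnt/data/products.csv` LIMIT 5;

-- Or register the file as a table for repeated use
CREATE TABLE products
USING CSV
OPTIONS (path '/mnt/data/products.csv', header 'true');
```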

Module 4: Manage data with Delta Lake

Delta Lake is a data management layer in Azure Databricks that provides ACID transactions, schema enforcement, and time travel, ensuring data consistency, integrity, and versioning.

  • Introduction
  • Get started with Delta Lake
  • Create Delta tables
  • Implement schema enforcement
  • Data versioning and time travel in Delta Lake
  • Data integrity with Delta Lake
  • Exercise - Use Delta Lake in Azure Databricks
  • Module assessment
  • Summary
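
The versioning and time-travel features listed above are exposed directly in SQL. A minimal sketch, using a hypothetical `events` table (Delta is the default table format in Databricks):

```sql
-- Create a Delta table
CREATE TABLE events (id BIGINT, event_type STRING, ts TIMESTAMP);

-- Time travel: query an earlier version of the table
SELECT * FROM events VERSION AS OF 1;

-- Inspect the table's transaction history
DESCRIBE HISTORY events;
```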

Module 5: Build Lakeflow Declarative Pipelines

Building Lakeflow Declarative Pipelines enables real-time, scalable, and reliable data processing using Delta Lake's advanced features in Azure Databricks.

  • Introduction
  • Explore Lakeflow Declarative Pipelines
  • Data ingestion and integration
  • Real-time processing
  • Exercise - Create a Lakeflow Declarative Pipeline
  • Module assessment
  • Summary
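
The declarative style this module introduces means describing the target datasets and their quality rules in SQL rather than writing orchestration code. A hedged sketch of the pattern, with hypothetical table names and source path:

```sql
-- Incrementally ingest raw files into a streaming table
CREATE OR REFRESH STREAMING TABLE raw_events
AS SELECT * FROM STREAM read_files('/mnt/landing/events', format => 'json');

-- A downstream dataset with a data quality expectation:
-- rows failing the constraint are dropped rather than loaded
CREATE OR REFRESH MATERIALIZED VIEW clean_events (
  CONSTRAINT valid_id EXPECT (id IS NOT NULL) ON VIOLATION DROP ROW
)
AS SELECT * FROM raw_events;
```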

Module 6: Deploy workloads with Lakeflow Jobs

Deploying workloads with Lakeflow Jobs involves orchestrating and automating complex data processing pipelines, machine learning workflows, and analytics tasks. In this module, you learn how to deploy workloads with Databricks Lakeflow Jobs.

  • Introduction
  • What are Lakeflow Jobs?
  • Understand key components of Lakeflow Jobs
  • Explore the benefits of Lakeflow Jobs
  • Deploy workloads using Lakeflow Jobs
  • Exercise - Create a Lakeflow Job
  • Module assessment
  • Summary

Exams and assessments

There are no formal examinations within this course. Each module concludes with a review and summary following the hands-on lab, quiz, and slide-deck delivery, reinforcing learning and signposting additional resources for continued development.

Hands-on learning

Within this course there are opportunities for learners to engage in hands-on labs to support module learning.

In addition, each module includes a quiz to consolidate learning.

Databricks training partner

Maximise your data and AI potential with Databricks certified training. Bridge skills gaps across your organisation to accelerate data-driven innovation, by enabling teams to scale insights and deploy AI for business growth.

Need to know

Frequently asked questions

How can I create an account on myQA.com?

There are a number of ways to create an account. If you are a self-funder, simply select the "Create account" option on the login page.

If you have been booked onto a course by your company, you will receive a confirmation email. From this email, select "Sign into myQA" and you will be taken to the "Create account" page. Complete all of the details and select "Create account".

If you have the booking number you can also go here and select the "I have a booking number" option. Enter the booking reference and your surname. If the details match, you will be taken to the "Create account" page from where you can enter your details and confirm your account.

Find more answers to frequently asked questions in our FAQs: Bookings & Cancellations page.

How do QA’s virtual classroom courses work?

Our virtual classroom courses allow you to access award-winning classroom training, without leaving your home or office. Our learning professionals are specially trained on how to interact with remote attendees and our remote labs ensure all participants can take part in hands-on exercises wherever they are.

We use the WebEx video conferencing platform by Cisco. Before you book, check that you meet the WebEx system requirements and run a test meeting to ensure the software is compatible with your firewall settings. If it doesn’t work, try adjusting your settings or contact your IT department about permitting the website.

How do QA’s online courses work?

QA online courses, also commonly known as distance learning courses or elearning courses, take the form of interactive software designed for individual learning, but you will also have access to full support from our subject-matter experts for the duration of your course.

Once you have purchased the Online course and have completed your registration, you will receive the necessary details to enable you to immediately access it through our e-learning platform and you can start to learn straight away, from any compatible device. Access to the online learning platform is valid for one year from the booking date.

All courses are built around case studies and presented in an engaging format, which includes storytelling elements, video, audio and humour. Every case study is supported by sample documents and a collection of Knowledge Nuggets that provide more in-depth detail on the wider processes.

When will I receive my joining instructions?

Joining instructions for QA courses are sent two weeks prior to the course start date, or immediately if the booking is confirmed within this timeframe. For course bookings made via QA but delivered by a third-party supplier, joining instructions are sent to attendees prior to the training course, but timescales vary depending on each supplier’s terms.

When will I receive my certificate?

Certificates of Achievement are issued at the end of the course, either as a hard copy or via email.

Let's talk

A member of the team will contact you within 4 working hours after submitting the form.

By submitting this form, you agree to QA processing your data in accordance with our Privacy Policy and Terms & Conditions. You can unsubscribe at any time by clicking the link in our emails or contacting us directly.