Overview

This course covers methods and practices for implementing and managing enterprise-scale data analytics solutions using Microsoft Fabric. Students will build on existing analytics experience and will learn how to use Microsoft Fabric components, including lakehouses, data warehouses, notebooks, dataflows, data pipelines, and semantic models, to create and deploy analytics assets.
Audience Profile
The primary audience for this course is data professionals with experience in data modeling, extraction, and analytics. DP-600 is designed for professionals who want to use Microsoft Fabric to create and deploy enterprise-scale data analytics solutions.

Prerequisites

This course is best suited for those who have the PL-300 certification or similar expertise in using Power BI for data transformation, modeling, visualization, and sharing. Also, learners should have prior experience in building and deploying data analytics solutions at the enterprise level.

Outline

MODULE 1: Ingest Data with Dataflows Gen2 in Microsoft Fabric
Data ingestion is crucial in analytics. Microsoft Fabric's Data Factory offers Dataflows (Gen2) for visually creating multi-step data ingestion and transformation using Power Query Online.
Learning objectives
In this module, you'll learn how to:
  • Describe Dataflow (Gen2) capabilities in Microsoft Fabric
  • Create Dataflow (Gen2) solutions to ingest and transform data
  • Include a Dataflow (Gen2) in a pipeline
Module units:
  • Introduction
  • Understand Dataflows (Gen2) in Microsoft Fabric
  • Explore Dataflows (Gen2) in Microsoft Fabric
  • Integrate Dataflows (Gen2) and Pipelines in Microsoft Fabric
  • Exercise - Create and use a Dataflow (Gen2) in Microsoft Fabric
  • Knowledge check
  • Summary
MODULE 2: Ingest data with Spark and Microsoft Fabric notebooks
Discover how to use Apache Spark and Python to ingest data into a Microsoft Fabric lakehouse. Fabric notebooks provide a scalable, repeatable way to automate that ingestion; a brief PySpark sketch follows this module's outline.
Learning objectives
By the end of this module, you’ll be able to:
  • Ingest external data to Fabric lakehouses using Spark
  • Configure external source authentication and optimization
  • Load data into a lakehouse as files or as Delta tables
Module units:
  • Introduction
  • Connect to data with Spark
  • Write data into a lakehouse
  • Consider uses for ingested data
  • Exercise - Ingest data with Spark and Microsoft Fabric notebooks
  • Knowledge check
  • Summary
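To make the ingestion pattern concrete, here is a minimal PySpark sketch of the kind of code this module works toward: reading an external CSV file and landing it in a lakehouse as both files and a Delta table. The storage URL, paths, and table name are illustrative placeholders, not part of the course materials.
```python
# Minimal ingestion sketch for a Fabric notebook. The abfss:// URL, file layout,
# and table name below are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # Fabric notebooks pre-create `spark`

# Read from an external source; authentication (e.g. a service principal or
# SAS token) would be configured for the storage account in a real workspace.
df = (spark.read
      .format("csv")
      .option("header", "true")
      .option("inferSchema", "true")
      .load("abfss://data@contosostorage.dfs.core.windows.net/sales/orders.csv"))

# Option 1: land the raw data as files in the lakehouse.
df.write.mode("overwrite").parquet("Files/raw/orders")

# Option 2: save as a managed Delta table, queryable from SQL and Power BI.
df.write.mode("overwrite").format("delta").saveAsTable("orders_raw")
```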
MODULE 3: Use Data Factory pipelines in Microsoft Fabric
Microsoft Fabric includes Data Factory capabilities, among them the ability to create pipelines that orchestrate data ingestion and transformation tasks.
Learning objectives
In this module, you'll learn how to:
  • Describe pipeline capabilities in Microsoft Fabric
  • Use the Copy Data activity in a pipeline
  • Create pipelines based on predefined templates
  • Run and monitor pipelines
Module units:
  • Introduction
  • Understand pipelines
  • Use the Copy Data activity
  • Use pipeline templates
  • Run and monitor pipelines
  • Exercise - Ingest data with a pipeline
  • Knowledge check
  • Summary
MODULE 5: Get started with lakehouses in Microsoft Fabric
Lakehouses merge the flexibility of data lake storage with the analytical capabilities of a data warehouse. Microsoft Fabric offers a lakehouse solution for comprehensive analytics on a single SaaS platform; a short query example follows this module's outline.
Learning objectives
In this module, you'll learn how to:
  • Describe core features and capabilities of lakehouses in Microsoft Fabric
  • Create a lakehouse
  • Ingest data into files and tables in a lakehouse
  • Query lakehouse tables with SQL
Module units:
  • Introduction
  • Explore the Microsoft Fabric Lakehouse
  • Work with Microsoft Fabric Lakehouses
  • Explore and transform data in a lakehouse
  • Exercise - Create and ingest data with a Microsoft Fabric Lakehouse
  • Knowledge check
  • Summary
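As a taste of the module's final objective, querying lakehouse tables with SQL, here is a small sketch. It assumes a notebook attached to a lakehouse that already contains a `sales` table; the table and column names are invented for illustration.
```python
# Querying a lakehouse table with Spark SQL; `sales` is a hypothetical table
# in the notebook's attached lakehouse.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

top_items = spark.sql("""
    SELECT Item, SUM(Quantity) AS TotalQty
    FROM sales
    GROUP BY Item
    ORDER BY TotalQty DESC
    LIMIT 10
""")
top_items.show()
```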
MODULE 6: Organize a Fabric lakehouse using medallion architecture design
Explore the potential of the medallion architecture design in Microsoft Fabric. Organize and transform your data across the Bronze, Silver, and Gold layers of a lakehouse for optimized analytics; a sample Bronze-to-Silver transformation follows the outline.
Learning objectives
In this module, you'll learn how to:
  • Describe the principles of using the medallion architecture in data management.
  • Apply the medallion architecture framework within the Microsoft Fabric environment.
  • Analyze data stored in the lakehouse using Direct Lake in Power BI.
  • Describe best practices for ensuring the security and governance of data stored in the medallion architecture.
Module units:
  • Introduction
  • Describe medallion architecture
  • Implement a medallion architecture in Fabric
  • Query and report on data in your Fabric lakehouse
  • Considerations for managing your lakehouse
  • Exercise - Organize your Fabric lakehouse using a medallion architecture
  • Knowledge check
  • Summary
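The following sketch illustrates one hop of the medallion flow the module describes: promoting data from a Bronze table to a Silver table by de-duplicating, validating, and standardizing it. All table and column names are hypothetical.
```python
# One Bronze-to-Silver hop; table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze_orders")

silver = (bronze
          .dropDuplicates(["OrderID"])                       # remove duplicate keys
          .filter(F.col("Quantity") > 0)                     # drop invalid rows
          .withColumn("OrderDate", F.to_date("OrderDate"))   # standardize types
          .withColumn("IngestedAt", F.current_timestamp()))  # add audit metadata

silver.write.mode("overwrite").format("delta").saveAsTable("silver_orders")
```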
MODULE 7: Use Apache Spark in Microsoft Fabric
Apache Spark is a core technology for large-scale data analytics. Microsoft Fabric provides support for Spark clusters, enabling you to analyze and process data in a lakehouse at scale; a short dataframe example follows the outline below.
Learning objectives
In this module, you'll learn how to:
  • Configure Spark in a Microsoft Fabric workspace
  • Identify suitable scenarios for Spark notebooks and Spark jobs
  • Use Spark dataframes to analyze and transform data
  • Use Spark SQL to query data in tables and views
  • Visualize data in a Spark notebook
Module units:
  • Introduction
  • Prepare to use Apache Spark
  • Run Spark code
  • Work with data in a Spark dataframe
  • Work with data using Spark SQL
  • Visualize data in a Spark notebook
  • Exercise - Analyze data with Apache Spark
  • Knowledge check
  • Summary
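A brief example of the dataframe workflow this module covers: filter and aggregate with the dataframe API, then pull the small result to pandas for an inline chart. The `silver_orders` table and its columns are assumptions carried over from the earlier sketch, not course assets.
```python
# Analyze with the dataframe API, then chart a small aggregate inline.
# `silver_orders` and its columns are hypothetical.
import matplotlib.pyplot as plt
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("silver_orders")

monthly = (orders
           .withColumn("Month", F.date_format("OrderDate", "yyyy-MM"))
           .groupBy("Month")
           .agg(F.sum("Quantity").alias("TotalQty"))
           .orderBy("Month"))

# The aggregate is small, so it is safe to collect to pandas for plotting.
monthly.toPandas().plot(x="Month", y="TotalQty", kind="bar")
plt.show()
```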
MODULE 8: Work with Delta Lake tables in Microsoft Fabric
Tables in a Microsoft Fabric lakehouse are based on the Delta Lake storage format commonly used in Apache Spark. By using the enhanced capabilities of delta tables, you can create advanced analytics solutions; the sketch after the unit list shows a few of those capabilities.
Learning objectives
In this module, you'll learn how to:
  • Understand Delta Lake and delta tables in Microsoft Fabric
  • Create and manage delta tables using Spark
  • Use Spark to query and transform data in delta tables
  • Use delta tables with Spark structured streaming
Module units:
  • Introduction
  • Understand Delta Lake
  • Create delta tables
  • Work with delta tables in Spark
  • Use delta tables with streaming data
  • Exercise - Use delta tables in Apache Spark
  • Knowledge check
  • Summary
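To show what "enhanced capabilities" means in practice, this sketch touches three delta-specific features named in the unit list: table history and time travel, in-place updates through the `DeltaTable` API, and using a delta table as a structured streaming source. Table names and the checkpoint path are placeholders.
```python
# Three delta-specific features; table names and checkpoint path are hypothetical.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# 1. Inspect the transaction log, then time travel to an earlier version.
spark.sql("DESCRIBE HISTORY orders_raw").show(truncate=False)
v0 = spark.read.option("versionAsOf", 0).table("orders_raw")

# 2. Update rows in place through the DeltaTable API.
dt = DeltaTable.forName(spark, "orders_raw")
dt.update(condition="Quantity < 0", set={"Quantity": "0"})

# 3. Use the delta table as a streaming source feeding a downstream table.
#    (This starts a continuously running query; stop it with `stream.stop()`.)
stream = (spark.readStream.table("orders_raw")
          .writeStream
          .format("delta")
          .outputMode("append")
          .option("checkpointLocation", "Files/checkpoints/orders_stream")
          .toTable("orders_stream"))
```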
MODULE 9: Get started with data warehouses in Microsoft Fabric
Data warehouses are analytical stores built on a relational schema to support SQL queries. Microsoft Fabric enables you to create a relational data warehouse in your workspace and integrate it easily with other elements of your end-to-end analytics solution.
Learning objectives
In this module, you'll learn how to:
  • Describe data warehouses in Fabric
  • Understand the differences between a data warehouse and a lakehouse
  • Work with data warehouses in Fabric
  • Create and manage datasets within a data warehouse
Module units:
  • Introduction
  • Understand data warehouse fundamentals
  • Understand data warehouses in Fabric
  • Query and transform data
  • Prepare data for analysis and reporting
  • Secure and monitor your data warehouse
  • Exercise - Analyze data in a data warehouse
  • Knowledge check
  • Summary
MODULE 10: Load data into a Microsoft Fabric data warehouse
The data warehouse in Microsoft Fabric is a comprehensive platform for data and analytics, featuring advanced query processing and full transactional T-SQL capabilities for easy data management and analysis. An illustrative loading example appears after the unit list.
Learning objectives
In this module, you'll learn how to:
  • Use different strategies to load data into a data warehouse in Microsoft Fabric.
  • Build a data pipeline to load a warehouse in Microsoft Fabric.
  • Load data into a warehouse using T-SQL.
  • Load and transform data with Dataflow Gen2.
Module units:
  • Introduction
  • Explore data load strategies
  • Use data pipelines to load a warehouse
  • Load data using T-SQL
  • Load and transform data with Dataflow Gen2
  • Exercise: Load data into a warehouse in Microsoft Fabric
  • Knowledge check
  • Summary
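As an illustration of the T-SQL loading path, the sketch below runs a COPY INTO bulk load against a warehouse's SQL endpoint from Python via pyodbc. The server name, database, storage URL, and table names are placeholders; the exact connection details come from your warehouse's SQL connection string.
```python
# Bulk-loading a Fabric warehouse with COPY INTO over its SQL endpoint.
# Server, database, storage URL, and table names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace>.datawarehouse.fabric.microsoft.com;"  # from the SQL connection string
    "Database=SalesDW;"
    "Authentication=ActiveDirectoryInteractive;",             # Microsoft Entra sign-in
    autocommit=True,
)

# Set-based bulk load from external storage (public here; private storage
# would additionally need a CREDENTIAL clause).
conn.execute("""
    COPY INTO dbo.StageOrders
    FROM 'https://contosostorage.blob.core.windows.net/data/orders/*.csv'
    WITH (FILE_TYPE = 'CSV', FIRSTROW = 2)
""")

# A follow-up T-SQL step reshapes staged rows into the dimensional model.
conn.execute("""
    INSERT INTO dbo.FactOrders (OrderKey, ItemKey, Quantity)
    SELECT OrderID, ItemID, Quantity FROM dbo.StageOrders
""")
conn.close()
```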
MODULE 11: Query a data warehouse in Microsoft Fabric
Once data is loaded, the data warehouse in Microsoft Fabric can be queried through several tools. This module covers the built-in SQL query editor, the visual query editor, and client tools such as SQL Server Management Studio; a small client-tool example follows the outline.
Learning objectives
In this module, you'll learn how to:
  • Query a data warehouse using the SQL query editor.
  • Explore how the visual query editor works.
  • Connect to and query a data warehouse using SQL Server Management Studio.
Module units:
  • Introduction
  • Use the SQL query editor
  • Explore the visual query editor
  • Use client tools to query a warehouse
  • Exercise: Query a data warehouse in Microsoft Fabric
  • Knowledge check
  • Summary
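The "client tools" objective generalizes beyond SSMS: anything that speaks ODBC can query the warehouse. As a hedged example, here is the same idea from Python with pyodbc and pandas; endpoint, database, and table names are placeholders.
```python
# Querying the warehouse from an ODBC client; names are hypothetical.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace>.datawarehouse.fabric.microsoft.com;"
    "Database=SalesDW;"
    "Authentication=ActiveDirectoryInteractive;"
)

df = pd.read_sql(
    "SELECT TOP 10 ItemKey, SUM(Quantity) AS TotalQty "
    "FROM dbo.FactOrders GROUP BY ItemKey ORDER BY TotalQty DESC",
    conn,
)
print(df)
conn.close()
```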
MODULE 12: Monitor a Microsoft Fabric data warehouse
A data warehouse is a vital component of an enterprise analytics solution. It's important to learn how to monitor a data warehouse so you can better understand the activity that occurs in it; the sketch after the unit list shows two monitoring queries.
Learning objectives
After completing this module, you'll be able to:
  • Monitor capacity unit usage with the Microsoft Fabric Capacity Metrics app.
  • Monitor current activity in the data warehouse with dynamic management views.
  • Monitor querying trends with query insights views.
Module units:
  • Introduction
  • Monitor capacity metrics
  • Monitor current activity
  • Monitor queries
  • Exercise - Monitor a data warehouse in Microsoft Fabric
  • Knowledge check
  • Summary
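A sketch of the two query-based monitoring techniques in this module: dynamic management views for live activity and query insights views for historical trends. Connection details are placeholders, and the queryinsights view and column names follow the Fabric documentation at the time of writing, so verify them against your warehouse.
```python
# Monitoring queries: live activity via DMVs, history via query insights.
# Connection details are hypothetical; verify view names in your warehouse.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace>.datawarehouse.fabric.microsoft.com;"
    "Database=SalesDW;"
    "Authentication=ActiveDirectoryInteractive;"
)
cur = conn.cursor()

# Live activity: who is connected and what is currently running.
cur.execute("""
    SELECT s.session_id, s.login_name, r.command, r.start_time, r.status
    FROM sys.dm_exec_sessions AS s
    JOIN sys.dm_exec_requests AS r ON r.session_id = s.session_id
""")
for row in cur.fetchall():
    print(row)

# Historical trends: completed requests captured by query insights.
cur.execute("""
    SELECT TOP 10 *
    FROM queryinsights.exec_requests_history
    ORDER BY total_elapsed_time_ms DESC
""")
for row in cur.fetchall():
    print(row)
conn.close()
```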
MODULE 13: Understand scalability in Power BI
Scalable data models enable enterprise-scale analytics in Power BI. Implement data modeling best practices, use the large dataset storage format, and practice building a star schema to design analytics solutions that can scale.
Learning objectives
By the end of this module, you’ll be able to:
  • Describe the importance of building scalable data models
  • Implement Power BI data modeling best practices
  • Use the Power BI large dataset storage format
Module units:
  • Introduction
  • Describe the significance of scalable models
  • Implement Power BI data modeling best practices
  • Configure large datasets
  • Exercise: Create a star schema model
  • Knowledge check
  • Summary
MODULE 14: Create Power BI model relationships
Power BI model relationships form the basis of a tabular model. Learn how to define and set up model relationships, use DAX relationship functions, and understand how relationships are evaluated; a small DAX example follows this module's outline.
Learning objectives
By the end of this module, you’ll be able to:
  • Understand how model relationships work.
  • Set up relationships.
  • Use DAX relationship functions.
  • Understand relationship evaluation.
Module units:
  • Introduction
  • Understand model relationships
  • Set up relationships
  • Use DAX relationship functions
  • Understand relationship evaluation
  • Exercise: Work with model relationships
  • Knowledge check
  • Summary
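DAX relationship functions are easiest to grasp in a runnable query. Assuming a Fabric notebook with the semantic-link (sempy) package available, the sketch below uses `evaluate_dax` to run a query that activates an inactive relationship with USERELATIONSHIP. The dataset, tables, and columns are invented for illustration.
```python
# Running a DAX query with USERELATIONSHIP from a Fabric notebook via
# semantic link. Dataset, table, and column names are hypothetical.
import sempy.fabric as fabric

dax = """
EVALUATE
SUMMARIZECOLUMNS(
    'Date'[Year],
    "Sales by ship date",
        CALCULATE(
            SUM(Sales[Amount]),
            USERELATIONSHIP(Sales[ShipDateKey], 'Date'[DateKey])
        )
)
"""

# evaluate_dax returns the result as a FabricDataFrame.
result = fabric.evaluate_dax(dataset="Sales Model", dax_string=dax)
print(result)
```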
MODULE 15: Use tools to optimize Power BI performance
Use tools to develop, manage, and optimize Power BI data model and DAX query performance.
Learning objectives
After completing this module, you'll be able to:
  • Optimize queries using Performance Analyzer.
  • Troubleshoot DAX performance using DAX Studio.
  • Optimize a data model using Best Practice Analyzer in Tabular Editor.
Module units:
  • Introduction
  • Use Performance Analyzer
  • Troubleshoot DAX performance by using DAX Studio
  • Optimize a data model by using Best Practice Analyzer
  • Exercise: Use tools to optimize Power BI performance
  • Knowledge check
  • Summary
MODULE 16: Enforce Power BI model security
Enforce model security in Power BI using row-level security (RLS) and object-level security (OLS).
Learning objectives
By the end of this module, you’ll be able to:
  • Restrict access to Power BI model data with RLS.
  • Restrict access to Power BI model objects with OLS.
  • Apply good development practices to enforce Power BI model security.
Module units:
  • Introduction
  • Restrict access to Power BI model data
  • Restrict access to Power BI model objects
  • Apply good modeling practices
  • Exercise: Enforce model security
  • Knowledge check
  • Summary

Frequently asked questions

How can I create an account on myQA.com?

There are a number of ways to create an account. If you are a self-funder, simply select the "Create account" option on the login page.

If you have been booked onto a course by your company, you will receive a confirmation email. From this email, select "Sign into myQA" and you will be taken to the "Create account" page. Complete all of the details and select "Create account".

If you have the booking number, you can also go here and select the "I have a booking number" option. Enter the booking reference and your surname. If the details match, you will be taken to the "Create account" page from where you can enter your details and confirm your account.

Find more answers to frequently asked questions in our FAQs: Bookings & Cancellations page.

How do QA’s virtual classroom courses work?

Our virtual classroom courses allow you to access award-winning classroom training, without leaving your home or office. Our learning professionals are specially trained on how to interact with remote attendees and our remote labs ensure all participants can take part in hands-on exercises wherever they are.

We use the WebEx video conferencing platform by Cisco. Before you book, check that you meet the WebEx system requirements and run a test meeting (more details in the link below) to ensure the software is compatible with your firewall settings. If it doesn’t work, try adjusting your settings or contact your IT department about permitting the website.

Learn more about our Virtual Classrooms.

How do QA’s online courses work?

QA online courses, also commonly known as distance learning or e-learning courses, take the form of interactive software designed for individual learning, and you will also have access to full support from our subject-matter experts for the duration of your course. When you book a QA online learning course you will receive immediate access to it through our e-learning platform and you can start to learn straight away, from any compatible device. Access to the online learning platform is valid for one year from the booking date.

All courses are built around case studies and presented in an engaging format, which includes storytelling elements, video, audio and humour. Every case study is supported by sample documents and a collection of Knowledge Nuggets that provide more in-depth detail on the wider processes.

Learn more about QA’s online courses.

When will I receive my joining instructions?

Joining instructions for QA courses are sent two weeks prior to the course start date, or immediately if the booking is confirmed within this timeframe. For course bookings made via QA but delivered by a third-party supplier, joining instructions are sent to attendees prior to the training course, but timescales vary depending on each supplier's terms.

When will I receive my certificate?

Certificates of Achievement are issued at the end of the course, either as a hard copy or via email.

Contact Us

Please contact us for more information