DP-700: Microsoft Fabric Data Engineer

$2,950.00

Microsoft Fabric Data Engineer

This 4-day course provides practical training on implementing data engineering solutions using Microsoft Fabric. Participants will explore how to design and build scalable data architectures, implement efficient data loading patterns, and orchestrate end-to-end workflows. Core focus areas include ingesting and transforming data, along with securing, managing, and monitoring Fabric-based data engineering environments.

Outcome

By the end of the course, learners will be able to develop and deploy enterprise-ready data engineering solutions within Microsoft Fabric. They will gain proficiency in orchestrating complex data pipelines, managing storage and compute layers, and applying governance and security best practices.

Audience Profile

This course is intended for data professionals involved in data extraction, transformation, loading, and orchestration. It is ideal for individuals responsible for building scalable data engineering solutions within enterprise analytics environments using Microsoft Fabric.

Prerequisites

Participants should have:

  • Experience in data integration and orchestration, ideally with DP-203: Azure Data Engineer certification or equivalent skills
  • Hands-on proficiency in at least one programming/query language:
      • SQL
      • PySpark
      • Kusto Query Language (KQL)

On-site Training?

If you need training for three or more people, ask us about training at your site. You can enjoy the convenience of reduced travel cost and time, as well as a familiar environment for your staff. Additionally, we can customise the course for your business needs.

Cancellation Policy

To cancel or reschedule, please contact us at least 10 days before the course begins.

Contact Details

0410077106
fusman@technisaur.com.au
Melbourne VIC, Australia

Modules

Module 1: Ingest Data with Dataflows Gen2 in Microsoft Fabric

Data ingestion is a vital step in any analytics process. In Microsoft Fabric, Data Factory provides Dataflows Gen2, which enable you to visually design multi-step data ingestion and transformation pipelines using Power Query Online.

Learning objectives
By the end of this module, you’ll be able to:

  • Describe the capabilities of Dataflows in Microsoft Fabric.
  • Build Dataflow solutions to ingest and transform data.
  • Integrate a Dataflow into a pipeline.

Module 2: Orchestrate processes and data movement with Microsoft Fabric

Microsoft Fabric extends its Data Factory capabilities to allow you to design and manage pipelines that orchestrate both data ingestion and transformation tasks across your environment.

Learning objectives
In this module, you’ll learn how to:

  • Explain pipeline capabilities in Microsoft Fabric.
  • Use the Copy Data activity within a pipeline.
  • Create pipelines from predefined templates.
  • Run and monitor pipelines effectively.

Module 3: Use Apache Spark in Microsoft Fabric

Apache Spark is a powerful framework for large-scale data processing and analytics. Microsoft Fabric supports Spark clusters, giving you the ability to analyse and transform data stored in a Lakehouse at scale.

Learning objectives
By completing this module, you’ll be able to:

  • Configure Spark in a Microsoft Fabric workspace.
  • Identify appropriate scenarios for using Spark notebooks and Spark jobs.
  • Use Spark dataframes for analysis and transformation.
  • Query tables and views with Spark SQL.
  • Visualise data within a Spark notebook.
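
To make these objectives concrete, here is a minimal, illustrative PySpark sketch of the kind of notebook work this module covers. It assumes a lakehouse containing a hypothetical Files/sales.csv file with made-up column names; the `spark` session is provided automatically by a Fabric notebook.

```python
# Illustrative only: file path and column names are hypothetical.
from pyspark.sql import functions as F

# Load a CSV file from the lakehouse Files area into a dataframe
df = spark.read.format("csv").option("header", "true").load("Files/sales.csv")

# Transform: cast the revenue column and aggregate per category
summary = (
    df.withColumn("Revenue", F.col("Revenue").cast("double"))
      .groupBy("Category")
      .agg(F.sum("Revenue").alias("TotalRevenue"))
)

# Register a temporary view and query it with Spark SQL
summary.createOrReplaceTempView("sales_summary")
spark.sql(
    "SELECT Category, TotalRevenue FROM sales_summary ORDER BY TotalRevenue DESC"
).show()
```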

Module 4: Get started with Real-Time Intelligence in Microsoft Fabric

Modern data solutions rely heavily on the ability to process and act on real-time data. With Microsoft Fabric’s Real-Time Intelligence (RTI), you can ingest, query, and analyse streams of data as they happen.

Learning objectives
In this module, you’ll explore how to:

  • Capture and ingest real-time event data.
  • Query and process data streams in Fabric.
  • Use real-time insights to support timely decision-making.

Module 5: Use real-time eventstreams in Microsoft Fabric

Eventstreams in Microsoft Fabric, part of its Real-Time Intelligence capability, provide a straightforward way to capture, transform, and route live data for analytics or operational needs.

Learning objectives
After completing this module, you’ll be able to:

  • Configure sources and destinations in Microsoft Fabric Eventstreams.
  • Capture, transform, and direct event data through Eventstreams.

Module 6: Work with real-time data in a Microsoft Fabric eventhouse

An eventhouse in Microsoft Fabric provides a scalable and high-performance environment for storing and querying real-time data. With built-in Kusto Query Language (KQL) support, you can interrogate fast-moving datasets with ease while also deriving persistent insights through materialised views.

Learning objectives
By the end of this module, you’ll be able to:

  • Create an eventhouse in Microsoft Fabric.
  • Query real-time data using Kusto Query Language (KQL).
  • Build materialised views and stored functions within a KQL database.

Module 7: Introduction to end-to-end analytics using Microsoft Fabric

Microsoft Fabric delivers an integrated analytics platform that unifies ingestion, storage, processing, and visualisation. This module introduces the full analytics lifecycle within Fabric and positions its core components within a cohesive architectural flow.

Learning objectives
In this module, you’ll learn how to:

  • Describe how Microsoft Fabric enables end-to-end analytics across the data lifecycle.

Module 8: Get started with lakehouses in Microsoft Fabric

Lakehouses provide the flexibility of data lakes with the structure and performance of data warehouses. In Microsoft Fabric, lakehouses act as central analytical stores capable of housing both files and tables.

Learning objectives
After completing this module, you’ll be able to:

  • Explain the key features and capabilities of lakehouses in Fabric.
  • Create a lakehouse environment.
  • Ingest data into files and tables within a lakehouse.
  • Query lakehouse tables using SQL.
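
As a rough sketch of the ingest-and-query flow above, a notebook attached to the lakehouse might do something like the following. The file path, table name, and columns are hypothetical.

```python
# Assumes a Fabric notebook attached to a lakehouse; `spark` comes from the session.
# "Files/raw/orders.csv" and the "orders" table are hypothetical names.

# 1. Ingest: read a raw file from the lakehouse Files section
orders_df = spark.read.option("header", "true").csv("Files/raw/orders.csv")

# 2. Load: save the dataframe as a managed lakehouse table (stored in Delta format)
orders_df.write.mode("overwrite").saveAsTable("orders")

# 3. Query: the table is now available to SQL
spark.sql("SELECT COUNT(*) AS order_count FROM orders").show()
```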

Module 9: Use Apache Spark in Microsoft Fabric

Apache Spark underpins large-scale processing within Fabric, enabling both batch and interactive workloads. This module builds on earlier concepts with a deeper focus on applying Spark for data engineering and advanced analytics.

Learning objectives
In this module, you’ll learn how to:

  • Configure Spark within a Fabric workspace.
  • Select appropriate scenarios for Spark notebooks and jobs.
  • Manipulate data using Spark dataframes.
  • Query structured data with Spark SQL.
  • Visualise your findings directly within a Spark notebook.
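
A small, illustrative example of the Spark SQL and notebook visualisation objectives. It assumes a hypothetical lakehouse table named orders with a date-typed OrderDate column and an Amount column; matplotlib is used here as one common plotting option in notebooks.

```python
# Illustrative only: the "orders" table and its columns are hypothetical.
import matplotlib.pyplot as plt

# Query structured data with Spark SQL (the `spark` session is provided by the notebook)
monthly = spark.sql("""
    SELECT date_format(OrderDate, 'yyyy-MM') AS Month, SUM(Amount) AS Total
    FROM orders
    GROUP BY date_format(OrderDate, 'yyyy-MM')
    ORDER BY Month
""")

# Convert to pandas and plot the result inline in the notebook
pdf = monthly.toPandas()
plt.bar(pdf["Month"], pdf["Total"])
plt.xlabel("Month")
plt.ylabel("Total amount")
plt.title("Monthly order totals")
plt.show()
```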

Module 10: Work with Delta Lake tables in Microsoft Fabric

Delta Lake tables provide ACID-compliant storage with support for versioning, time travel, and scalable streaming integration. Microsoft Fabric leverages Delta tables to power reliable and high-performance analytics in lakehouses.

Learning objectives
By the end of this module, you’ll be able to:

  • Explain the concept of Delta Lake and its role within Fabric.
  • Create and manage Delta tables using Spark.
  • Optimise Delta tables for performance and reliability.
  • Query and transform Delta tables using Spark.
  • Use Delta tables with Spark structured streaming.
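
The sketch below illustrates a few of these Delta operations in PySpark and Spark SQL. The products table and its columns are hypothetical, and the snippet is a simplified outline rather than a full lab exercise.

```python
# Illustrative only: table name and columns are hypothetical.
from pyspark.sql import Row

# Create a managed Delta table from a small dataframe
products = spark.createDataFrame([
    Row(ProductID=1, Name="Bike", Price=250.0),
    Row(ProductID=2, Name="Helmet", Price=49.0),
])
products.write.format("delta").mode("overwrite").saveAsTable("products")

# Update the table, which creates a new table version
spark.sql("UPDATE products SET Price = 59.0 WHERE ProductID = 2")

# Time travel: query the earlier version of the table
spark.sql("SELECT * FROM products VERSION AS OF 0").show()

# Inspect version history and compact small files for performance
spark.sql("DESCRIBE HISTORY products").show(truncate=False)
spark.sql("OPTIMIZE products").show()
```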

Module 11: Ingest Data with Dataflows Gen2 in Microsoft Fabric

Dataflows Gen2 provides a visual, low-code interface for building repeatable data ingestion and transformation logic using Power Query Online. By using Dataflows within Microsoft Fabric, you can deliver structured data pipelines without needing to write code.

Learning objectives
In this module, you’ll learn how to:

  • Describe the capabilities of Dataflows in Microsoft Fabric.
  • Build Dataflow solutions to ingest and transform data.
  • Embed a Dataflow within a pipeline for operational execution.

Module 12: Orchestrate processes and data movement with Microsoft Fabric

Beyond ingestion, data engineering often requires coordinating activities across systems. Microsoft Fabric pipelines provide orchestration capabilities for managing dependencies, scheduling workloads, and tracking execution.

Learning objectives
By completing this module, you’ll be able to:

  • Describe pipeline capabilities within Microsoft Fabric.
  • Implement the Copy Data activity in a pipeline.
  • Create pipelines using ready-made templates.
  • Execute and monitor pipeline runs.

Module 13: Organise a Fabric lakehouse using medallion architecture design

The medallion architecture (Bronze, Silver, Gold layers) promotes structured refinement of data for analytics. This module explores how Fabric lakehouses can be organised using this approach to ensure clarity, performance, and governance.

Learning objectives
In this module, you’ll learn how to:

  • Explain the core principles of medallion architecture.
  • Apply Bronze, Silver, and Gold layering in a Fabric lakehouse.
  • Analyse data using Power BI in Direct Lake mode.
  • Implement best practices for securing and governing medallion-layered data.
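
A simplified sketch of a Bronze/Silver/Gold flow in a Fabric notebook is shown below. The source file, table names, and cleansing rules are hypothetical and only indicate the shape of the pattern.

```python
# Hypothetical medallion flow; names and rules are illustrative.
from pyspark.sql import functions as F

# Bronze: raw data landed as-is from the source system
bronze = spark.read.option("header", "true").csv("Files/landing/transactions.csv")
bronze.write.mode("overwrite").saveAsTable("bronze_transactions")

# Silver: cleaned and conformed (typed columns, duplicates removed)
silver = (
    spark.table("bronze_transactions")
         .withColumn("Amount", F.col("Amount").cast("double"))
         .withColumn("TransactionDate", F.to_date("TransactionDate"))
         .dropDuplicates(["TransactionID"])
)
silver.write.mode("overwrite").saveAsTable("silver_transactions")

# Gold: business-level aggregate ready for reporting (e.g. via Direct Lake)
gold = (
    spark.table("silver_transactions")
         .groupBy("TransactionDate")
         .agg(F.sum("Amount").alias("DailyRevenue"))
)
gold.write.mode("overwrite").saveAsTable("gold_daily_revenue")
```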

Module 14: Get started with Real-Time Intelligence in Microsoft Fabric

Microsoft Fabric’s Real-Time Intelligence capabilities enable continuous ingestion and processing of live event data. Whether monitoring devices, applications, or business processes, you can gain insights as events unfold.

Learning objectives
By completing this module, you’ll be able to:

  • Capture and process live data streams using Real-Time Intelligence features.
  • Visualise real-time insights for operational decision-making.
  • Trigger actions based on streaming conditions.

Module 15: Use real-time eventstreams in Microsoft Fabric

Eventstreams allow you to define pipelines specifically for real-time data. They support ingestion from event sources such as IoT devices or message hubs, transforming and routing them for immediate consumption.

Learning objectives
In this module, you’ll learn how to:

  • Configure sources and destinations within Microsoft Fabric Eventstreams.
  • Capture, transform, and distribute event data in motion.

Module 16: Work with real-time data in a Microsoft Fabric eventhouse

An eventhouse acts as a dedicated analytical store for real-time and time-series data. It enables high-speed querying, long-term retention, and seamless integration with streaming sources — making it ideal for operational intelligence scenarios.

Learning objectives
After completing this module, you’ll be able to:

  • Create an eventhouse in Microsoft Fabric.
  • Query real-time data using Kusto Query Language (KQL).
  • Build materialised views and stored functions within a KQL database.

Module 17: Create Real-Time Dashboards with Microsoft Fabric

Real-Time Dashboards allow you to instantly surface live insights from continuous data streams. These dashboards provide responsive visualisations for operational monitoring and decision-making.

Learning objectives
By the end of this module, you’ll be able to:

  • Build a real-time dashboard in Microsoft Fabric.
  • Use advanced dashboard features for interactivity and drill-through.
  • Apply best practices for performance and usability in real-time reporting.

Module 18: Introduction to end-to-end analytics using Microsoft Fabric

Microsoft Fabric brings together ingestion, storage, transformation, and visualisation into a single analytics platform. This module revisits the full lifecycle from a strategic perspective, reinforcing how Fabric components interact as a cohesive solution.

Learning objectives
In this module, you’ll learn how to:

  • Describe how Microsoft Fabric supports end-to-end analytics across the data ecosystem.

Module 19: Get started with data warehouses in Microsoft Fabric

Fabric data warehouses provide structured relational storage optimised for SQL-based analytics. They integrate closely with other Fabric components, enabling hybrid solutions alongside lakehouses and real-time systems.

Learning objectives
After completing this module, you’ll be able to:

  • Describe the role and capabilities of data warehouses in Fabric.
  • Differentiate between a data warehouse and a lakehouse.
  • Create and manage data warehouses in Fabric.
  • Build and maintain fact and dimension tables.

Module 20: Load data into a Microsoft Fabric data warehouse

Loading data efficiently is critical to warehouse performance. Fabric provides multiple ways to populate warehouses, from T-SQL operations to pipelines and Dataflow (Gen2) transformations.

Learning objectives
In this module, you’ll learn how to:

  • Apply different strategies for loading data into a Fabric data warehouse.
  • Build data pipelines to automate warehouse ingestion.
  • Load data using T-SQL commands.
  • Ingest and transform data with Dataflow Gen2.

Module 21: Query a data warehouse in Microsoft Fabric

Once data is loaded into a Fabric data warehouse, the next step is exploration and analysis. This module focuses on querying techniques using both built-in tools and external clients.

Learning objectives
By the end of this module, you’ll be able to:

  • Use the SQL query editor to explore data within a warehouse.
  • Understand how the visual query editor operates.
  • Connect and query a warehouse using SQL Server Management Studio (SSMS).

Module 22: Monitor a Microsoft Fabric data warehouse

Visibility into warehouse activity is essential for optimisation and governance. Microsoft Fabric provides monitoring tools to track performance, capacity, and user workloads.

Learning objectives
In this module, you’ll learn how to:

  • Monitor capacity usage with the Fabric Capacity Metrics app.
  • Inspect current activity using dynamic management views.
  • Analyse query behaviour with query insights reports.

Module 23: Secure a Microsoft Fabric data warehouse

Security is fundamental to any enterprise analytics solution. Fabric provides multiple layers of protection, enabling fine-grained control over data visibility and access.

Learning objectives
After completing this module, you’ll be able to:

  • Understand the principles of securing a Fabric data warehouse.
  • Apply dynamic data masking to hide sensitive information.
  • Configure row-level security for granular access control.
  • Implement column-level security to protect critical attributes.
  • Grant permissions using T-SQL-based role assignments.

Module 24: Implement continuous integration and continuous delivery (CI/CD) in Microsoft Fabric

Fabric supports modern development practices through Git integration and deployment pipelines. These features streamline collaboration and automate the delivery of analytics content.

Learning objectives
In this module, you’ll learn how to:

  • Define CI/CD in the context of Microsoft Fabric.
  • Implement source control using Git integration.
  • Use deployment pipelines to promote changes across environments.
  • Automate CI/CD processes using Fabric APIs.
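
The final objective refers to the Fabric REST APIs. As a hedged illustration of calling that API from Python, the sketch below simply lists workspaces using a pre-acquired Microsoft Entra access token; real CI/CD automation would call the deployment pipeline endpoints, which are not shown here.

```python
# Minimal sketch: a plain HTTPS call to the Fabric REST API.
# Token acquisition and deployment-pipeline-specific calls are omitted.
import requests

ACCESS_TOKEN = "<Microsoft Entra access token for the Fabric API>"  # placeholder

response = requests.get(
    "https://api.fabric.microsoft.com/v1/workspaces",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()

# List the workspaces the caller can see
for workspace in response.json().get("value", []):
    print(workspace["id"], workspace["displayName"])
```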

Module 25: Monitor activities in Microsoft Fabric

Operational monitoring helps ensure that analytics processes remain healthy and responsive. Fabric provides centralised tools for tracking activity and reacting to defined conditions.

Learning objectives
By the end of this module, you’ll be able to:

  • Apply monitoring principles to Microsoft Fabric workloads.
  • Use Monitoring Hub to observe system-wide activity.
  • Trigger automated actions using Fabric Activator.

Module 26: Secure data access in Microsoft Fabric

Fabric applies security at multiple layers — from workspace-level permissions to item-specific access. This module covers best practices for applying controlled access models.

Learning objectives
In this module, you’ll learn how to:

  • Describe the Fabric permissions model.
  • Configure permissions at workspace and item level.
  • Apply granular access controls across Fabric assets.

Module 27: Administer a Microsoft Fabric environment

Administrators play a key role in shaping governance, security, and operational efficiency. Fabric provides a dedicated admin centre for managing capacity, access, and policies.

Learning objectives
After completing this module, you’ll be able to:

  • Describe core Fabric administration responsibilities.
  • Navigate and configure the Fabric admin centre.
  • Manage user access and roles.
  • Govern data and resources across the Fabric environment.


A 4-day hands-on course for data engineers to design, build, and orchestrate data engineering solutions using Microsoft Fabric. Sessions run from 9 am to 5 pm.