DP-700 Study Guide (Implementing Data Engineering Solutions Using Microsoft Fabric)

DP-700 Preparation Details

Preparing for the DP-700 Implementing Data Engineering Solutions Using Microsoft Fabric exam and don’t know where to start? This post is the DP-700 certification study guide, with links to study resources for each exam objective.

I have curated a list of articles from Microsoft documentation for each objective of the DP-700 exam. Please share this post within your circles so it can help others prepare for the exam.

Exam Voucher for DP-700 with 1 Retake

Get 40% OFF with the combo

DP-700 Microsoft Fabric Data Engineer Practice Test

Udemy Practice Tests: Microsoft Fabric Data Engineer Associate tests
Amazon e-book (PDF): Microsoft Fabric Data Engineer Practice Test
Whizlabs: Microsoft Certified Fabric Data Engineer Associate

DP-700 Microsoft Fabric Data Engineer Online Course

Udemy: Microsoft prep for Fabric Data Engineer Associate

Implement and manage an analytics solution (30–35%)

Configure Microsoft Fabric workspace settings

Configure Spark workspace settings

Workspace administration settings in Microsoft Fabric

Workspace Settings in Microsoft Fabric

Configure domain workspace settings

Domains – Microsoft Fabric | Microsoft Learn

Domain management tenant settings

Configure OneLake workspace settings

OneLake tenant settings – Microsoft Fabric

Configure data workflow workspace settings

Create and use Dataflows (Gen2) in Microsoft Fabric

Implement lifecycle management in Fabric

Configure version control

Overview of Fabric Git integration

Implement version control and Git integration

Implement database projects

Database Projects for Data Warehouses

SQL database tutorial – Introduction

Create and configure deployment pipelines

Overview of Fabric deployment pipelines

Get started using deployment pipelines

Configure security and governance

Implement workspace-level access controls

Capacities, Workspaces, and Access Control in Microsoft Fabric

Give users access to workspaces – Microsoft Fabric

Roles in workspaces in Microsoft Fabric

Implement item-level access controls

Permission model – Microsoft Fabric

Item permissions for Security & Governance in Microsoft Fabric

Implement row-level, column-level, object-level, and file-level access controls

Row-level security in Fabric data warehousing

Implement row-level security in Microsoft Fabric data warehousing

Column-level security in Fabric data warehousing

Implement column-level security in Fabric data warehousing

Security for data warehousing

Object Level Permissions for Security & Governance in Fabric

Implement dynamic data masking

Dynamic data masking in Fabric Data Warehouse

How to implement dynamic data masking in Fabric Data Warehouse

Apply sensitivity labels to items

Apply sensitivity labels to Fabric items

Enable sensitivity labels in Fabric and Power BI

Endorse items

Endorsement overview

Endorse Fabric and Power BI items

Orchestrate processes

Choose between a pipeline and a notebook

Data Pipeline vs Notebook in Microsoft Fabric

Performance issues running pipelines and notebooks

Design and implement schedules and event-based triggers

Scheduling & Monitoring Data Pipelines

Implement orchestration patterns with notebooks and pipelines, including parameters and dynamic expressions

Use parameters in Real-Time Dashboards

Parameters – Microsoft Fabric

Microsoft Fabric Data Pipeline Expression

Expressions and functions

Ingest and transform data (30–35%)

Design and implement loading patterns

Design and implement full and incremental data loads

Explore data load strategies

Incrementally load data from the Data Warehouse to Lakehouse
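The incremental-load articles above center on a watermark pattern: remember the high-water mark (for example, a last-modified timestamp) from the previous successful run and pull only rows changed since then. Here is a minimal plain-Python sketch of that logic; the data, field names, and `incremental_load` helper are hypothetical, not Fabric APIs, and in Fabric the watermark would be persisted between pipeline runs.

```python
from datetime import datetime

# Hypothetical source rows; in a real solution these would come from a
# warehouse or lakehouse table, with the watermark stored between runs.
source_rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 2, 1)},
    {"id": 3, "modified": datetime(2024, 3, 1)},
]

def incremental_load(rows, watermark):
    """Return only rows changed since the last successful load,
    plus the new watermark to persist for the next run."""
    delta = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in delta), default=watermark)
    return delta, new_watermark

delta, wm = incremental_load(source_rows, datetime(2024, 1, 15))
# delta contains ids 2 and 3; wm advances to 2024-03-01
```

A full load is the degenerate case of the same function with the watermark set to the minimum value, which is why many pipelines implement both patterns with one parameterized activity.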

Prepare data for loading into a dimensional model

Dimensional modeling in Microsoft Fabric Warehouse

Load tables in a dimensional model

Design and implement a loading pattern for streaming data

Get and transform events from streaming sources

Streaming data into Lakehouse

Ingest and transform batch data

Choose an appropriate data store

Ingesting data into the warehouse

Choose between dataflows, notebooks, and T-SQL for data transformation

Solved: When to use the T-SQL notebook

Transforming data | PySpark, T-SQL & Dataflows in Microsoft Fabric

Create and manage shortcuts to data

Unify data sources with OneLake shortcuts

Create a OneLake shortcut – Microsoft Fabric

Implement mirroring

Mirroring – Microsoft Fabric

Configure Microsoft Fabric Mirrored Databases From Azure SQL Database

Ingest data by using pipelines

Ingest data into your Warehouse using data pipelines

Transform data by using PySpark, SQL, and KQL

Transform data with Apache Spark and query with SQL

Using Fabric notebooks (pySpark) to clean and transform real-world JSON data

Mastering End-To-End KQL Transformations In Microsoft Fabric

Transform data in a KQL Database
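Whichever engine you choose (PySpark, T-SQL, or KQL), the cleaning steps are the same: parse, trim, and coerce types with sensible defaults. The sketch below shows that logic in plain Python on hypothetical JSON payloads; it is an illustration of the transformation, not the PySpark or KQL syntax itself.

```python
import json

# Hypothetical raw event payloads with inconsistent fields, standing in
# for the semi-structured JSON a Fabric notebook might clean at scale.
raw = [
    '{"user": "alice", "amount": "42.50", "ts": "2024-01-01"}',
    '{"user": "bob", "amount": null, "ts": "2024-01-02"}',
    '{"user": "  carol ", "amount": "7", "ts": "2024-01-03"}',
]

def clean(record: str) -> dict:
    """Parse one JSON string, trim whitespace, and coerce amount to float
    (defaulting missing values to 0.0)."""
    row = json.loads(record)
    return {
        "user": row["user"].strip(),
        "amount": float(row["amount"]) if row["amount"] is not None else 0.0,
        "ts": row["ts"],
    }

cleaned = [clean(r) for r in raw]
```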

Denormalize data

Denormalizing a Snowflake Schema into a Star Schema

Optimize Data Model by using Denormalization
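Denormalizing a snowflake schema means folding a lookup table's attributes into the dimension that references it, trading storage for fewer joins at query time. A minimal sketch of that merge, with hypothetical product and category tables (in practice this would be a join in T-SQL or PySpark):

```python
# Hypothetical snowflake-style tables: the product dimension references a
# separate category lookup. Denormalizing folds category attributes into
# the product dimension so the star schema needs one join fewer.
categories = {10: {"category_name": "Bikes"}, 20: {"category_name": "Helmets"}}
products = [
    {"product_id": 1, "name": "Road-150", "category_id": 10},
    {"product_id": 2, "name": "Sport-100", "category_id": 20},
]

def denormalize(products, categories):
    """Merge the category lookup's columns into each product row."""
    return [{**p, **categories[p["category_id"]]} for p in products]

dim_product = denormalize(products, categories)
```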

Group and aggregate data

Create GroupBy Aggregation using Fabric Data Warehouse

Aggregate transformation in mapping data flow

Handle duplicate, missing, and late-arriving data

Handle missing data

Latency and accuracy considerations in Activator rules
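A common way to handle duplicates and late-arriving updates in one pass is to keep only the latest version of each record per business key. The sketch below shows that idea with hypothetical change events and a made-up `version` field; in Fabric this would typically be a `MERGE` or a window-function dedup rather than plain Python.

```python
# Hypothetical change events where the same business key may arrive more
# than once, possibly out of order. Keeping the highest-version record per
# key handles duplicates and late-arriving updates together.
events = [
    {"key": "A", "version": 1, "value": 100},
    {"key": "A", "version": 3, "value": 120},  # late-arriving correction
    {"key": "B", "version": 1, "value": 50},
    {"key": "A", "version": 2, "value": 110},  # duplicate / out-of-order
]

def latest_per_key(events):
    """Collapse events to the highest-version record per key."""
    best = {}
    for e in events:
        if e["key"] not in best or e["version"] > best[e["key"]]["version"]:
            best[e["key"]] = e
    return best

state = latest_per_key(events)
# state["A"] keeps version 3 even though it arrived before version 2
```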

Ingest and transform streaming data

Choose an appropriate streaming engine

Processing stream data with Microsoft Fabric Event Streams

Process data by using eventstreams

Microsoft Fabric event streams overview

Create an eventstream in Microsoft Fabric

Process data by using Spark structured streaming

Streaming data into lakehouse

Get started with streaming data in the lakehouse

Process data by using KQL

Microsoft Fabric Real-Time Analytics with KQL

Use KQL effectively

Create windowing functions

Window Functions in Fabric Eventstream

Window functions – Kusto
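The simplest windowing function to reason about is the tumbling window: each event belongs to exactly one fixed-size, non-overlapping bucket, and you aggregate per bucket. Here is a plain-Python sketch of that bucketing logic with hypothetical epoch-second events; the actual exam topics use Eventstream operators or KQL's `bin()` rather than this code.

```python
# Hypothetical event stream with epoch-second timestamps. A tumbling
# window assigns each event to one fixed-size bucket, then aggregates.
events = [
    {"ts": 3, "value": 1},
    {"ts": 7, "value": 2},
    {"ts": 12, "value": 5},
    {"ts": 19, "value": 3},
]

def tumbling_window_sum(events, size):
    """Sum values per window of `size` seconds (window start -> total)."""
    windows = {}
    for e in events:
        start = (e["ts"] // size) * size  # floor to the window boundary
        windows[start] = windows.get(start, 0) + e["value"]
    return windows

totals = tumbling_window_sum(events, 10)
# totals == {0: 3, 10: 8}
```

Hopping and sliding windows differ only in that one event can land in several overlapping buckets, which is why they are costlier than tumbling windows.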

Monitor and optimize an analytics solution (30–35%)

Monitor Fabric items

Monitor data ingestion

Ingestion results logs

Monitor activities in Microsoft Fabric

Monitor data transformation

How to monitor pipeline runs

Monitoring the status and performance of an Eventstream item

Monitor semantic model refresh

Semantic model refresh activity in Data Factory for Microsoft Fabric

Four ways to monitor a semantic model refresh in Fabric

Configure alerts

Set alerts based on Fabric events in Real-Time hub

Create Activator alerts from a Real-Time Dashboard

Identify and resolve errors

Identify and resolve pipeline errors

Pipeline failure and error message

Identify and resolve dataflow errors

Solved: Dataflow Gen 2 fail error message

Error handling in dataflow

Dataflow (gen2) error: Couldn’t refresh

Identify and resolve notebook errors

Monitoring Spark (errors and cost) in Microsoft Fabric

Identify and resolve eventhouse errors

Manage and monitor an Eventhouse

Identify and resolve eventstream errors

Monitoring the status and performance of an Eventstream item

Identify and resolve T-SQL errors

Fix SQL Analytics Endpoint Sync Issues in Microsoft Fabric

Optimize performance

Optimize a lakehouse table

Delta Lake table optimization and V-Order

Delta table maintenance in Microsoft Fabric

Optimize a pipeline

Designing and Implementing Dynamic Data Pipelines

Optimize a data warehouse

Warehouse performance guidelines

Optimize eventstreams and eventhouses

Eventhouse monitoring overview

Optimize Spark performance

10x Spark performance improvement in Microsoft Fabric

Optimize query performance

Query insights – Microsoft Fabric

This brings us to the end of the DP-700 Implementing Data Engineering Solutions Using Microsoft Fabric Study Guide.

What do you think? Let me know in the comments section if I have missed anything. Also, I’d love to hear how your preparation is going!

In case you are preparing for other Azure certification exams, check out the Azure study guide for those exams.

Follow Me to Receive Updates on the DP-700 Exam


Want to be notified as soon as I post? Subscribe to the RSS feed or leave your email address in the subscribe section. Share the article on your social networks using the links below so it can benefit others.

Share the DP-700 Study Guide in Your Network
