Microsoft Fabric Data Engineer Associate DP-700 Practice Question

You ingest a daily Parquet file of customer data into a Delta table named customer_raw in a Microsoft Fabric lakehouse. You must populate a DimCustomer table as a Type 2 slowly changing dimension (SCD) that preserves historical versions of customer records, while avoiding full-table reloads. Which transformation pattern should you use to meet the requirements?

  • Execute a PySpark notebook that performs a Delta MERGE between customer_raw and DimCustomer to insert new customers and version changed records with new surrogate keys.

  • Configure a Dataflow Gen2 incremental refresh that truncates DimCustomer and appends the current day's customer records.

  • Run a T-SQL CREATE TABLE AS SELECT (CTAS) statement each day to overwrite DimCustomer with the latest snapshot from customer_raw.

  • Create a KQL materialized view over customer_raw and expose it directly as the DimCustomer dimension.
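For reference, a Delta MERGE that inserts new customers and versions changed records follows the pattern in the PySpark sketch below. It is illustrative only: the business columns (name, email), the bookkeeping columns (is_current, valid_from, valid_to), and the merge-key staging step are assumptions not stated in the scenario, and surrogate-key generation is omitted for brevity.

    # Minimal SCD Type 2 sketch with Delta MERGE (assumed column names).
    from pyspark.sql import SparkSession, functions as F
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    updates = spark.table("customer_raw")            # today's incremental load
    dim = DeltaTable.forName(spark, "DimCustomer")   # Type 2 dimension table

    # Customers whose tracked attributes changed versus the current dimension row.
    changed = (
        updates.alias("u")
        .join(dim.toDF().filter("is_current = true").alias("d"), "customer_id")
        .filter("u.name <> d.name OR u.email <> d.email")
        .select("u.*")
    )

    # Stage changed rows twice: once keyed (to expire the old version) and once
    # with a null merge key (so they hit the insert branch as a new version).
    staged = (
        updates.withColumn("merge_key", F.col("customer_id"))
        .unionByName(changed.withColumn("merge_key", F.lit(None)))
    )

    (
        dim.alias("d")
        .merge(staged.alias("s"), "d.customer_id = s.merge_key AND d.is_current = true")
        .whenMatchedUpdate(
            condition="s.name <> d.name OR s.email <> d.email",
            set={"is_current": "false", "valid_to": "current_date()"},
        )
        .whenNotMatchedInsert(
            values={
                "customer_id": "s.customer_id",
                "name": "s.name",
                "email": "s.email",
                "is_current": "true",
                "valid_from": "current_date()",
                "valid_to": "null",
            }
        )
        .execute()
    )

Because the MERGE touches only new and changed customer rows, the dimension is maintained incrementally rather than being rebuilt from a full snapshot each day.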
