Microsoft Fabric Data Engineer Associate DP-700 Practice Question

You are designing a streaming data load into a bronze Delta table in a Microsoft Fabric lakehouse. The data arrives at high volume in Azure Event Hubs, and downstream teams must see each event exactly once in the table, even if a network failure forces the streaming job to restart. Which approach should you implement to meet the requirement with minimal custom infrastructure?

  • Schedule a pipeline copy activity that writes Event Hubs data to a temporary blob container and loads it into the Delta table hourly.

  • Create a Real-Time Intelligence Eventstream that routes events to a KQL database connected to the lakehouse.

  • Use a Fabric notebook with Spark Structured Streaming to read from Event Hubs and write to the Delta table in append mode, storing checkpoints in the lakehouse (a sketch of this approach appears after the options).

  • Configure a Dataflow Gen2 with incremental refresh to pull data from Event Hubs into the lakehouse every minute.
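The Structured Streaming option satisfies the exactly-once requirement because the stream checkpoint records which Event Hubs offsets have been committed, and the Delta sink writes each micro-batch transactionally and idempotently; after a restart, batches that already committed are skipped rather than re-appended. Below is a minimal sketch of such a notebook cell, assuming the Event Hubs Kafka-compatible endpoint; the namespace, hub name, connection string, table name, and checkpoint path are illustrative placeholders, not real resources.

```python
# Minimal sketch (PySpark in a Fabric notebook). All resource names below
# are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # Fabric notebooks predefine `spark`

EH_NAMESPACE = "myeventhubns"       # assumption: your Event Hubs namespace
EH_NAME = "telemetry"               # assumption: your event hub (Kafka topic)
EH_CONN_STR = "Endpoint=sb://..."   # assumption: read this from a secret store

# Event Hubs exposes a Kafka-compatible endpoint on port 9093; authenticate
# over SASL/PLAIN with "$ConnectionString" as the username.
jaas = (
    "org.apache.kafka.common.security.plain.PlainLoginModule required "
    f'username="$ConnectionString" password="{EH_CONN_STR}";'
)

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", f"{EH_NAMESPACE}.servicebus.windows.net:9093")
    .option("subscribe", EH_NAME)
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config", jaas)
    .option("startingOffsets", "earliest")  # first run only; the checkpoint governs restarts
    .load()
)

# Bronze layer: keep the raw payload, just decode bytes to strings.
bronze = events.selectExpr(
    "CAST(key AS STRING) AS event_key",
    "CAST(value AS STRING) AS payload",
    "timestamp AS enqueued_time",
)

# The checkpoint ties source offsets to Delta commits, so a restarted job
# resumes from the last committed micro-batch instead of re-appending it.
query = (
    bronze.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "Files/checkpoints/bronze_events")  # assumption: lakehouse Files path
    .toTable("bronze_events")  # assumption: bronze table name in the attached lakehouse
)
query.awaitTermination()
```

The other approaches copy or poll in batches without coordinating source offsets with the table commit: an hourly copy that fails mid-load can duplicate rows on retry, the Eventstream route lands data in a KQL database rather than the Delta table, and Dataflow Gen2 incremental refresh is not a streaming consumer of Event Hubs.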

Objective: Ingest and transform data