Microsoft Fabric Data Engineer Associate DP-700 Practice Question

You manage a Microsoft Fabric workspace that includes a Lakehouse for enterprise reporting. IoT devices publish JSON-encoded telemetry to an Azure Event Hubs namespace at about 5,000 events per second. You need a streaming ingestion pattern that:

  • lands raw events in near real time in a Bronze Delta table in the Lakehouse,
  • automatically adapts to evolving JSON schemas without manual column mapping, and
  • guarantees at-least-once delivery even after service restarts.

Which Fabric component should you use?

  • Ingest the data into a Real-Time Analytics KQL database and configure continuous export to a Lakehouse Delta table.
  • Enable mirroring on the Event Hubs namespace and surface the mirrored data as a Lakehouse table.
  • Create an Eventstream that reads from the Event Hubs namespace and writes directly to a Delta table in the Lakehouse.
  • Build a Fabric pipeline with a Copy activity that polls Event Hubs every minute and writes to the Lakehouse.
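To make the second requirement concrete, here is a minimal, illustrative Python sketch of what "adapts to evolving JSON schemas without manual column mapping" means for a Bronze landing table: as new fields appear in incoming telemetry, the column set widens automatically and earlier rows simply lack the new columns. The event payloads and the `merge_schema` helper are hypothetical; this is not the internal mechanism of any specific Fabric component.

```python
import json

def merge_schema(schema, event):
    """Union the known column set with keys from a new JSON event,
    recording a type name for each newly seen column."""
    merged = dict(schema)
    for key, value in event.items():
        merged.setdefault(key, type(value).__name__)
    return merged

# Simulated telemetry stream: the second event introduces a new field.
events = [
    '{"device": "a1", "temp": 21.5}',
    '{"device": "a2", "temp": 19.0, "humidity": 48}',
]

schema = {}
rows = []
for raw in events:
    event = json.loads(raw)
    schema = merge_schema(schema, event)          # schema widens as fields appear
    rows.append({col: event.get(col) for col in schema})
```

After processing, `schema` covers `device`, `temp`, and `humidity` even though `humidity` only appeared mid-stream — the behavior the scenario asks the chosen component to provide without manual column mapping.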

Objective: Ingest and transform data