Microsoft Fabric Data Engineer Associate DP-700 Practice Question

You manage a Microsoft Fabric lakehouse that must ingest a real-time feed of 150,000 JSON events per second from Azure Event Hubs. The solution has the following requirements:

  • Land the events in Delta tables in the lakehouse bronze folder with sub-second latency.
  • Guarantee exactly-once delivery even if the ingestion job is retried or restarted.

Which ingestion approach should you implement to meet the requirements?

  • Build a data pipeline with a Copy activity that continuously copies data from Event Hubs to the lakehouse.

  • Create an Eventstream in Real-Time Intelligence and configure it to write directly to the lakehouse.

  • Enable continuous data import in a KQL database and expose the data to the lakehouse by creating a shortcut.

  • Develop a Spark Structured Streaming notebook that reads from Azure Event Hubs and writes to Delta tables in the lakehouse by using a checkpoint location.
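
For context on the last option, below is a minimal sketch of a Spark Structured Streaming notebook that reads from Azure Event Hubs and appends to a Delta table with a checkpoint location. It assumes the open-source azure-event-hubs-spark connector is attached to the Fabric Spark environment; the connection string, checkpoint path, and table name are placeholders, not values from the question.

    # Minimal sketch, assuming the azure-event-hubs-spark connector is
    # available. 'spark' and 'sc' are the session objects predefined in a
    # Fabric notebook; the connection string below is a placeholder.
    connection_string = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."

    eh_conf = {
        # The connector expects the connection string in encrypted form.
        "eventhubs.connectionString":
            sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string),
    }

    # Read the raw event stream; the payload arrives in the binary 'body' column.
    raw = spark.readStream.format("eventhubs").options(**eh_conf).load()
    events = raw.selectExpr("CAST(body AS STRING) AS json", "enqueuedTime")

    # Append to a Delta table, tracking progress in a checkpoint location.
    # The checkpoint plus Delta's transactional commits let a retried or
    # restarted job resume without duplicating already-committed micro-batches.
    query = (
        events.writeStream
            .format("delta")
            .option("checkpointLocation", "Files/bronze/_checkpoints/events")  # placeholder path
            .outputMode("append")
            .toTable("bronze_events")  # placeholder table name
    )

    query.awaitTermination()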

Exam objective: Ingest and transform data