
Microsoft Fabric Data Engineer Associate DP-700 Practice Question

Your Fabric Lakehouse ingests IoT telemetry into the DeviceReadings table every minute through an ingestion pipeline. Finance must receive a daily CSV summary, saved to fin_exports/YYYY/MM/dd/, after the previous day's load completes. The job must start automatically, retry once on failure, and require minimal custom code. You have written a notebook that performs the aggregation and export (a sketch of such a notebook appears after the answer choices). Which Fabric feature should you use to operationalize this requirement?

  • Publish the notebook as a Spark job definition and invoke it daily from an Azure Logic App.

  • Create a lakehouse shortcut to DeviceReadings and use a Power BI incremental refresh to export the CSV.

  • Create a Data Factory pipeline with a Notebook activity, add an event-based trigger that starts after the ingestion pipeline succeeds, and set one retry.

  • Attach a daily schedule directly to the notebook and rely on the notebook's built-in retry setting.
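
For reference, here is a minimal sketch of what the aggregation-and-export notebook in the scenario might look like. The column names (device_id, event_ts, reading_value) and the aggregations are hypothetical, since the question does not give the DeviceReadings schema, and the Files/ prefix assumes the notebook runs attached to the Lakehouse.

```python
from datetime import date, timedelta

from pyspark.sql import functions as F

# Summarize the previous day's telemetry. `spark` is the session
# that Fabric notebooks provide automatically.
yesterday = date.today() - timedelta(days=1)

daily_summary = (
    spark.read.table("DeviceReadings")
    # Hypothetical columns: the question does not give the schema.
    .where(F.to_date("event_ts") == F.lit(yesterday.isoformat()))
    .groupBy("device_id")
    .agg(
        F.count("*").alias("reading_count"),
        F.avg("reading_value").alias("avg_reading"),
        F.max("reading_value").alias("max_reading"),
    )
)

# Write a single CSV part file into the date-partitioned export
# folder, e.g. Files/fin_exports/2024/05/31/.
export_path = f"Files/fin_exports/{yesterday:%Y}/{yesterday:%m}/{yesterday:%d}/"
(
    daily_summary.coalesce(1)
    .write.mode("overwrite")
    .option("header", True)
    .csv(export_path)
)
```

Note that Spark writes a part-*.csv file inside the target folder rather than a single named file. Whichever orchestration option you choose, the notebook body stays the same; the options differ only in how it is triggered and retried.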

Exam domain: Implement and manage an analytics solution