Microsoft Fabric Data Engineer Associate DP-700 Practice Question

You must automate ingestion and cleansing of JSON files from an Azure Storage container into a Lakehouse table in Microsoft Fabric. The process will run hourly, execute PySpark code for data quality checks, and post a Microsoft Teams alert only if the PySpark step fails. Business analysts with limited coding skills need to adjust the trigger schedule without editing code. Which Fabric item should you build to implement the orchestration logic?

  • Schedule a single Spark notebook that performs the copy, transformation, and Teams notification entirely in code.

  • Configure a Spark job definition and rely on the job scheduler to run the notebook and send alerts.

  • Create a pipeline in the workspace and add Copy, Notebook, and Teams activities with an hourly trigger.

  • Build the solution as a Dataflow Gen2 with incremental refresh and an hourly refresh schedule.
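For context on the failure-alert requirement: in a pipeline-based design, the Notebook activity fails when the PySpark code raises an exception, and an "On failure" dependency can route that outcome to a Teams activity. A minimal sketch of the kind of data-quality check such a notebook might run (plain Python rather than PySpark for brevity; the function names, fields, and threshold are illustrative, not from any official sample):

```python
import json


def validate_records(records, required_fields):
    """Return (record, missing_fields) pairs for records that fail the check.

    Hypothetical rule: every required field must be present and non-null.
    """
    bad = []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            bad.append((rec, missing))
    return bad


def run_quality_check(raw_json_lines, required_fields, max_bad=0):
    """Parse JSON lines and raise if too many records fail validation.

    Raising an exception here fails the Notebook activity, which the
    pipeline's on-failure path can route to a Teams notification activity.
    """
    records = [json.loads(line) for line in raw_json_lines]
    bad = validate_records(records, required_fields)
    if len(bad) > max_bad:
        raise ValueError(f"{len(bad)} record(s) failed quality checks")
    return len(records)
```

In an actual Fabric notebook the same pattern would apply to a Spark DataFrame (e.g., counting rows with null required columns), but the orchestration principle is identical: an uncaught exception marks the activity as failed, so no alerting code needs to live in the notebook itself.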

Objective: Implement and manage an analytics solution