Microsoft Fabric Data Engineer Associate DP-700 Practice Question

You manage a Microsoft Fabric workspace that contains a pipeline that copies incoming files from an Azure Data Lake Storage Gen2 container into a Lakehouse and then calls a notebook for transformation. You must ensure that the pipeline starts automatically as soon as a new *.parquet file is written to the container's raw/products folder, and that the full path of the file is passed to the notebook as a parameter. Which action should you take?

  • Add a schedule trigger that executes the pipeline every minute and supplies the folder path as a parameter.

  • Create an event trigger connected to the storage account that listens for the BlobCreated event on the raw/products/*.parquet path and maps the blob URL to a pipeline parameter.

  • Configure a manual trigger for the pipeline and enable the On file arrival option on the dataset that points to the folder.

  • Add a tumbling-window trigger that runs every five minutes and passes the detected file path to a pipeline parameter.
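For reference, a storage (blob) event trigger of the kind described in the second option is typically expressed as JSON similar to Azure Data Factory's BlobEventsTrigger definition, which Fabric's storage event triggers mirror conceptually. The sketch below is illustrative only: the trigger name, the pipeline name `CopyProductsPipeline`, the parameter name `sourceFilePath`, and the `<container-name>` placeholder are all hypothetical, and the exact schema in Fabric may differ from this ADF-style form.

```json
{
  "name": "OnParquetArrival",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/<container-name>/blobs/raw/products/",
      "blobPathEndsWith": ".parquet",
      "events": [ "Microsoft.Storage.BlobCreated" ]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopyProductsPipeline",
          "type": "PipelineReference"
        },
        "parameters": {
          "sourceFilePath": "@concat(triggerBody().folderPath, '/', triggerBody().fileName)"
        }
      }
    ]
  }
}
```

The key idea the question tests is the combination of filtering on the BlobCreated event for the raw/products path with the `.parquet` suffix, and mapping the trigger's `folderPath`/`fileName` outputs to a pipeline parameter that the pipeline can then hand to the notebook.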

Objective: Implement and manage an analytics solution