Microsoft Fabric Data Engineer Associate DP-700 Practice Question
You have created a notebook in a Microsoft Fabric workspace that accepts two parameters and transforms raw JSON files into a curated Delta table. You must ensure that the notebook runs automatically whenever a new JSON file is written to the raw container of an Azure Data Lake Storage Gen2 account, and only after the file has been copied to OneLake. You want to design the solution by using Fabric features while writing as little custom code as possible. Which approach should you use?
Create a SQL job in the Lakehouse that listens for CREATE FILE events and, when triggered, uses dynamic SQL to call the notebook through the Fabric REST API.
Develop a Dataflow Gen2 that copies data from the raw container to OneLake and adds a script step at the end to invoke the notebook.
Create a Data Factory pipeline that uses an Azure Storage event trigger, adds a Copy activity to move the file to OneLake, and then calls the parameterized notebook in a subsequent Notebook activity.
Configure a scheduled run for the notebook that executes every five minutes and add code to the notebook to poll the raw container for new files and copy them before processing.
Event-based automation in Microsoft Fabric is provided by Data Factory pipelines, not by notebook schedules. A pipeline can be configured with a storage event trigger that fires when a blob is created in a specified ADLS Gen2 path. Inside the pipeline, a Copy activity moves the new file into OneLake, and a subsequent Notebook activity runs the existing notebook, passing the two required values as base parameters. Notebook schedules run only on a fixed time basis and cannot listen for storage events, a Dataflow Gen2 cannot invoke a notebook after its copy finishes, and Fabric has no SQL job that listens for file-creation events. Therefore, a pipeline with a storage event trigger best meets the requirements while minimizing custom code.
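For context, here is a minimal sketch of what such a parameterized notebook could contain. The parameter names (raw_json_path, target_table) and default values are hypothetical, not taken from the question; in a Fabric notebook, the values assigned in the cell tagged as the parameter cell are overridden at run time by the base parameters supplied from the pipeline's Notebook activity.

```python
# Parameter cell (tagged as the "parameters" cell in the Fabric notebook).
# The pipeline's Notebook activity can override these defaults via base parameters.
# Names and defaults below are illustrative only.
raw_json_path = "Files/raw/sample.json"   # path of the JSON file already copied into OneLake
target_table = "curated_events"           # name of the curated Delta table

# Transformation cell: read the raw JSON and append it to a managed Delta table
# in the lakehouse attached to the notebook. `spark` is the SparkSession that
# Fabric provides in every notebook session.
from pyspark.sql import functions as F

df = (
    spark.read
    .option("multiLine", "true")
    .json(raw_json_path)
)

curated = df.withColumn("ingested_at", F.current_timestamp())

(
    curated.write
    .format("delta")
    .mode("append")
    .saveAsTable(target_table)
)
```

In the pipeline, the Notebook activity's base parameters would supply the actual file path and table name for each run, for example by mapping values produced by the storage event trigger (such as the created file's path) into those parameters.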
Exam: Microsoft Fabric Data Engineer Associate (DP-700). Domain: Implement and manage an analytics solution.