Your company is building a nightly Cloud Composer workflow that must (1) launch a Dataflow Flex Template job to load CSV files from Cloud Storage into a staging table, (2) execute a BigQuery SQL transformation only when the load succeeds, and (3) send an alert and prevent all downstream tasks from running if the load fails, without requiring manual cleanup before the next scheduled run. Which DAG design best satisfies these requirements?
Insert a BranchPythonOperator after the Dataflow task that pushes XCom to decide whether to run the BigQuery task; schedule a manual DAG run if the branch chooses the failure path.
Wrap the three steps inside a SubDagOperator and set the SubDAG's trigger_rule to "all_success"; configure no additional callbacks and clear failed tasks manually before the next run.
Define a DataflowTemplateOperator followed by a BigQueryInsertJobOperator; set trigger_rule="all_success" on the BigQuery task, add an on_failure_callback to the Dataflow task that calls the PagerDuty API, and leave default scheduling so the next DAG run starts normally.
Use trigger_rule="all_done" on the BigQueryInsertJobOperator so it always executes; rely on an SLA miss notification to detect failures in the Dataflow task.
Using Cloud Composer (Apache Airflow), the most reliable way to enforce "run-only-on-success" semantics is to chain the Dataflow load task directly to the BigQuery transformation task and set the transformation's trigger_rule to "all_success" (which is also Airflow's default). If the DataflowTemplateOperator fails, the trigger rule prevents the BigQueryInsertJobOperator from executing; within that run, the downstream task is simply marked "upstream_failed". Attaching an on_failure_callback to the Dataflow task lets you invoke a PagerDuty (or comparable) alert the moment the load fails. Because each scheduled DAG run is independent, the failed run requires no cleanup, and the next scheduled run starts cleanly without manual intervention. Branching or using trigger_rule="all_done" would still allow downstream tasks to run after a failure, and wrapping the logic in a SubDagOperator does not by itself satisfy the skip-on-failure and auto-alert requirements.
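As a rough illustration of why the trigger rule matters here, the decision the scheduler makes for the BigQuery task can be sketched in plain Python. This is a deliberate simplification for teaching purposes, not Airflow's actual scheduler code; the function name `may_run` and the state strings are chosen for this sketch:

```python
# Simplified sketch (NOT Airflow's real scheduler implementation) of how
# "all_success" vs. "all_done" decide whether a downstream task may run,
# given the terminal states of its upstream tasks.

def may_run(trigger_rule: str, upstream_states: list[str]) -> bool:
    """Return True if a task with this trigger_rule is allowed to execute."""
    if trigger_rule == "all_success":
        # Default rule: every upstream task must have succeeded.
        return all(state == "success" for state in upstream_states)
    if trigger_rule == "all_done":
        # Runs once every upstream task reached a terminal state,
        # even if some of them failed.
        return all(state in ("success", "failed", "skipped")
                   for state in upstream_states)
    raise ValueError(f"unhandled trigger_rule: {trigger_rule}")

# Scenario from the question: the Dataflow load task failed.
print(may_run("all_success", ["failed"]))  # False: BigQuery task is blocked
print(may_run("all_done", ["failed"]))     # True: BigQuery task would still run
```

This mirrors the reasoning above: with "all_success", a failed Dataflow load blocks the transformation automatically, while "all_done" would let it run against an incomplete staging table.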
GCP Professional Data Engineer
Ingesting and processing the data