GCP Professional Data Engineer Practice Question

Your Cloud Composer DAG has three sequential tasks: create a Dataproc cluster, run a PySpark job on it, and delete the cluster. In testing, when the PySpark job fails, the deletion step is skipped, leaving the cluster running and incurring cost. To ensure the DataprocDeleteClusterOperator always runs after the PySpark task finishes, regardless of success, failure, or skip, which Airflow trigger_rule should you set on the delete-cluster task?

  • all_failed

  • one_success

  • all_success

  • all_done
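
The behavior the question describes matches all_done: in Airflow, a task with trigger_rule="all_done" runs once every upstream task has finished, whatever its final state (success, failure, or skipped), whereas the default all_success skips the task when an upstream task fails. Below is a minimal sketch of the pattern, assuming Airflow 2.4+ with the Google provider installed; the project ID, region, cluster name, machine types, and GCS path are placeholder values, not part of the question.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateClusterOperator,
    DataprocDeleteClusterOperator,
    DataprocSubmitJobOperator,
)
from airflow.utils.trigger_rule import TriggerRule

# Placeholder identifiers for illustration only.
PROJECT_ID = "my-project"
REGION = "us-central1"
CLUSTER_NAME = "ephemeral-pyspark-cluster"

# Minimal cluster shape; size it for your workload.
CLUSTER_CONFIG = {
    "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-2"},
    "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
}

PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/etl.py"},
}

with DAG(
    dag_id="dataproc_ephemeral_cluster",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    create_cluster = DataprocCreateClusterOperator(
        task_id="create_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
        cluster_config=CLUSTER_CONFIG,
    )

    run_pyspark = DataprocSubmitJobOperator(
        task_id="run_pyspark",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )

    # all_done fires once every upstream task has finished, whatever its
    # final state, so the cluster is torn down even when the PySpark job
    # fails; the default all_success would skip this task on failure.
    delete_cluster = DataprocDeleteClusterOperator(
        task_id="delete_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
        trigger_rule=TriggerRule.ALL_DONE,
    )

    create_cluster >> run_pyspark >> delete_cluster
```

Note that all_done still requires upstream tasks to reach a terminal state; it does not run the task if upstream tasks never execute at all, which is exactly the cleanup semantics wanted here.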
