GCP Professional Data Engineer Practice Question

Your company runs a Cloud Composer environment (Apache Airflow 2). A BigQueryOperator task occasionally returns transient 500 errors that clear within a few minutes. Operations wants the pipeline to recover automatically by retrying the task up to three times, waiting five minutes before the first retry, and increasing the waiting period after each subsequent failure to reduce load on BigQuery. Which set of Airflow task arguments in the Python task definition best meets these requirements? (A sketch of the retry arguments follows the options.)

  • Set retries=0 and retry_delay=timedelta(minutes=5) so the task fails fast and the DAG run can restart automatically.

  • Enable depends_on_past=True, limit max_active_runs to 1, and rely on the default retry behavior.

  • Set retries=3, retry_delay=timedelta(minutes=5), and retry_exponential_backoff=True on the task.

  • Configure an on_failure_callback that triggers a new DAG run and set trigger_rule=all_failed instead of using retries.
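
For illustration, here is a minimal sketch of how the retries, retry_delay, and retry_exponential_backoff arguments from the third option would appear in a task definition. It assumes the Airflow 2 Google provider's BigQueryInsertJobOperator (the modern replacement for BigQueryOperator); the DAG id, task id, and query are hypothetical placeholders.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="bq_retry_example",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,           # run manually for this example
    catchup=False,
) as dag:
    query_job = BigQueryInsertJobOperator(
        task_id="query_job",          # hypothetical task id
        configuration={
            "query": {
                "query": "SELECT 1",  # placeholder query
                "useLegacySql": False,
            }
        },
        retries=3,                              # up to three retry attempts
        retry_delay=timedelta(minutes=5),       # base wait before the first retry
        retry_exponential_backoff=True,         # lengthen the wait after each failure
        max_retry_delay=timedelta(minutes=30),  # optional cap on the backoff
    )
```

These are standard BaseOperator arguments, so they work on any task, not just BigQuery operators. With retry_exponential_backoff=True, Airflow grows the delay between successive attempts from the retry_delay base (bounded by max_retry_delay if set), which covers both the five-minute initial wait and the increasing back-off that eases pressure on BigQuery during a transient outage.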
