GCP Professional Data Engineer Practice Question

Your team operates a streaming Dataflow pipeline that processes sales events 24/7. The job is launched with the flag --worker_zone=us-central1-a. A recent zonal outage in us-central1-a stopped all workers for several hours, breaching the SLA. You must change the launch configuration so the pipeline continues running if any single zone in the region fails, without maintaining a duplicate job or significantly increasing cost. Which action satisfies the requirement?

  • Containerize the pipeline and deploy it to Cloud Run in two regions behind Cloud Load Balancing.

  • Migrate the code to run on an HA Dataproc cluster with masters in three zones and submit the streaming job there.

  • Launch the job with --worker_region=us-central1 and remove the --worker_zone flag so Dataflow distributes workers across healthy zones in the region.

  • Keep the existing job and schedule a second, identical pipeline in us-central1-b; switch Pub/Sub subscriptions to whichever job is healthy.
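For reference, the regional-placement launch described in the third option can be sketched as follows. The script name, project, and bucket are hypothetical placeholders; the key change is replacing the zone pin with a region-level flag:

```shell
# Hypothetical launch of the streaming Beam job on Dataflow.
# Pinning --worker_zone=us-central1-a ties every worker to one zone;
# --worker_region=us-central1 instead lets the Dataflow service place
# workers in any available zone within the region.
python sales_pipeline.py \
  --runner=DataflowRunner \
  --project=my-project \
  --region=us-central1 \
  --worker_region=us-central1 \
  --streaming \
  --temp_location=gs://my-bucket/tmp
```

Note that `--worker_region` and `--worker_zone` are mutually exclusive; the zone flag must be removed, not merely supplemented, or the job launch will be rejected.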

Objective: Maintaining and automating data workloads