GCP Professional Data Engineer Practice Question

Your team runs a nightly Dataflow ETL job that reads transactional data from a Cloud SQL for PostgreSQL instance and writes aggregates back into the same database. Last month an outage in the instance's zone caused the job to fail, delaying reporting. You must harden the database against a single-zone failure while minimizing application changes and operational overhead. Which approach best meets these requirements?

  • Create a cross-region read replica and direct Dataflow to use the replica; Cloud SQL will automatically promote it and redirect traffic during a failure.

  • Provision a second single-zone Cloud SQL instance in a different zone, configure logical replication, and switch the job's connection string to this replica during an outage.

  • Export the database to Cloud Storage each night and have the Dataflow job read from the export if the primary instance is unavailable.

  • Convert the database to a Cloud SQL regional high-availability configuration so a synchronous standby in another zone can take over automatically while keeping the same private IP.
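For reference, the regional high-availability option described above is a configuration change on the existing instance rather than an application change. A minimal sketch of that change via the Cloud SQL Admin API is shown below, assuming the google-api-python-client library, Application Default Credentials, and a placeholder instance name (transactions-pg); this is illustrative, not a definitive implementation.

```python
# Sketch: convert an existing Cloud SQL instance to the regional
# high-availability configuration by patching its availability type.
from googleapiclient import discovery
import google.auth

# Application Default Credentials; assumes the environment is already
# authenticated (e.g. via `gcloud auth application-default login`).
credentials, project_id = google.auth.default()
sqladmin = discovery.build("sqladmin", "v1", credentials=credentials)

# Patch only the availability type; other instance settings are untouched.
# REGIONAL provisions a synchronous standby in another zone of the same
# region behind the instance's existing IP, so the Dataflow job's
# connection string does not need to change.
body = {"settings": {"availabilityType": "REGIONAL"}}

operation = (
    sqladmin.instances()
    .patch(project=project_id, instance="transactions-pg", body=body)  # placeholder instance name
    .execute()
)
print("Cloud SQL operation started:", operation["name"])
```

The same change can typically be made with a single CLI call, `gcloud sql instances patch INSTANCE --availability-type=REGIONAL`, or from the console; the patch may trigger a brief restart, so it is usually scheduled outside the nightly ETL window.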

Objective: Maintaining and automating data workloads