GCP Professional Data Engineer Practice Question

Your company ingests IoT events into a Pub/Sub topic, processes them with a streaming Dataflow job running in us-central1, and writes results to a BigQuery dataset in the same region. New disaster-recovery objectives require a recovery time objective (RTO) of 30 minutes if us-central1 becomes unavailable, a recovery point objective (RPO) of 5 minutes, and minimal ongoing operational effort. Which design change best meets these goals?
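For context on what the options below would have to fail over, the current pipeline is a standard streaming Beam job reading from Pub/Sub and writing to BigQuery. A minimal sketch, assuming hypothetical project, subscription, bucket, and table names:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# All resource names here are hypothetical placeholders.
options = PipelineOptions(
    streaming=True,
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-temp-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        # Read raw IoT events from the subscription; Pub/Sub retains
        # unacknowledged messages, which is what bounds the RPO.
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/iot-events")
        | "DecodePayload" >> beam.Map(lambda msg: {"payload": msg.decode("utf-8")})
        # Append results to an existing table (schema assumed to exist).
        | "WriteResults" >> beam.io.WriteToBigQuery(
            "my-project:iot_dataset.events",
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )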

  • Create BigQuery table snapshots every 5 minutes and replicate them to Cloud SQL in us-east1 using Database Migration Service; reconfigure dashboards to query Cloud SQL if us-central1 fails.

  • Schedule BigQuery copy jobs every 5 minutes to replicate tables to a dataset in us-east1 and manually switch dashboards to the replica during an outage.

  • Convert the BigQuery dataset to the US multi-region location, retain Pub/Sub for durable message backlog, and deploy the streaming pipeline as a Dataflow Flex Template so it can be quickly launched in another region when a regional outage is detected (a launch sketch follows the options).

  • Export processed data from BigQuery to Cloud Storage every 5 minutes and replicate the bucket to us-east1 with Cloud Storage Transfer Service, then load the files into BigQuery on failover.
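The Flex Template option hinges on being able to relaunch the same pipeline artifact in a surviving region while Pub/Sub holds the unacknowledged backlog. A minimal failover-launch sketch using the Dataflow v1b3 REST API via the Google API Python client; the project, template path, and parameter names are hypothetical:

from googleapiclient.discovery import build

# Hypothetical identifiers; substitute real project/template values.
PROJECT = "my-project"
FAILOVER_REGION = "us-east1"

dataflow = build("dataflow", "v1b3")
body = {
    "launchParameter": {
        "jobName": "iot-pipeline-failover",
        # GCS path to the Flex Template spec produced at build time.
        "containerSpecGcsPath": "gs://my-templates/iot-pipeline-flex.json",
        "parameters": {
            "input_subscription": "projects/my-project/subscriptions/iot-events",
            "output_table": "my-project:iot_dataset.events",
        },
    }
}
response = (
    dataflow.projects()
    .locations()
    .flexTemplates()
    .launch(projectId=PROJECT, location=FAILOVER_REGION, body=body)
    .execute()
)
print("Launched failover job:", response["job"]["id"])

Because the template's parameters are supplied at launch time, the same artifact can target us-east1 without being rebuilt, which is what keeps the RTO and the ongoing operational effort low.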
