GCP Professional Data Engineer Practice Question

A fintech startup runs a mission-critical PostgreSQL 11 database on-premises that processes roughly 3,000 OLTP transactions per second and stores 3 TB of data. Compliance rules demand fully ACID-compliant transactions, and leadership wants a managed service with at least a 99.95% availability SLA in one Google Cloud region. Analysts also need near-real-time read access from a second region and can tolerate a few seconds of replication lag. The workload is expected to stay below 5 TB and 10,000 TPS for the next three years. Which Google Cloud design best meets these requirements while minimizing operational overhead?

  • Create a Cloud SQL for PostgreSQL instance with high-availability (regional) configuration in the primary region and add an asynchronous cross-region read replica for the analysts.

  • Migrate to a multi-region Cloud Spanner instance to provide global strong consistency for both transactional and analytic workloads.

  • Load the data into BigQuery and use BI Engine to serve both transactional and analytic queries from a multi-regional dataset.

  • Deploy PostgreSQL in a GKE StatefulSet spread across zones and use logical replication to another regional GKE cluster for reads.

Exam: GCP Professional Data Engineer
Objective: Designing data processing systems
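
For context on what the Cloud SQL option describes, the sketch below shows one way its topology (a regional high-availability primary plus an asynchronous cross-region read replica) could be provisioned through the Cloud SQL Admin API with the google-api-python-client library. The project ID, instance names, regions, and machine tiers are hypothetical placeholders, and the snippet assumes Application Default Credentials are already configured.

```python
"""Minimal sketch (illustrative only): create a regional-HA Cloud SQL for
PostgreSQL primary and an asynchronous cross-region read replica via the
Cloud SQL Admin API. All names, regions, and tiers are hypothetical."""
import time

from googleapiclient import discovery

PROJECT_ID = "fintech-prod-example"   # hypothetical project
PRIMARY_REGION = "us-central1"        # hypothetical transactional region
REPLICA_REGION = "europe-west1"       # hypothetical analyst region

sqladmin = discovery.build("sqladmin", "v1beta4")


def wait_for(operation_name: str) -> None:
    """Poll a Cloud SQL Admin long-running operation until it completes."""
    while True:
        op = sqladmin.operations().get(
            project=PROJECT_ID, operation=operation_name
        ).execute()
        if op.get("status") == "DONE":
            return
        time.sleep(10)


# Primary: REGIONAL availability type = synchronous standby in a second zone.
primary_body = {
    "name": "oltp-primary",
    "databaseVersion": "POSTGRES_11",
    "region": PRIMARY_REGION,
    "settings": {
        "tier": "db-custom-16-61440",    # 16 vCPU / 60 GB RAM (illustrative sizing)
        "availabilityType": "REGIONAL",  # high-availability (regional) configuration
        "dataDiskType": "PD_SSD",
        "dataDiskSizeGb": "3072",        # 3 TB today, with headroom toward 5 TB
    },
}
op = sqladmin.instances().insert(project=PROJECT_ID, body=primary_body).execute()
wait_for(op["name"])  # the primary must be RUNNABLE before a replica can be added

# Cross-region read replica: asynchronous replication for read-only analytics.
replica_body = {
    "name": "analytics-replica",
    "masterInstanceName": "oltp-primary",
    "region": REPLICA_REGION,
    "settings": {
        "tier": "db-custom-8-30720",
        "availabilityType": "ZONAL",
    },
}
sqladmin.instances().insert(project=PROJECT_ID, body=replica_body).execute()
```

In this sketch, the REGIONAL availability type is the configuration covered by Cloud SQL's high-availability SLA, while the cross-region replica replicates asynchronously, which lines up with a read workload that can tolerate a few seconds of lag.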