GCP Professional Data Engineer Practice Question

You are planning the migration of a 40-TB Oracle 11g on-premises transactional database to Cloud SQL for PostgreSQL. The application is revenue-critical and may experience no more than 20 minutes of downtime at cut-over. Business owners also mandate a 15-minute recovery-point objective (RPO) and a 2-hour recovery-time objective (RTO). A Dedicated Interconnect circuit to Google Cloud is already in place. After migration, the data engineering team must execute scripted row-count and checksum queries to confirm that the target contains an identical data set before users are redirected.

Which end-to-end approach best meets all stated constraints while following Google-recommended phases for infrastructure preparation, data transfer, validation, and final cut-over?

    1. 1) Provision a regional Cloud Spanner instance; 2) use BigQuery Data Transfer Service to load Oracle export files into BigQuery and then into Spanner; 3) validate with Dataform assertions; 4) reconfigure the application to use Spanner.
    2. 1) Create a highly available Cloud SQL for PostgreSQL instance; 2) use Database Migration Service in continuous-migration mode to perform the initial load from Oracle and begin change-data-capture replication over Dedicated Interconnect; 3) monitor replication lag until it is under 15 minutes, then stop application writes; 4) run the scripted row-count and checksum queries on both databases, verify parity, and execute DMS cut-over to promote the Cloud SQL instance; 5) update application connection strings and decommission the DMS job.
    3. 1) Order a Transfer Appliance, copy the Oracle data files to it, and ship it to Google; 2) restore the files into a single-node Cloud SQL for PostgreSQL instance; 3) run validation queries; 4) switch traffic after shutting down the on-premises database.
    4. 1) Perform an Oracle Data Pump export while the application remains online; 2) transfer the dump files to Cloud Storage via gsutil over Interconnect; 3) import the data into Cloud SQL for PostgreSQL using pgloader; 4) run validation queries and point applications to Cloud SQL.
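The scripted row-count and checksum validation the question requires can be sketched as a simple parity check. This is a minimal, hypothetical example: it assumes you have already run count and checksum queries on both databases (the PostgreSQL-style SQL in the comments is illustrative only; the Oracle side would need equivalent queries) and collected the results per table. All table names and result values are invented for illustration.

```python
# Hypothetical post-migration parity check, run after replication lag
# reaches zero and application writes have been stopped.
#
# Assumes results were gathered on each side with queries such as
# (PostgreSQL-style, illustrative only):
#   SELECT COUNT(*) FROM orders;
#   SELECT MD5(STRING_AGG(id::text || amount::text, ',' ORDER BY id)) FROM orders;
# and collected into {table_name: (row_count, checksum)} dicts.

def verify_parity(source_results, target_results):
    """Return the list of tables whose row count or checksum differs."""
    mismatches = []
    for table, expected in source_results.items():
        # A missing table or any count/checksum difference is a failure.
        if target_results.get(table) != expected:
            mismatches.append(table)
    # Tables present only in the target are also a parity failure.
    mismatches.extend(t for t in target_results if t not in source_results)
    return mismatches
```

Cut-over would proceed only if `verify_parity` returns an empty list, e.g. `verify_parity({"orders": (40_000, "9f2c")}, {"orders": (40_000, "9f2c")})` yields `[]`.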
Exam domain: Designing data processing systems