
GCP Professional Data Engineer Practice Question

Your company ingests hundreds of millions of payment events per day into a partitioned BigQuery table through Dataflow for near-real-time analytics. A newly built back-office application must apply multi-row corrections and read the updated balances within the same transaction. You must meet this strict ACID requirement while preserving the existing analytic workflows and avoiding manual capacity management as data volume grows. What should you do?

  • Ingest the events into Cloud Spanner instead, let the back-office service perform its transactional updates there, and expose the data to analysts by querying Cloud Spanner directly from BigQuery through a federated query connection (a transaction sketch follows this list).

  • Periodically export the BigQuery table to Cloud Storage, run Dataflow to apply corrections, and reload the cleansed dataset into a replacement BigQuery table.

  • Keep all data in BigQuery and have the back-office service issue multi-statement transactions containing UPDATE statements through the BigQuery API (see the multi-statement transaction sketch after this list).

  • Migrate the dataset to Cloud SQL for PostgreSQL, scale the instance vertically as volume grows, and replicate changes to BigQuery nightly using Datastream for analytics.
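
To make the Cloud Spanner option concrete, here is a minimal sketch of a read-write transaction using the google-cloud-spanner Python client. The instance, database, table, and column names (`payments-instance`, `payments`, `balances`, `amount`, `account_id`) are hypothetical placeholders, not part of the question.

```python
# Sketch: multi-row correction plus read-your-writes in one Spanner transaction.
# All names below are illustrative assumptions.
from google.cloud import spanner
from google.cloud.spanner_v1 import param_types

client = spanner.Client()
database = client.instance("payments-instance").database("payments")

def apply_corrections(transaction):
    # Multi-row correction: DML inside the transaction commits atomically.
    transaction.execute_update(
        "UPDATE balances SET amount = amount + @adj WHERE account_id = @acct",
        params={"adj": -125, "acct": "ACCT-42"},
        param_types={"adj": param_types.INT64, "acct": param_types.STRING},
    )
    # The updated balance is visible to reads within the same transaction.
    rows = transaction.execute_sql(
        "SELECT amount FROM balances WHERE account_id = @acct",
        params={"acct": "ACCT-42"},
        param_types={"acct": param_types.STRING},
    )
    for row in rows:
        print("corrected balance:", row[0])

# run_in_transaction retries on transient aborts and commits on success.
database.run_in_transaction(apply_corrections)
```

Analysts could then keep working from BigQuery by issuing a federated query over a Spanner connection, e.g. `SELECT * FROM EXTERNAL_QUERY("proj.us.spanner-conn", "SELECT account_id, amount FROM balances")`, where the connection ID is again a placeholder.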
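
For contrast, the BigQuery option would look roughly like the sketch below: BigQuery does support multi-statement transactions run as a script job, though its DML is designed for batch analytics rather than high-frequency OLTP updates. The dataset and table names (`finance.payment_events`) and the filter values are hypothetical.

```python
# Sketch: a BigQuery multi-statement transaction issued through the Python client.
# Dataset, table, and values are illustrative assumptions.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
BEGIN TRANSACTION;

UPDATE finance.payment_events
SET amount = amount - 125
WHERE account_id = 'ACCT-42' AND event_date = '2024-01-15';

-- The corrected rows are visible to this SELECT within the transaction.
SELECT account_id, SUM(amount) AS balance
FROM finance.payment_events
WHERE account_id = 'ACCT-42'
GROUP BY account_id;

COMMIT TRANSACTION;
"""

# The whole script runs as a single job; result() waits for it to finish.
job = client.query(sql)
job.result()
```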

Exam: GCP Professional Data Engineer
Domain: Designing data processing systems