Your company operates a music-streaming service in Europe. All raw user interaction events (which contain PII) must never leave EU borders to comply with GDPR, but a US-based analytics team needs daily aggregated engagement metrics loaded into BigQuery in us-central1. You must design a fully managed Google Cloud solution that enforces these residency constraints while minimizing operational overhead. Which architecture should you recommend?
Ingest events via Pub/Sub and Dataflow in europe-west1 but write raw data to a BigQuery dataset in us-central1 that is protected with CMEK keys stored in an EU key ring, then share aggregated views with US analysts.
Publish events to a regional Pub/Sub topic in europe-west1; process them with a streaming Dataflow job in europe-west1 that writes raw data to a BigQuery dataset in the EU multi-region; run a daily batch Dataflow job in europe-west1 that computes aggregated metrics, exports them to a Cloud Storage bucket in us-central1, and then loads them into a BigQuery dataset in us-central1 for the analytics team.
Publish events to a global Pub/Sub topic; process them with a Dataflow job in us-central1 that writes raw events directly to a BigQuery dataset in us-central1; use an authorized view to give EU teams access to the data.
Stream events to a Pub/Sub topic in europe-west1 and store them in a BigQuery dataset in the EU multi-region, then enable BigQuery cross-region replication to automatically copy the dataset to us-central1 for the US analytics team.
Keeping the entire ingestion path for raw events inside the EU is mandatory to satisfy the GDPR data-residency requirement. Publishing to a Pub/Sub topic in europe-west1 and processing the stream with a regional Dataflow job that writes to a BigQuery dataset in the EU multi-region guarantees that raw PII never crosses borders. Aggregations are then computed in the same region and written to an EU-resident table; a daily batch Dataflow job exports only the aggregated, de-identified results to a Cloud Storage bucket in us-central1 and loads them into a BigQuery dataset in that region for the US analysts. The other options either violate residency or misunderstand CMEK: landing raw data in us-central1 or enabling automatic cross-region replication moves PII outside the EU, and storing encryption keys in an EU key ring does not change the physical location of data stored in us-central1.
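As an illustration only, the daily aggregate-then-export step could be approximated with the bq and gcloud CLIs. All dataset, table, and bucket names below are hypothetical, and note that `bq extract` requires the destination bucket to be colocated with the dataset, so this sketch stages the file in an EU bucket before copying it to us-central1 (the Dataflow approach in the correct option writes cross-region directly):

```shell
# Hypothetical resource names throughout; a sketch, not a production pipeline.

# 1. Compute daily aggregates entirely inside the EU; raw PII never leaves.
bq query --location=EU --use_legacy_sql=false \
  --destination_table=eu_analytics.daily_engagement \
  'SELECT DATE(event_ts) AS day, COUNT(*) AS play_count
   FROM `my-project.eu_raw.listen_events`
   GROUP BY day'

# 2. Export the aggregated, de-identified table to an EU staging bucket
#    (BigQuery exports must target a bucket colocated with the dataset).
bq extract --location=EU \
  eu_analytics.daily_engagement \
  'gs://eu-staging-bucket/daily_engagement_*.csv'

# 3. Copy only the aggregates to a us-central1 bucket for the analytics team.
gcloud storage cp 'gs://eu-staging-bucket/daily_engagement_*.csv' \
  gs://us-metrics-bucket/

# 4. Load the aggregates into the US BigQuery dataset.
bq load --location=us-central1 --source_format=CSV --autodetect \
  us_reporting.daily_engagement \
  'gs://us-metrics-bucket/daily_engagement_*.csv'
```

The key property mirrored from the recommended architecture is that every command touching raw events runs with an EU location, and only derived metrics ever reference a US resource.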
GCP Professional Data Engineer
Designing data processing systems