GCP Professional Data Engineer Practice Question

Your SaaS platform ingests status events from 500,000 customer-deployed sensors every few seconds. Today the only requirement is near-real-time dashboards in BigQuery, but product teams expect to add fraud-detection microservices and a mobile alerting system later this year. You must design an ingestion architecture that meets current needs while allowing additional downstream consumers to process the same events without any changes to the sensor firmware. Which approach best satisfies these goals?

  • Publish every sensor event to a Cloud Pub/Sub topic; run a streaming Dataflow pipeline that subscribes and writes to BigQuery, and let future services create additional subscriptions to the same topic.

  • Have each sensor call the BigQuery streaming insert API; dashboards query BigQuery, and future services read the data through BigQuery views.

  • Write events into a regional Cloud Spanner database and use change streams to replicate data into BigQuery; future services query the Spanner instance directly.

  • Upload batched JSON files from sensors to Cloud Storage using signed URLs; trigger Cloud Functions to load the files into BigQuery, and let future services read the files directly from the bucket.
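For context on the fan-out pattern described in the first option, here is a minimal Apache Beam (Python) sketch of the Dataflow leg that subscribes to the topic and streams events into BigQuery. The project, subscription, table, and schema names are illustrative assumptions, not part of the question.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode so the pipeline keeps consuming from the subscription.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical subscription owned by this Dataflow job.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/bq-writer-sub")
            # Pub/Sub delivers bytes; decode and parse each event into a dict.
            | "ParseJson" >> beam.Map(lambda raw: json.loads(raw.decode("utf-8")))
            # Hypothetical table and schema; rows are appended as events arrive.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:telemetry.sensor_events",
                schema="sensor_id:STRING,status:STRING,event_ts:TIMESTAMP",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```

A later fraud-detection or alerting service would simply attach its own subscription to the same topic (for example, `gcloud pubsub subscriptions create fraud-sub --topic sensor-events`, with hypothetical names) and receive an independent copy of every event, with no change to the sensor firmware or the existing pipeline.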

Exam: GCP Professional Data Engineer
Objective: Designing data processing systems