GCP Professional Data Engineer Practice Question

A health-monitoring startup receives a global stream of 300,000 heart-rate readings every second, amounting to about 90 MB/s of incoming data. The platform must continuously ingest this stream, allow patient dashboards to retrieve the most recent reading for any individual with a 95th-percentile latency below 10 ms, and retain 30 days of history so that a nightly batch job can scan tens of billions of rows for population-level analytics. Which Google Cloud storage approach best satisfies these requirements while minimizing operational overhead?

  • Insert all readings into Cloud SQL (PostgreSQL) with high-availability replicas for dashboards and export CSV files to Cloud Storage for batch analytics.

  • Persist readings in Cloud Bigtable for ingestion and low-latency lookups, then export daily snapshots to BigQuery for analytical scans.

  • Write JSON files to Cloud Storage with object versioning; dashboards fetch objects via signed URLs, and Dataflow reads them nightly for analytics in BigQuery.

  • Stream records directly into BigQuery and serve both dashboards and analytics from the same table.
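
To put the requirements in perspective, here is a quick back-of-the-envelope sizing based only on the numbers given in the question stem; the per-reading size, row counts, and raw volume below are derived figures, not values stated in the prompt.

    # Derived sizing; only readings/sec and MB/s come from the question itself.
    readings_per_sec = 300_000
    ingest_mb_per_sec = 90

    bytes_per_reading = ingest_mb_per_sec * 1_000_000 / readings_per_sec  # ~300 bytes
    rows_per_day = readings_per_sec * 86_400                              # ~2.6e10 rows
    rows_30_days = rows_per_day * 30                                      # ~7.8e11 rows
    raw_tb_30_days = rows_30_days * bytes_per_reading / 1e12              # ~233 TB

    print(f"{bytes_per_reading:.0f} B/reading, {rows_per_day:.2e} rows/day, "
          f"{rows_30_days:.2e} rows/30 days, {raw_tb_30_days:.0f} TB raw over 30 days")

At roughly 26 billion new rows per day and a few hundred terabytes of raw data over 30 days, the workload combines sustained high-throughput writes, single-key point reads, and very large sequential scans.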
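
For context on the access pattern described in the Cloud Bigtable option, the sketch below keys rows by patient ID plus a reversed timestamp so the newest reading for a patient sorts first and can be fetched with a single, tightly bounded row scan. All project, instance, table, and column-family names are hypothetical, and this is an illustration of the pattern, not an answer key.

    # Sketch of a "latest reading per patient" row-key design in Cloud Bigtable;
    # all resource names below are hypothetical.
    from google.cloud import bigtable
    from google.cloud.bigtable import row_filters

    MAX_TS = 2**63 - 1  # subtracting the timestamp reverses the sort order

    client = bigtable.Client(project="hr-project")
    table = client.instance("hr-instance").table("readings")

    def write_reading(patient_id: str, ts_micros: int, bpm: int) -> None:
        # Row key: patient ID, then reversed timestamp, zero-padded so keys
        # sort lexicographically with the newest reading first.
        row_key = f"{patient_id}#{MAX_TS - ts_micros:020d}".encode()
        row = table.direct_row(row_key)
        row.set_cell("vitals", "bpm", str(bpm).encode())
        row.commit()

    def latest_reading(patient_id: str):
        # The newest reading is the first row in the patient's key range.
        prefix = f"{patient_id}#".encode()
        rows = table.read_rows(
            start_key=prefix,
            end_key=prefix + b"\xff",
            limit=1,
            filter_=row_filters.CellsColumnLimitFilter(1),
        )
        for row in rows:
            return int(row.cells["vitals"][b"bpm"][0].value)
        return None

With this key design, retrieving a patient's most recent reading is a single-row lookup rather than a scan over their history, which is the kind of point read Bigtable is designed to serve at low, predictable latency.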

Objective: Storing the data