A health-monitoring startup receives a global stream of 300 000 heart-rate readings every second, amounting to about 90 MB/s of incoming data. The platform must continuously ingest this stream, allow patient dashboards to retrieve the most recent reading for any individual with a 95th-percentile latency below 10 ms, and retain 30 days of history so that a nightly batch job can scan tens of billions of rows for population-level analytics. Which Google Cloud storage approach best satisfies these requirements while minimizing operational overhead?
Insert all readings into Cloud SQL (PostgreSQL) with high-availability replicas for dashboards and export CSV files to Cloud Storage for batch analytics.
Persist readings in Cloud Bigtable for ingestion and low-latency lookups, then export daily snapshots to BigQuery for analytical scans.
Write JSON files to Cloud Storage with object versioning; dashboards fetch objects via signed URLs, and Dataflow reads them nightly for analytics in BigQuery.
Stream records directly into BigQuery and serve both dashboards and analytics from the same table.
Cloud Bigtable is designed for very high write throughput, scaling to millions of rows per second, and supports single-digit-millisecond point reads, making it well suited to time-series heart-rate telemetry. A row-key schema that combines patient ID with event timestamp lets dashboards fetch the latest reading within the required latency. Because Bigtable handles sharding and replication automatically, operational overhead stays low. A daily export (via Dataflow or an external table) to BigQuery lets analysts run cost-effective SQL scans over 30 days of history.
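A minimal sketch of the compound row-key pattern described above. The `patient_id#reversed_timestamp` layout and the reversal constant are assumptions for illustration: Bigtable sorts rows lexicographically by key, so subtracting the event timestamp from a large constant makes the newest reading sort first under a patient's prefix, and a prefix scan with a limit of 1 returns it.

```python
# Bigtable sorts rows lexicographically by key. A "reverse timestamp"
# (large constant minus the event time) makes newer readings sort
# BEFORE older ones, so the first row under a patient's key prefix is
# always the most recent reading.
MAX_TS_MS = 10**13  # assumption: large enough for epoch-millisecond timestamps

def heart_rate_row_key(patient_id: str, event_ts_ms: int) -> str:
    reversed_ts = MAX_TS_MS - event_ts_ms
    # Zero-pad so lexicographic order matches numeric order.
    return f"{patient_id}#{reversed_ts:013d}"

# Two readings for the same patient: the later one sorts first.
older = heart_rate_row_key("patient-42", 1_700_000_000_000)
newer = heart_rate_row_key("patient-42", 1_700_000_005_000)
assert newer < older  # newest-first within the patient's prefix
```

Keying on patient ID first also spreads writes across many patients, avoiding the hotspotting that a purely timestamp-leading key would cause under 300 000 writes per second.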
The alternative approaches do not meet all requirements:
BigQuery alone can ingest streaming data but interactive point lookups typically take seconds rather than milliseconds.
Cloud Storage offers economical retention but cannot deliver sub-10 ms reads for individual records.
Cloud SQL would require extensive manual sharding and would still struggle to sustain hundreds of thousands of writes per second while keeping read latency in the single-digit-millisecond range.
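On the analytics side, once the daily snapshots land in BigQuery, the nightly population-level scan becomes a plain SQL query. A hedged sketch follows; the table name `telemetry.heart_rate` and the column names are placeholders, not from the source:

```python
# Sketch of the nightly population-level scan over 30 days of exported
# readings. In practice this would run via the google-cloud-bigquery
# client or a scheduled query; here we only compose the SQL.
TABLE = "telemetry.heart_rate"  # assumed name for the daily snapshots

nightly_scan = f"""
SELECT
  patient_id,
  AVG(bpm)  AS avg_bpm,
  MAX(bpm)  AS max_bpm,
  COUNT(*)  AS readings
FROM `{TABLE}`
WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY patient_id
"""

print(nightly_scan)
```

BigQuery's columnar storage makes this full scan over tens of billions of rows economical, while Bigtable keeps serving the low-latency dashboard reads.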
GCP Professional Data Engineer
Storing the data