GCP Professional Data Engineer Practice Question

A fintech firm ingests millions of market-data events per second through Pub/Sub and Dataflow. Trading engines must fetch the latest two hours of tick data with sub-10 ms latency during market hours, while quants run daily SQL analyses across seven years of historical ticks with minimal operational overhead. Which Google Cloud storage design best aligns with these distinct access patterns?

  • Store all tick data in Spanner with regional nodes for the trading engines and read replicas for analytics.

  • Write ticks to Cloud Storage in Parquet format and query them via BigQuery external tables for both real-time and historical needs.

  • Persist all ticks solely in BigQuery and rely on materialized views for the trading engines.

  • Stream ticks into Bigtable for the most recent hours and replicate all events into BigQuery for historical analytics.
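
For context on the dual-sink pattern named in the final option, below is a minimal Apache Beam (Python) sketch of a streaming Dataflow job that fans the same Pub/Sub tick stream out to Bigtable (hot, low-latency serving) and BigQuery (long-term SQL analytics). It is an illustrative sketch only, not an answer key; the project, subscription, instance, table, schema, and row-key design are hypothetical.

"""Hypothetical dual-sink Dataflow pipeline: Pub/Sub -> Bigtable + BigQuery."""
import json

import apache_beam as beam
from apache_beam.io import ReadFromPubSub, WriteToBigQuery
from apache_beam.io.gcp.bigtableio import WriteToBigTable
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from google.cloud.bigtable import row as bt_row


def to_bigtable_row(tick: dict) -> bt_row.DirectRow:
    # Hypothetical row key: symbol plus a reversed timestamp keeps the newest
    # ticks for a symbol contiguous, so "latest two hours" becomes a short scan.
    reversed_ts = 10**13 - int(tick["ts_ms"])
    direct_row = bt_row.DirectRow(row_key=f"{tick['symbol']}#{reversed_ts}".encode())
    direct_row.set_cell("md", b"price", str(tick["price"]).encode())
    direct_row.set_cell("md", b"size", str(tick["size"]).encode())
    return direct_row


def run():
    options = PipelineOptions()  # pass --project, --region, --runner=DataflowRunner, etc.
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        ticks = (
            p
            | "ReadTicks" >> ReadFromPubSub(
                subscription="projects/my-proj/subscriptions/ticks-sub")
            | "Parse" >> beam.Map(json.loads)
        )

        # Hot path: recent ticks served from Bigtable for low-latency lookups.
        _ = (
            ticks
            | "ToBigtableRow" >> beam.Map(to_bigtable_row)
            | "WriteBigtable" >> WriteToBigTable(
                project_id="my-proj", instance_id="ticks-bt", table_id="ticks")
        )

        # Cold path: every tick also lands in BigQuery for multi-year SQL analysis.
        _ = (
            ticks
            | "WriteBigQuery" >> WriteToBigQuery(
                "my-proj:market_data.ticks",
                schema="symbol:STRING,ts_ms:INT64,price:FLOAT64,size:INT64",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()

The reverse-timestamp row key is one common way to make "latest N hours per symbol" a short, contiguous range scan in Bigtable; the actual key design would depend on the firm's query patterns.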

Exam: GCP Professional Data Engineer | Domain: Storing the data