GCP Professional Data Engineer Practice Question

Your trading platform ingests about 5 million JSON trade events per second from globally distributed exchanges. Events are append-only and must be retrievable within a few milliseconds by a risk-evaluation service that range-scans all trades from the most recent hour by event timestamp. Data must be retained for 30 days and then bulk-exported to BigQuery for historical analytics. Which Google Cloud storage service best fits the raw event store, given these ingestion, low-latency, and range-scan requirements?

  • Store the events in Cloud Bigtable using a row key that starts with an inverted timestamp followed by the trade ID (a sketch of this key scheme appears after the options).

  • Store each event as a document in Firestore (Native mode) keyed by trade ID, and query recent trades with composite indexes.

  • Stream each event into a partitioned BigQuery table and let the risk-evaluation service query the latest partition.

  • Batch events into hourly JSON files in a Cloud Storage Nearline bucket and have the service read the most recent file.
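For concreteness, here is a minimal sketch, in Python, of the inverted-timestamp row-key scheme described in the first option. The MAX_MICROS bound, the key layout, and the helper names are illustrative assumptions, not part of the question or of any Bigtable client API.

```python
# Sketch of an inverted-timestamp row key, assuming microsecond event
# timestamps. MAX_MICROS is an assumed fixed upper bound chosen so that
# newer events produce numerically (and lexicographically) smaller keys.
MAX_MICROS = 10**16

def make_row_key(event_ts_micros: int, trade_id: str) -> bytes:
    """Build a row key that sorts newest-first in Bigtable's byte order."""
    inverted = MAX_MICROS - event_ts_micros
    # Zero-pad so lexicographic byte order matches numeric order.
    return f"{inverted:016d}#{trade_id}".encode("utf-8")

def last_hour_key_range(now_micros: int) -> tuple[bytes, bytes]:
    """Start and end keys for a scan over the most recent hour of trades."""
    one_hour_micros = 3_600 * 1_000_000
    start = f"{MAX_MICROS - now_micros:016d}".encode("utf-8")
    end = f"{MAX_MICROS - (now_micros - one_hour_micros):016d}".encode("utf-8")
    return start, end  # newest rows sort first under this scheme
```

Because smaller keys sort first, the most recent hour of trades occupies a contiguous key range that a single scan can cover. In production, purely time-ordered keys like this are usually salted or prefixed (for example, by exchange ID) to avoid write hotspotting; the sketch simply mirrors the option as worded.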
