GCP Professional Data Engineer Practice Question

Your e-commerce platform must ingest about 200,000 JSON clickstream events per second from users around the world. Engineers need millisecond-level reads on the most recent minute of traffic for real-time personalization, while analysts expect sub-second ad-hoc SQL queries on six months of historical data. You want a fully managed design that minimizes operational overhead and controls storage cost. Which architecture best satisfies both requirements?

  • Persist events in Firestore (Native mode) and enable automatic export of the collection to BigQuery for analysis.

  • Stream events into Cloud Bigtable for serving reads; use Dataflow to dump snapshots to Cloud Storage and load them into partitioned BigQuery tables for analytics.

  • Write events to Memorystore for Redis for low-latency access; export cached keys nightly to BigQuery using Cloud Functions.

  • Stream events into Cloud SQL (PostgreSQL) via Dataflow; create read replicas and use federated queries for analytics.
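The throughput figure in the scenario implies a substantial storage footprint, which is worth sizing before judging the options. The sketch below is a back-of-envelope calculation only; the average event size (~1 KB) is an assumption not stated in the question.

```python
# Rough sizing for the clickstream workload in the scenario.
# ASSUMPTION: ~1 KB per JSON event (not given in the question).
EVENTS_PER_SEC = 200_000
BYTES_PER_EVENT = 1_000      # assumed average event size
SECONDS_PER_DAY = 86_400
DAYS_RETAINED = 183          # roughly six months

daily_events = EVENTS_PER_SEC * SECONDS_PER_DAY
daily_bytes = daily_events * BYTES_PER_EVENT
total_bytes = daily_bytes * DAYS_RETAINED

print(f"events/day:  {daily_events:,}")          # 17,280,000,000
print(f"TB/day:      {daily_bytes / 1e12:.2f}")  # 17.28
print(f"PB retained: {total_bytes / 1e15:.2f}")  # 3.16
```

At petabyte scale, per-row stores with modest write ceilings (Cloud SQL, Memorystore, Firestore) become the limiting factor, which is the trade-off this question is probing.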
