GCP Professional Data Engineer Practice Question

Your company hosts a 5 PB library of user-generated HD videos and ingests 50 TB of new content each day. Viewers around the world expect video playback to begin in under two seconds. Product managers require a real-time dashboard that serves per-video engagement metrics with sub-50 ms latency, and analysts need to run ad-hoc SQL queries on several months of usage logs. Compliance rules mandate that every uploaded video be retained for seven years at the lowest possible cost. Which Google Cloud storage architecture best meets all of these requirements?

  • Write video binaries directly into Bigtable; cache engagement metrics in Memorystore; archive logs to Cloud Spanner tables for analytical queries.

  • Host videos on a Filestore Enterprise NFS share replicated across regions; store engagement metrics in Cloud SQL; process logs with Dataproc running Hadoop on regional Persistent Disk.

  • Store videos in a multi-region Cloud Storage bucket with lifecycle rules that move objects to Archive after 90 days; keep engagement metrics in Bigtable; ingest logs into BigQuery for ad-hoc SQL analysis.

  • Persist both videos and logs as BLOB columns in Cloud Spanner with multi-region configuration; store engagement metrics in Firestore and run analytics queries directly against Spanner.
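The lifecycle rule mentioned in the Cloud Storage option can be expressed as a bucket lifecycle policy. Below is a minimal sketch of that policy as the JSON structure accepted by `gsutil lifecycle set` / the Cloud Storage API; the seven-year Delete rule is an illustrative assumption (compliance only requires retention through seven years, so deletion afterward is optional).

```python
import json

# Approximate seven-year retention window in days (ignores leap days).
SEVEN_YEARS_DAYS = 7 * 365

# Lifecycle policy: transition to the low-cost Archive class after 90 days,
# then (optionally) delete once the retention period has elapsed.
lifecycle_policy = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "ARCHIVE"},
            "condition": {"age": 90},
        },
        {
            "action": {"type": "Delete"},
            "condition": {"age": SEVEN_YEARS_DAYS},
        },
    ]
}

print(json.dumps(lifecycle_policy, indent=2))
```

The policy would typically be applied to the multi-region bucket with a command such as `gsutil lifecycle set lifecycle.json gs://<bucket>`, while engagement counters live in Bigtable and logs stream into BigQuery for ad-hoc SQL.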
