
GCP Professional Data Engineer Practice Question

Your ecommerce company keeps 50 TB of click-stream data in partitioned BigQuery tables. Analysts issue unpredictable interactive queries that process about 10 TB of data most months, but quarterly marketing campaigns spike query volume to roughly 100 TB. Finance needs a predictable, easy-to-forecast analytics bill, and engineers want to keep all data online while minimizing administrative effort. Partitions older than 90 days are seldom queried. Which approach best meets these goals?

  • Use flex-slot reservations only during quarterly spikes while leaving other queries on on-demand pricing, and schedule a job to copy idle partitions to a separate long-term table.

  • Switch BigQuery to on-demand analysis pricing, set project-level cost controls, and export partitions older than 90 days to Cloud Storage Archive to save on storage.

  • Keep on-demand analysis pricing, create materialized views to limit scanned bytes, and delete table data after 90 days to avoid long-term storage charges.

  • Purchase an annual baseline reservation of BigQuery slots with autoscaling enabled and rely on BigQuery's automatic long-term storage pricing for partitions idle more than 90 days.
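The trade-off the question is probing comes down to how each billing model behaves month to month. Below is a minimal back-of-the-envelope sketch of that behavior. All prices, the baseline slot count, and the long-term fraction are illustrative assumptions, not quoted BigQuery list prices; check the current pricing page before using any of these figures.

```python
# Rough comparison of how the billing models behave for this workload.
# Every rate below is an illustrative placeholder, not a quoted list price.

ON_DEMAND_PER_TB = 6.25        # assumed on-demand analysis price per TB scanned
SLOT_HOUR_PRICE = 0.05         # assumed per-slot-hour price with an annual commitment
BASELINE_SLOTS = 100           # assumed committed baseline reservation size
HOURS_PER_MONTH = 730

ACTIVE_STORAGE_PER_GB = 0.02     # assumed active storage rate, per GB-month
LONG_TERM_STORAGE_PER_GB = 0.01  # assumed long-term rate (partitions idle > 90 days)

# On-demand: the bill tracks bytes scanned, so it swings with query volume.
normal_month = 10 * ON_DEMAND_PER_TB    # ~10 TB scanned in a typical month
spike_month = 100 * ON_DEMAND_PER_TB    # ~100 TB scanned during a campaign quarter

# Annual baseline reservation: the compute bill is flat every month; autoscaling
# absorbs the quarterly spikes (extra autoscaled slot-hours not modeled here).
reserved_month = BASELINE_SLOTS * SLOT_HOUR_PRICE * HOURS_PER_MONTH

# Storage: partitions untouched for 90 days drop to the long-term rate
# automatically, with no export jobs or table copies to manage.
def monthly_storage(total_tb=50, long_term_fraction=0.8):
    total_gb = total_tb * 1_000
    active = total_gb * (1 - long_term_fraction) * ACTIVE_STORAGE_PER_GB
    long_term = total_gb * long_term_fraction * LONG_TERM_STORAGE_PER_GB
    return active + long_term

print(f"On-demand, normal month:  ${normal_month:,.2f}")
print(f"On-demand, spike month:   ${spike_month:,.2f}   <- hard to forecast")
print(f"Reservation, every month: ${reserved_month:,.2f} <- flat, predictable")
print(f"Storage with automatic long-term pricing: ${monthly_storage():,.2f}/month")
```

The exact numbers depend on current edition pricing and how the reservation is sized, but the shape of the comparison is what the question targets: a committed baseline with autoscaling gives finance a flat, forecastable compute bill, and long-term storage pricing applies to idle partitions automatically, so no data has to be exported, copied, or deleted.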

Exam: GCP Professional Data Engineer
Objective: Storing the data