
GCP Professional Data Engineer Practice Question

Your e-commerce analytics team issues ad-hoc interactive queries against a 180-TB BigQuery table that stores 90 days of click-stream events. The project is billed with BigQuery's on-demand model, and daily query volume fluctuates, making long-term slot commitments unattractive. Analysts usually inspect only the most recent three days of data, but each query currently scans the full table, driving up costs. To lower query charges while continuing to use on-demand pricing, which approach should you implement?

  • Apply gzip compression to the existing table so the bytes scanned by each query are smaller.

  • Upgrade to BigQuery Enterprise Edition and buy a 500-slot reservation to run queries on flat-rate capacity.

  • Partition the table by date and require queries to include a filter on the partitioning column so only recent partitions are scanned.

  • Export the data to Cloud Storage and query it as a BigLake external table, eliminating per-query charges.
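To see why date partitioning with a partition filter is the cost-effective choice here, a rough back-of-the-envelope sketch helps. This is a hypothetical illustration only: it assumes on-demand pricing of roughly $6.25 per TiB scanned (treating TB ≈ TiB for simplicity; check current BigQuery pricing), and it assumes events are spread evenly across the 90 days.

```python
# Rough cost sketch: BigQuery on-demand pricing bills per byte scanned.
# Assumed list price of ~$6.25 per TiB scanned (hypothetical; verify
# against current pricing). TB is treated as TiB for simplicity.
PRICE_PER_TB = 6.25      # USD per TB scanned (assumption)

TABLE_TB = 180           # full 90-day click-stream table
DAYS_RETAINED = 90
DAYS_QUERIED = 3         # analysts usually inspect only the last 3 days

# Without partitioning, every query scans the entire table.
full_scan_cost = TABLE_TB * PRICE_PER_TB

# With daily partitions and a required filter on the partitioning
# column, BigQuery prunes to only the matching partitions.
tb_per_day = TABLE_TB / DAYS_RETAINED            # ~2 TB per day
pruned_scan_cost = tb_per_day * DAYS_QUERIED * PRICE_PER_TB

print(f"Full scan:   ${full_scan_cost:,.2f} per query")
print(f"Pruned scan: ${pruned_scan_cost:,.2f} per query")
```

Under these assumptions, each query drops from scanning 180 TB to about 6 TB, a roughly 30x reduction in per-query cost, with no change to the on-demand billing model. Gzip compression does not change on-demand billing (which is based on logical bytes), slot reservations abandon on-demand pricing, and external tables still incur query processing charges.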

GCP Professional Data Engineer
Maintaining and automating data workloads