GCP Professional Data Engineer Practice Question

A fintech company stores transactional data in several BigQuery datasets in the europe-west2 region. Compliance requires 90 days of point-in-time backups that can be restored within one hour if a table or partition is accidentally overwritten. The operations team wants the lowest incremental storage cost and a fully managed solution that remains entirely inside BigQuery, with no data exports. Which strategy should you recommend?

  • Schedule daily exports of each dataset to a Nearline Cloud Storage bucket and rely on object versioning for 90-day retention.

  • Increase dataset-level time-travel retention to 90 days and use time travel to recover data when needed.

  • Run nightly COPY jobs to duplicate each table into another region and delete copies older than 90 days.

  • Configure daily BigQuery table snapshots for every critical table and place them in a separate backup project with a 90-day expiration policy.

Exam: GCP Professional Data Engineer | Domain: Designing data processing systems
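
Study note: the snapshot-based option describes a concrete BigQuery operation, so a minimal sketch of how a dated table snapshot with a 90-day expiration could be created is shown below, using the google-cloud-bigquery Python client. All project, dataset, and table names here are hypothetical placeholders, not part of the question, and a real deployment would normally run this from a scheduler (for example Cloud Scheduler driving a small job) rather than ad hoc.

    # Hypothetical sketch: create one dated snapshot with a 90-day expiration.
    # Project, dataset, and table names are placeholders, not from the question.
    from datetime import datetime, timezone

    from google.cloud import bigquery

    client = bigquery.Client(project="backup-project")  # assumed backup project

    source_table = "prod-project.finance.transactions"  # table to protect
    snapshot_name = (
        f"backup-project.backups.transactions_{datetime.now(timezone.utc):%Y%m%d}"
    )

    # BigQuery snapshot DDL: a snapshot bills only for data that diverges from
    # the base table, and expiration_timestamp enforces retention automatically.
    ddl = f"""
    CREATE SNAPSHOT TABLE `{snapshot_name}`
    CLONE `{source_table}`
    OPTIONS (
      expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
    )
    """

    client.query(ddl).result()  # wait for the snapshot job to finish

Restoring from such a snapshot is a table copy (or a CREATE TABLE ... CLONE of the snapshot) back into the original location, which stays entirely inside BigQuery.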