GCP Professional Cloud Architect Practice Question

You manage a BigQuery ingestion-time partitioned table called logs.raw in project analytics-prod (location US). A Cloud Scheduler job must run each morning and, using the bq CLI, export only the partition for the previous UTC day to Cloud Storage as gzip-compressed newline-delimited JSON. The export must overwrite any existing objects and automatically split the output into multiple shards if it exceeds the per-file size limit. Which command pattern meets all of these requirements?

  • bq extract --destination_format=NEWLINE_DELIMITED_JSON analytics-prod:logs.raw@YYYYMMDD000000-YYYYMMDD235959 gs://acme-exports/logs/raw-range.json.gz

  • bq extract --compression=GZIP analytics-prod:logs.raw$YYYYMMDD gs://acme-exports/logs/raw.json

  • bq extract --destination_format=NEWLINE_DELIMITED_JSON --compression=GZIP analytics-prod:logs.raw$YYYYMMDD gs://acme-exports/logs/raw-*.json.gz

  • bq extract --destination_format=JSON --compression=GZIP analytics-prod:logs.raw gs://acme-exports/logs/raw.json.gz
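For context, the option that satisfies every requirement must combine the `NEWLINE_DELIMITED_JSON` destination format, `GZIP` compression, the `$YYYYMMDD` partition decorator, and a `*` wildcard in the destination URI (the wildcard is what lets BigQuery shard output that exceeds the per-file limit). A minimal scheduler script sketch, assuming GNU `date` is available, might compute yesterday's UTC date and assemble the command like this (the command is echoed rather than executed so the pattern is visible):

```shell
#!/usr/bin/env bash
# Sketch only: compute the previous UTC day (GNU date syntax;
# on BSD/macOS you would use `date -u -v-1d +%Y%m%d` instead).
YDAY=$(date -u -d "yesterday" +%Y%m%d)

# Note the escaped \$ before the decorator: unescaped, the shell would
# try to expand $YYYYMMDD as a variable instead of passing it to bq.
# The * wildcard in the URI lets BigQuery write multiple shards.
CMD="bq extract --destination_format=NEWLINE_DELIMITED_JSON --compression=GZIP \
analytics-prod:logs.raw\$${YDAY} gs://acme-exports/logs/raw-${YDAY}-*.json.gz"

echo "$CMD"
```

A Cloud Scheduler job would typically invoke such a script via a Cloud Run job or similar target; the key detail being tested is the decorator-plus-wildcard combination, not the scheduling mechanism.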
