GCP Professional Cloud Architect Practice Question

Your team runs a Kubeflow Pipelines (KFP) v2 pipeline on Vertex AI Pipelines. Compliance now requires that the pipeline run automatically each time a new CSV file arrives in gs://retail-landing; that every execution record complete dataset, code, and hyperparameter lineage for audit; and that a human approve or reject deployment to the production endpoint from the Cloud Console. Which architecture satisfies all requirements while minimising custom code maintenance?

  • Implement a Cloud Composer DAG that polls the landing bucket, triggers Dataflow preprocessing and Vertex AI training, then pauses for a Slack-based approval before deployment, while maintaining a separate metadata database for lineage.

  • Use Eventarc to trigger Cloud Build on each new object; the build runs gcloud ai custom-jobs create and relies on Cloud Build logs plus Cloud Deploy approvals for promotion.

  • Send Cloud Storage notifications to Pub/Sub, trigger a Cloud Workflows execution that calls the Vertex AI Pipelines REST API with the pre-compiled pipeline, include a built-in Manual Approval step before deployment, and rely on Vertex ML Metadata for lineage (this flow is sketched below).

  • Configure Cloud Storage object-create notifications to Pub/Sub, invoke a Cloud Functions subscriber that starts a Dataproc training job and writes custom logs for traceability.
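
The flow in the third option can be illustrated with the Vertex AI Python SDK (google-cloud-aiplatform). This is a minimal sketch rather than the exact implementation: in the option as written, Cloud Workflows would call the equivalent projects.locations.pipelineJobs REST method directly, and the project, topic, and artifact paths below are hypothetical. First, the object-create notification that feeds Pub/Sub:

    # Sketch: route OBJECT_FINALIZE events from the landing bucket to Pub/Sub.
    # Project "retail-prj" and topic "csv-arrivals" are hypothetical names.
    from google.cloud import storage

    client = storage.Client(project="retail-prj")
    bucket = client.bucket("retail-landing")

    notification = bucket.notification(
        topic_name="csv-arrivals",
        event_types=["OBJECT_FINALIZE"],   # fire only when a new object is created
        payload_format="JSON_API_V1",      # include object metadata in the message
    )
    notification.create()

The Pub/Sub subscriber (Cloud Workflows in the option, or any other consumer) then launches a run of the pre-compiled pipeline. Vertex AI Pipelines records artifact and execution lineage in Vertex ML Metadata automatically for KFP v2 runs, so no separate lineage store is needed:

    # Sketch: submit one run of the compiled pipeline per arriving CSV.
    # The template and pipeline-root paths are hypothetical.
    from google.cloud import aiplatform

    aiplatform.init(project="retail-prj", location="us-central1")

    job = aiplatform.PipelineJob(
        display_name="retail-train",
        template_path="gs://retail-artifacts/pipeline.json",   # pre-compiled KFP v2 spec
        pipeline_root="gs://retail-artifacts/pipeline-root",
        parameter_values={"input_csv": "gs://retail-landing/new.csv"},
        enable_caching=False,  # force a fresh execution for each new file
    )
    job.submit()

The human approval gate would typically be implemented with a Cloud Workflows callback: the workflow creates a callback endpoint, notifies the approver, and blocks until a response arrives before deploying the model to the production endpoint.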

Objective: Managing and provisioning a solution infrastructure