GCP Professional Cloud Security Engineer Practice Question

Your fraud analytics team loads card-transaction CSVs from Cloud Storage into BigQuery for Vertex AI model training. Compliance requires that primary account numbers (PANs) be tokenized so that data scientists cannot see the real values while the training data retains realistic formats. You need an automated pipeline that discovers PANs, replaces each one with a consistent, format-preserving token, and writes the sanitized data to a separate BigQuery table. Which solution meets these requirements?

  • Create a Confidential VM-based Vertex AI Workbench environment and rely on encrypted memory to prevent exposure of PANs during analysis.

  • Run a Sensitive Data Protection discovery scan on the Cloud Storage bucket and export the findings to Cloud Logging; instruct data scientists to ignore the PAN column.

  • Configure a Sensitive Data Protection inspection job that detects the built-in CREDIT_CARD_NUMBER infoType and applies a de-identification template using format-preserving encryption; output the job to a new BigQuery table used for training.

  • Enable BigQuery column-level security with policy tags on the PAN column and give data scientists access only through an authorized view.

Exam domain: Ensuring data protection
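For reference, below is a minimal sketch of the kind of Sensitive Data Protection de-identification described in the third option, using the Python client. The project ID, key material, and sample value are hypothetical placeholders, and `deidentify_content` is used here for brevity; the pipeline in the question would instead run a Cloud Storage inspection job that applies the same transformation and writes its output to a new BigQuery table.

```python
from google.cloud import dlp_v2

client = dlp_v2.DlpServiceClient()
parent = "projects/your-project-id"  # hypothetical project

# Detect the built-in CREDIT_CARD_NUMBER infoType.
inspect_config = {"info_types": [{"name": "CREDIT_CARD_NUMBER"}]}

# Replace each PAN with a consistent, format-preserving token via FFX FPE.
# An unwrapped 32-byte key is used only to keep the sketch self-contained;
# a production pipeline would supply a Cloud KMS-wrapped key instead.
deidentify_config = {
    "info_type_transformations": {
        "transformations": [
            {
                "info_types": [{"name": "CREDIT_CARD_NUMBER"}],
                "primitive_transformation": {
                    "crypto_replace_ffx_fpe_config": {
                        "crypto_key": {"unwrapped": {"key": b"0" * 32}},
                        "common_alphabet": "NUMERIC",
                    }
                },
            }
        ]
    }
}

response = client.deidentify_content(
    request={
        "parent": parent,
        "inspect_config": inspect_config,
        "deidentify_config": deidentify_config,
        "item": {"value": "card 4111111111111111 charged $12.50"},
    }
)
print(response.item.value)  # PAN replaced by a same-length numeric token
```

Because FPE is deterministic for a given key, the same PAN always maps to the same token, which preserves join keys and realistic formats for model training without exposing real card numbers.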