GCP Professional Data Engineer Practice Question

Your company uses three analytics applications: a Java-based reporting server that needs a JDBC data source, a Windows modeling tool that supports only ODBC, and a serverless web dashboard that can invoke REST endpoints. You must give all three applications live, query-time access to a BigQuery dataset using Standard SQL, with a single service account and without writing custom connectors. Which approach meets these requirements with the least development effort?

  • Schedule nightly exports of the BigQuery tables to Cloud Storage as Avro files and let each application read the files through its file-based connectors.

  • Create a Cloud SQL for PostgreSQL replica of the BigQuery dataset with BigQuery Omni and point all three applications at the PostgreSQL endpoint through their existing JDBC or ODBC drivers.

  • Install Google's BigQuery JDBC driver for the Java server, the BigQuery ODBC driver for the Windows tool, and have the web dashboard call the BigQuery REST API; all three use the same service-account credentials to query BigQuery directly.

  • Use Dataflow to continuously stream BigQuery data into an on-premises MySQL database, then have every application connect to MySQL using its native JDBC or ODBC drivers.
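The option that pairs Google's published JDBC and ODBC drivers with the BigQuery REST API needs only configuration, since each client talks to BigQuery directly with the same service-account identity. As a rough illustration of the JDBC piece only, the sketch below assumes the Simba-built BigQuery JDBC driver JAR is on the classpath; the project ID, service-account email, key-file path, and table name are placeholders, and the connection-string parameter names follow the driver's documented format (OAuthType=0 selects service-account authentication).

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class BigQueryJdbcSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder project, service-account email, and key path -- substitute real values.
            String url = "jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;"
                    + "ProjectId=my-analytics-project;"
                    + "OAuthType=0;"  // service-account authentication
                    + "OAuthServiceAcctEmail=reporting-sa@my-analytics-project.iam.gserviceaccount.com;"
                    + "OAuthPvtKeyPath=/secrets/reporting-sa-key.json;";

            // Standard SQL query against a placeholder dataset and table.
            String sql = "SELECT region, SUM(amount) AS total "
                    + "FROM `my-analytics-project.sales.orders` "
                    + "GROUP BY region";

            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(sql)) {
                while (rs.next()) {
                    System.out.printf("%s: %s%n", rs.getString("region"), rs.getString("total"));
                }
            }
        }
    }

The ODBC driver takes the same service-account settings in its DSN configuration, and the dashboard can authenticate as that service account with an OAuth token when calling the BigQuery REST API, so none of the three clients requires custom connector code.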

Exam: GCP Professional Data Engineer
Objective: Preparing and using data for analysis