Microsoft Azure Data Fundamentals Practice Test (DP-900)

Use the form below to configure your Microsoft Azure Data Fundamentals Practice Test (DP-900). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.

  • Questions: number of questions in the practice test (free users are limited to 20 questions; upgrade for unlimited)
  • Seconds Per Question: determines how long you have to finish the practice test
  • Exam Objectives: which exam objectives should be included in the practice test

Microsoft Azure Data Fundamentals DP-900 Information

The Microsoft Azure Data Fundamentals certification exam is designed to validate foundational knowledge of core data concepts and how they are implemented using Microsoft Azure data services. It is ideal for individuals who are new to data workloads and cloud environments. Working through DP-900 practice tests and practice exams and reviewing many practice questions can help candidates build confidence, familiarize themselves with exam language, and strengthen their grasp of key topics.

The exam covers four major domains: describing core data concepts (25-30%), identifying considerations for relational data on Azure (20-25%), describing non-relational data on Azure (15-20%), and describing analytics workloads on Azure (25-30%). To prepare effectively, leverage full-length practice exams and targeted practice questions for each domain; this will help you identify weak areas, improve your timing, and build readiness for the real exam experience.

Practice Exams & Practice Questions

Success on the DP-900 exam isn’t just about recalling facts; you'll need to apply them under timed conditions. Using DP-900 practice tests helps simulate the exam environment, while drilling practice questions for each objective ensures your understanding is solid. Practice exams expose you to question types such as case studies, drag-and-drop, multiple-choice, and multiple-response, allowing you to manage pacing and reduce surprises on exam day. With consistent work on practice exams and practice questions, you’ll go into the exam with greater confidence and a lower chance of needing a retake.

  • Free Microsoft Azure Data Fundamentals DP-900 Practice Test

  • 20 Questions
  • Unlimited
  • Describe core data concepts
  • Identify considerations for relational data on Azure
  • Describe considerations for working with non-relational data on Azure
  • Describe an analytics workload
Question 1 of 20

Your organization needs an analytical store in Azure that can hold many terabytes of structured data and execute complex SQL queries by distributing the workload across multiple compute nodes. Which Azure service is designed for this Massively Parallel Processing (MPP) scenario?

  • Azure Synapse Analytics dedicated SQL pool

  • Azure Data Lake Storage Gen2

  • Azure Cosmos DB

  • Azure SQL Database single database
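
For context on the MPP pattern: a dedicated SQL pool spreads a table's rows across distributions so that multiple compute nodes can work on one query in parallel. A minimal sketch, assuming hypothetical server, database, and table names, using pyodbc:

    # Sketch: create a hash-distributed table in a Synapse dedicated SQL pool.
    # Connection details and table names are hypothetical placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myworkspace.sql.azuresynapse.net;"
        "DATABASE=salesdw;UID=loader;PWD=<password>"
    )
    conn.execute("""
        CREATE TABLE dbo.FactSales (
            SaleId    BIGINT NOT NULL,
            ProductId INT NOT NULL,
            Amount    DECIMAL(18, 2) NOT NULL
        )
        WITH (DISTRIBUTION = HASH(SaleId), CLUSTERED COLUMNSTORE INDEX);
    """)
    conn.commit()

The HASH distribution is what lets the engine split a large scan or join across its compute nodes.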

Question 2 of 20

You administer an Azure SQL Database. Developers repeatedly submit a multi-table join to produce a product sales summary. They want to refer to that result set by a single object name without duplicating data. Which database object best meets this requirement?

  • Create a nonclustered index on the joined columns.

  • Create an AFTER INSERT trigger to populate a summary table.

  • Create a view based on the join.

  • Create a table-valued function that returns the join results.
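
For reference, a view stores only its query definition, not a second copy of the data, so a join can be addressed by one object name. A minimal T-SQL sketch, run here through pyodbc with hypothetical table and column names:

    # Sketch: define a view over a multi-table join; only the definition is
    # stored, so no data is duplicated. All names are hypothetical.
    import pyodbc

    conn = pyodbc.connect("<connection string for the Azure SQL Database>")
    conn.execute("""
        CREATE VIEW dbo.ProductSalesSummary AS
        SELECT p.ProductName, SUM(s.Quantity * s.UnitPrice) AS TotalSales
        FROM dbo.Products AS p
        JOIN dbo.Sales AS s ON s.ProductId = p.ProductId
        GROUP BY p.ProductName;
    """)
    conn.commit()

    # The join can now be referenced by a single object name:
    rows = conn.execute("SELECT * FROM dbo.ProductSalesSummary;").fetchall()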

Question 3 of 20

You have a dataset with the columns Product Category, Year, and Total Sales. You must display how total sales for each category change across years so users can easily compare yearly trends. Which Power BI visual is most appropriate for this requirement?

  • Donut chart

  • Line chart

  • Card visual

  • Scatter chart

Question 4 of 20

Which of the following characteristics most clearly indicates that a database workload is transactional rather than analytical?

  • It requires every write operation to be committed using ACID properties to maintain data consistency.

  • It stores raw, semi-structured logs in a data lake using schema-on-read.

  • It primarily refreshes read-only dashboards with nightly batch loads.

  • It focuses on scanning petabytes of historical data to produce complex aggregations.
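
To ground the ACID option: in a transactional (OLTP) workload, related writes either commit together or not at all. A minimal sketch, with a hypothetical Accounts table, using pyodbc:

    # Sketch: an ACID transaction; both updates become visible atomically,
    # or neither does. The table is hypothetical.
    import pyodbc

    conn = pyodbc.connect("<connection string>", autocommit=False)
    try:
        conn.execute("UPDATE dbo.Accounts SET Balance = Balance - 100 WHERE Id = 1;")
        conn.execute("UPDATE dbo.Accounts SET Balance = Balance + 100 WHERE Id = 2;")
        conn.commit()    # atomicity: both writes commit together
    except pyodbc.Error:
        conn.rollback()  # consistency: nothing partial is left behind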

Question 5 of 20

You are reviewing objects in an Azure SQL Database. Which database object is primarily used to pre-compute and persist an ordered data structure that the query optimizer can use to accelerate data retrieval, without storing duplicate copies of the table's rows?

  • Trigger

  • View

  • Index

  • Stored procedure
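
For reference, an index keeps an ordered structure over chosen key columns and points back to the table's rows rather than copying them. A minimal T-SQL sketch with hypothetical names:

    # Sketch: a nonclustered index the query optimizer can use to avoid
    # full-table scans. Table and column names are hypothetical.
    import pyodbc

    conn = pyodbc.connect("<connection string>")
    conn.execute("""
        CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
        ON dbo.Orders (CustomerId)
        INCLUDE (OrderDate);
    """)
    conn.commit()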

Question 6 of 20

You plan to store semi-structured JSON documents in Azure Cosmos DB. You want to query the data by using a familiar SQL-like syntax and benefit from automatic indexing without defining a schema. Which Azure Cosmos DB API should you select?

  • Cassandra API

  • Core (SQL) API

  • Azure Cosmos DB for MongoDB API

  • Gremlin API
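
To illustrate the SQL-like query surface, a minimal sketch with the azure-cosmos Python SDK; the account URL, key, database, and container names are hypothetical:

    # Sketch: query schemaless JSON documents with SQL-like syntax. By default,
    # Cosmos DB automatically indexes every property, so no schema or index
    # definition is required before querying.
    from azure.cosmos import CosmosClient

    client = CosmosClient("https://myaccount.documents.azure.com:443/",
                          credential="<account key>")
    container = client.get_database_client("iot").get_container_client("readings")

    query = "SELECT c.deviceId, c.temperature FROM c WHERE c.temperature > 25"
    for item in container.query_items(query=query,
                                      enable_cross_partition_query=True):
        print(item)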

Question 7 of 20

Your company is building a photo-sharing website hosted on Azure. Each uploaded picture must be stored as an individual object, remain highly available, be accessible directly over HTTPS by a unique URL, and automatically move to cooler storage tiers as it ages to reduce costs. Which Azure storage service is the best fit for this scenario?

  • Azure File shares

  • Azure Blob storage

  • Azure Table storage

  • Azure Managed Disks
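
As a sketch of the object-per-picture pattern with the azure-storage-blob SDK (connection string, container, and file names are hypothetical):

    # Sketch: each uploaded picture is stored as one blob and is directly
    # addressable over HTTPS by a unique URL.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection string>")
    blob = service.get_blob_client(container="photos", blob="2024/06/cat.jpg")

    with open("cat.jpg", "rb") as f:
        blob.upload_blob(f, overwrite=True)

    print(blob.url)  # https://<account>.blob.core.windows.net/photos/2024/06/cat.jpg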

Question 8 of 20

An organization is designing an Azure-based analytics platform. The team wants to ingest large volumes of raw structured and unstructured files, keep storage independent from processing clusters, and rely on open file formats such as Parquet for future analysis. Which analytical data store should they implement?

  • An operational relational database hosted in Azure SQL Database

  • A dedicated SQL pool in Azure Synapse Analytics

  • A data lake that uses Azure Data Lake Storage Gen2

  • A globally distributed NoSQL database using Azure Cosmos DB

Question 9 of 20

You need to store high-resolution marketing images in Azure. Each image is downloaded often during the first month after upload; afterwards, it is rarely accessed but must remain available on demand at the lowest possible storage cost. Which Azure Blob storage feature should you use to meet this requirement without moving the data to another service?

  • Upgrade the storage account to use geo-redundant replication.

  • Enable Azure Files caching for the image container.

  • Store the images in Azure Table storage instead of blobs.

  • Move the blobs to the Cool access tier after 30 days.
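
The tier change can be applied per blob in code (or automated with a lifecycle management rule). A minimal sketch with azure-storage-blob and hypothetical names:

    # Sketch: move a rarely accessed blob to the Cool access tier. The data
    # stays in Blob storage and remains retrievable on demand, at a lower
    # storage cost than the Hot tier.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection string>")
    blob = service.get_blob_client(container="marketing", blob="banner-2023.png")
    blob.set_standard_blob_tier("Cool")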

Question 10 of 20

A marketing team wants to build an interactive dashboard in Power BI by simply typing questions such as "What were online sales last quarter?" and having the visual update automatically. Which Power BI capability should you recommend to meet this requirement?

  • Row-level security

  • Q&A visual (natural language query)

  • Quick Insights

  • Power Query Editor

Question 11 of 20

You need to store several terabytes of application log files and user-uploaded images. The data is unstructured, must be retrievable over HTTPS through REST APIs, and should automatically transition to cooler storage tiers as it ages. Which Azure storage service should you use?

  • Azure Table storage

  • Azure File storage

  • Azure Queue storage

  • Azure Blob storage

Question 12 of 20

Your company is designing a large-scale analytics architecture in Azure. The team needs a storage layer that can ingest and keep raw structured, semi-structured, and unstructured data without applying a schema up front, while scaling to petabytes at a comparatively low cost. Which type of analytical data store should they choose?

  • An in-memory cache hosted on Azure Cache for Redis

  • A relational data warehouse using Azure Synapse dedicated SQL pool

  • A data lake built on Azure Data Lake Storage Gen2

  • A NoSQL key-value store implemented with Azure Cosmos DB Table API
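
A minimal sketch of landing raw data in a data lake with the azure-storage-file-datalake SDK; the connection string, file system, and paths are hypothetical:

    # Sketch: write a raw file into a Data Lake Storage Gen2 file system.
    # No schema is applied on write; structure is imposed later, at read time.
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient.from_connection_string("<connection string>")
    fs = service.get_file_system_client("raw")
    file = fs.get_file_client("sensors/2024/06/readings.parquet")

    with open("readings.parquet", "rb") as f:
        file.upload_data(f, overwrite=True)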

Question 13 of 20

Your cloud app writes a new line of diagnostic data to the same file in Azure Blob Storage every minute. The existing content is never modified; data is only added to the end of the file. Which Azure blob type should you choose to optimize storage and write performance for this append-only pattern?

  • Azure File share

  • Append blob

  • Block blob

  • Page blob
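
The append-only pattern looks like this with azure-storage-blob (container and blob names are hypothetical):

    # Sketch: an append blob is optimized for add-to-the-end writes;
    # committed content is never rewritten.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<connection string>")
    blob = service.get_blob_client(container="diagnostics", blob="app.log")

    blob.create_append_blob()                       # once, when the log is created
    blob.append_block(b"2024-06-01T12:00:00Z OK\n") # every minute thereafter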

Question 14 of 20

A developer stores sensor readings as JSON files in Azure Blob Storage. Each file contains key-value pairs, but no fixed table schema is enforced. How should this data be classified for reporting purposes?

  • Unstructured data

  • Semi-structured data

  • Binary large objects (BLOBs)

  • Structured data
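
For context, semi-structured data is self-describing through its key-value pairs, but no table schema forces every record into the same shape; a hypothetical pair of readings:

    # Sketch: two readings from the same feed. Each carries its own structure,
    # yet no enforced schema requires them to share the same fields.
    reading_a = {"deviceId": "s-01", "temperature": 21.5}
    reading_b = {"deviceId": "s-02", "humidity": 48, "firmware": "2.1.0"}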

Question 15 of 20

Your Power BI data model includes a Date table with separate Year, Quarter, Month, and Day columns. Report users want to drill down from yearly totals to specific days within a single column chart, without manually adding columns in the visual. Which data-modeling feature should you configure to meet this requirement?

  • Create a hierarchy that contains the Year, Quarter, Month, and Day columns

  • Write a DAX measure to return totals at different date levels

  • Apply row-level security based on the Date table

  • Define a calculated column that concatenates Year, Quarter, Month, and Day

Question 16 of 20

A retailer plans to load several terabytes of historical point-of-sale data into an Azure Synapse Analytics dedicated SQL pool so that analysts can run quarterly sales trend reports. Which feature is typical of this analytical workload?

  • Queries are mostly read-intensive and return aggregated results across large data sets.

  • Data must be updated in real time with sub-second latency.

  • Strict row-level locking is required to prevent concurrent reads.

  • The workload consists of many small, short-lived transactions that modify individual rows.
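
As an example of the read-intensive, aggregate-heavy pattern, a hypothetical quarterly trend query run against the dedicated SQL pool through pyodbc:

    # Sketch: a typical analytical query; it scans many historical rows and
    # returns a small aggregated result instead of modifying individual rows.
    import pyodbc

    conn = pyodbc.connect("<connection string for the dedicated SQL pool>")
    rows = conn.execute("""
        SELECT DATEPART(quarter, SaleDate) AS Qtr, SUM(Amount) AS TotalSales
        FROM dbo.FactSales
        GROUP BY DATEPART(quarter, SaleDate)
        ORDER BY Qtr;
    """).fetchall()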

Question 17 of 20

A company lands high-volume sensor data as CSV files in Azure Data Lake Storage Gen2. Each night the data must be converted to Parquet, cleaned, and loaded into an Azure Synapse Analytics dedicated SQL pool for reporting. Which Azure data role typically designs and schedules the pipelines that perform this ETL process?

  • Security engineer

  • Data engineer

  • Data scientist

  • Database administrator

Question 18 of 20

You need to choose an Azure service to store large volumes of raw application log files produced every day in JSON and CSV formats. The files are several megabytes to gigabytes in size, must be accessed over HTTPS, should support tiered storage to reduce cost, and will later be ingested by Azure Data Factory for analytics. Which Azure storage option should you use?

  • Azure SQL Database

  • Azure Blob Storage

  • Azure Queue Storage

  • Azure Table Storage

Question 19 of 20

An application experiences short, unpredictable bursts of traffic and then remains idle for several hours. You need a Microsoft-hosted relational service that can automatically pause compute during idle periods, resume in roughly a minute when new requests arrive, and bill you only for the compute actually used. Which Azure service meets these requirements?

  • Azure SQL Database with the serverless compute tier

  • Azure SQL Managed Instance (General Purpose tier)

  • SQL Server installed on an Azure Virtual Machine

  • Azure Database for PostgreSQL Flexible Server

Question 20 of 20

Your company is deploying a new Azure SQL Database for a line-of-business app. Which activity is most likely assigned to the database administrator role rather than to a data engineer or data analyst?

  • Configure and manage the database's backup schedule and perform restores when needed.

  • Develop DAX measures and visualizations in Power BI dashboards for business users.

  • Train a customer-churn predictive model by using Azure Machine Learning datasets.

  • Build an end-to-end data pipeline that ingests streaming IoT data into the database.