
Microsoft Azure Developer Associate Practice Test (AZ-204)

Use the form below to configure your Microsoft Azure Developer Associate Practice Test (AZ-204). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.


Microsoft Azure Developer Associate AZ-204 Information

The Microsoft Azure Developer Associate (AZ-204) certification is a crucial credential for cloud developers specializing in the Microsoft Azure ecosystem. This exam is designed for professionals who are responsible for all phases of the development lifecycle, including gathering requirements, design, development, deployment, security, maintenance, performance tuning, and monitoring. Candidates should have 1-2 years of professional development experience, including hands-on experience with Microsoft Azure. The exam validates a developer's proficiency in leveraging Azure's tools, SDKs, and APIs to build and maintain cloud applications and services.

The AZ-204 exam assesses a broad set of skills across five primary domains. These areas include developing Azure compute solutions (25-30%), developing for Azure storage (15-20%), implementing Azure security (15-20%), monitoring, troubleshooting, and optimizing Azure solutions (5-10%), and connecting to and consuming Azure services and third-party services (20-25%). The exam itself consists of 40-60 questions and has a duration of about 100 minutes. The question formats can vary, including multiple-choice, scenario-based questions, and drag-and-drop tasks.

The Value of Practice Exams in Preparation

A critical component of a successful study plan for the AZ-204 exam is the use of practice tests. Taking practice exams offers several key benefits that go beyond simply memorizing facts. They help you become familiar with the style, wording, and difficulty of the questions you are likely to encounter on the actual exam. This familiarity can help reduce anxiety and improve time management skills during the test.

Furthermore, practice exams are an excellent tool for self-assessment. They allow you to gauge your readiness, identify areas of weakness in your knowledge, and focus your study efforts accordingly. By reviewing your answers, especially the incorrect ones, you can gain a deeper understanding of how different Azure services work together to solve real-world problems. Many candidates find that simulating exam conditions with timed practice tests helps build the confidence needed to think clearly and methodically under pressure. Microsoft itself provides a practice assessment to help candidates prepare and fill knowledge gaps, increasing the likelihood of passing the exam.

  • Free Microsoft Azure Developer Associate AZ-204 Practice Test
  • 20 Questions
  • Unlimited time
  • Develop Azure compute solutions
  • Develop for Azure storage
  • Implement Azure security
  • Monitor and troubleshoot Azure solutions
  • Connect to and consume Azure services and third-party services
Question 1 of 20

You are developing a .NET worker service that uploads images to an Azure Storage container. Each image must be stored with the content type set to "image/png" and with custom metadata key "source" set to "webapp". To minimize latency and transaction costs, you want to satisfy both requirements with a single service request. Which SDK call should you use?

  • Call BlobClient.UploadAsync(BinaryData content, overwrite: true).

  • Call BlobClient.StartCopyFromUriAsync with BlobCopyFromUriOptions that include metadata and headers.

  • Call BlobClient.UploadAsync(content, new BlobUploadOptions { HttpHeaders = new BlobHttpHeaders { ContentType = "image/png" }, Metadata = new Dictionary<string,string> { { "source", "webapp" } } });

  • First call BlobClient.SetMetadataAsync then call BlobClient.SetHttpHeadersAsync for the blob.
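For reference, the Blob Storage .NET SDK can set content, HTTP headers, and metadata in a single service request by passing BlobUploadOptions to UploadAsync. A minimal sketch — the connection string, container and blob names, and the pngBytes variable are assumed:

```csharp
using System.Collections.Generic;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// One upload call carries the bytes, the Content-Type header, and the metadata.
var blobClient = new BlobClient(connectionString, "images", "photo.png");

var options = new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders { ContentType = "image/png" },
    Metadata = new Dictionary<string, string> { { "source", "webapp" } }
};

await blobClient.UploadAsync(new BinaryData(pngBytes), options);
```

Setting metadata or headers afterwards with SetMetadataAsync/SetHttpHeadersAsync would each cost an extra round trip and transaction.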

Question 2 of 20

An HTTP-triggered Azure Functions app runs on the Consumption plan. Some requests perform intensive processing and take up to 25 minutes. Users report that these requests always fail after roughly 10 minutes. You must allow the function to complete successfully without modifying the function's code. Which action should you take?

  • Move the app to an Azure Functions Premium plan and set functionTimeout to 30 minutes in host.json.

  • Increase the instance count of the Consumption plan to its maximum scale-out limit.

  • Update host.json to set functionTimeout to 30 minutes while keeping the app on the Consumption plan.

  • Enable Always On for the existing Consumption plan function app.
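On the Consumption plan the timeout is capped at 10 minutes, while the Premium plan allows longer (or unbounded) runs. After moving the app, the timeout is raised in host.json — a sketch assuming no other host.json customizations:

```json
{
  "version": "2.0",
  "functionTimeout": "00:30:00"
}
```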

Question 3 of 20

You have the source code and a Dockerfile in the current directory on your workstation. The workstation does not have the Docker engine installed and has limited CPU resources. You need to build the image in Azure and push it to an Azure Container Registry named contosoacrdemo, tagging the resulting image as web:v1. Which Azure CLI command should you run?

  • docker build -t contosoacrdemo.azurecr.io/web:v1 .

  • az container create --registry-login-server contosoacrdemo.azurecr.io --image web:v1 .

  • az acr task create --registry contosoacrdemo --name buildweb --image web:v1 --context .

  • az acr build --registry contosoacrdemo --image web:v1 .

Question 4 of 20

You manage an Azure API Management instance that contains a product named Public and an API named Weather that is associated with the product. Every operation in API Management currently requires an Ocp-Apim-Subscription-Key header. You are migrating the Weather API to use Azure AD OAuth 2.0 bearer tokens instead of subscription keys. Calls to Weather must succeed when a valid Azure AD token is supplied even if no subscription key is present, and the requirement must not affect the other APIs in the Public product. What should you do in the Azure portal to meet the requirement?

  • Enable the global "Bypass subscription key" setting for the API Management gateway.

  • In the Settings blade of the Public product, clear the "Subscription required" option and save the change.

  • In the Settings blade of the Weather API, clear the "Subscription required" option and save the change.

  • Add a validate-jwt policy to the inbound section of the Weather API.

Question 5 of 20

You are building an ASP.NET Core worker service that runs inside an AKS cluster and pings several internal HTTP endpoints every minute. The endpoints are private and must not be tested from the public internet, but you want the results to appear in the Application Insights Availability blade and support alerting. Which implementation meets the requirement?

  • Create an AvailabilityTelemetry object for each probe and send it with TelemetryClient.TrackAvailability.

  • Publish a custom numeric value with TelemetryClient.TrackMetric representing success (1) or failure (0) for each probe.

  • Enable codeless Application Insights auto-instrumentation in the cluster and depend on Smart Detection to raise availability alerts.

  • Use TelemetryClient.TrackDependency when calling each endpoint and treat failures as availability issues.
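Results reach the Availability blade only when they are submitted as availability telemetry. A sketch of reporting one in-cluster probe with the Application Insights SDK — telemetryClient, probeSucceeded, and the probe/location names are assumed:

```csharp
using System;
using System.Diagnostics;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

var sw = Stopwatch.StartNew();
// ... perform the HTTP probe against the private endpoint here ...
sw.Stop();

telemetryClient.TrackAvailability(new AvailabilityTelemetry
{
    Name = "orders-internal-probe",   // shows up as a test name in the Availability blade
    Success = probeSucceeded,
    Duration = sw.Elapsed,
    RunLocation = "aks-westeurope",   // custom "test location" label
    Timestamp = DateTimeOffset.UtcNow
});
```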

Question 6 of 20

You are developing an Azure Container App named orders-worker that processes messages from an Azure Service Bus queue. The app must run zero replicas when the queue is empty but scale out to as many as 20 replicas as the number of pending messages grows. HTTP ingress is disabled. Which scaling rule type should you configure for orders-worker to meet these requirements?

  • Add a cron scaler rule that increases replica count during working hours.

  • Set minReplicas to 1 and configure CPU-based scaling only.

  • Define a KEDA scaling rule of type "azure-servicebus" with queueName and connection settings.

  • Enable external HTTP ingress and rely on the built-in HTTP autoscaler.
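A Service Bus scale rule in a Container Apps definition looks roughly like the fragment below; the queue name, secret name, and target message count are illustrative assumptions:

```yaml
# Fragment of a Container App spec.
scale:
  minReplicas: 0          # scale to zero when the queue is empty
  maxReplicas: 20
  rules:
    - name: queue-based-scaling
      custom:
        type: azure-servicebus      # KEDA scaler type
        metadata:
          queueName: orders
          messageCount: "10"        # target pending messages per replica
        auth:
          - secretRef: sb-connection-string
            triggerParameter: connection
```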

Question 7 of 20

Your e-commerce web app must allow unauthenticated clients to download pictures stored in an Azure Storage blob container during a marketing campaign that lasts two hours. After the campaign, you need a single action that immediately invalidates every URL you issued. Which technique should you implement?

  • A service-level SAS token that is associated with a stored access policy on the container.

  • A service-level SAS token created directly on each blob with a two-hour expiry.

  • Set the container access level to Blob (anonymous read access) and remove public access after two hours.

  • An account-level SAS token with read permissions limited to the blob service.
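The stored-access-policy pattern, sketched with the .NET SDK (the policy id "campaign" and container name are assumed): the SAS tokens reference the policy by Identifier instead of embedding their own expiry, so deleting the policy revokes every issued URL in one action.

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Sas;

// 1. Create a stored access policy on the container.
var container = new BlobContainerClient(connectionString, "pictures");
await container.SetAccessPolicyAsync(permissions: new[]
{
    new BlobSignedIdentifier
    {
        Id = "campaign",
        AccessPolicy = new BlobAccessPolicy
        {
            PolicyExpiresOn = DateTimeOffset.UtcNow.AddHours(2),
            Permissions = "r"
        }
    }
});

// 2. Issue SAS tokens that point at the policy rather than carrying their own terms.
var sasBuilder = new BlobSasBuilder { BlobContainerName = "pictures", Identifier = "campaign" };

// 3. Revoke everything at once: remove the policy, and every SAS referencing it dies.
await container.SetAccessPolicyAsync(permissions: Array.Empty<BlobSignedIdentifier>());
```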

Question 8 of 20

You are developing a .NET 6 App Service web app that must upload files to an Azure Storage account by using the Azure SDK. The solution must avoid storing any credentials in code, and the identity must remain available even if the web app is deleted and recreated in another region. What should you do?

  • Create an Azure AD application with a client secret, store the secret in Azure Key Vault, and retrieve it at runtime by using DefaultAzureCredential.

  • Create a user-assigned managed identity, assign it the Storage Blob Data Contributor role on the storage account, and associate the identity with the web app.

  • Generate a service SAS token for the storage account and store the token in an App Service application setting.

  • Enable a system-assigned managed identity on the web app and grant it the Storage Blob Data Contributor role on the storage account.
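With a user-assigned managed identity, no credential ever appears in code or configuration, and the identity survives deletion of the web app because it is a standalone resource. A sketch — the storage URL and client id placeholder are assumptions:

```csharp
using System;
using Azure.Identity;
using Azure.Storage.Blobs;

// The token comes from the identity associated with the App Service, not from code.
var credential = new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
    ManagedIdentityClientId = "<user-assigned-identity-client-id>"
});

var blobService = new BlobServiceClient(
    new Uri("https://contosostorage.blob.core.windows.net"), credential);
```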

Question 9 of 20

Your build script runs in Azure Cloud Shell and must build a Docker image from the current directory and publish it to an existing Azure Container Registry named contosoacr. The solution must not require the Cloud Shell environment to have the Docker engine installed. Which Azure CLI command should you run?

  • docker build -t contosoacr.azurecr.io/web:v1 . && docker push contosoacr.azurecr.io/web:v1

  • az acr import --name contosoacr --source . --image web:v1

  • az acr build --registry contosoacr --image web:v1 .

  • az container create --registry-login-server contosoacr.azurecr.io --image web:v1 --file Dockerfile

Question 10 of 20

You are developing a background service that runs on an Azure virtual machine and must read messages from users' mailboxes through Microsoft Graph without any interactive sign-in. You registered an app in Microsoft Entra ID, added the Mail.Read application permission, and will use MSAL.NET with the client-credentials flow. Which scope string should you supply to AcquireTokenForClient(...)?
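With the client-credentials flow, the token request always names the resource's /.default scope; the effective permissions (here, Mail.Read) come from the application permissions granted on the app registration, not from the scope string. A sketch with MSAL.NET — the id and secret placeholders are assumptions:

```csharp
using Microsoft.Identity.Client;

var app = ConfidentialClientApplicationBuilder
    .Create("<client-id>")
    .WithClientSecret("<client-secret>")   // or a certificate
    .WithAuthority("https://login.microsoftonline.com/<tenant-id>")
    .Build();

// App-only tokens use the resource-wide /.default scope.
var result = await app.AcquireTokenForClient(
    new[] { "https://graph.microsoft.com/.default" }).ExecuteAsync();
```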

Question 11 of 20

Your JavaScript single-page application (SPA) must call an Azure Function App that is secured by the Microsoft Identity platform. Both applications are registered in the same Microsoft Entra ID tenant. To ensure the SPA can obtain an access token that the Function App will accept, what should you configure in Azure AD?

  • Configure both registrations as public client/native applications.

  • Assign a system-assigned managed identity to the SPA and give it access to the Function App.

  • Expose a custom delegated scope in the Function App registration and grant that scope as an API permission to the SPA.

  • Enable the implicit grant flow (ID tokens) on the SPA registration only.

Question 12 of 20

Your ASP.NET Core API is already instrumented with the Application Insights .NET SDK. When an upstream service responds with HTTP 429, you want to record a diagnostic entry that is stored as a trace with severity Warning and that carries a custom dimension named RetryAfterSeconds. Which code snippet accomplishes this without any additional configuration changes?

  • telemetryClient.TrackEvent("Service throttling", new Dictionary<string,string> { { "RetryAfterSeconds", retryAfterSeconds.ToString() } }, null);

  • telemetryClient.TrackTrace("Service throttling", new Dictionary<string,string> { { "RetryAfterSeconds", retryAfterSeconds.ToString() } }, SeverityLevel.Warning);

  • telemetryClient.TrackTrace("Service throttling", TraceSeverity.Warning, new Dictionary<string,string> { { "RetryAfterSeconds", retryAfterSeconds.ToString() } });

  • telemetryClient.TrackTrace("Service throttling", SeverityLevel.Warning, new Dictionary<string,string> { { "RetryAfterSeconds", retryAfterSeconds.ToString() } });

Question 13 of 20

A general-purpose v2 storage account contains a container named payroll. Your application uploads each month's payroll PDFs to this container. For the first 30 days the files are read frequently; after that they are rarely accessed but must be kept for seven years. You need an automated solution that minimizes storage costs without affecting the first 30 days of usage. What should you do?

  • Create a lifecycle management rule that moves blobs in the payroll container to the Cool tier 30 days after the last modification and to the Archive tier 90 days after the last modification.

  • Modify the upload code so that each PDF is written directly to the Archive tier.

  • Set the default access tier of the payroll container to Cool so that new uploads are immediately stored in the Cool tier.

  • Enable blob soft delete on the storage account with a retention period of seven years.
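A lifecycle management policy that tiers blobs by age is defined declaratively on the storage account; a sketch of such a rule (the rule name and prefix are assumptions):

```json
{
  "rules": [
    {
      "name": "payroll-tiering",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "payroll/" ]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "tierToArchive": { "daysAfterModificationGreaterThan": 90 }
          }
        }
      }
    }
  ]
}
```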

Question 14 of 20

Your background service runs without user interaction and must periodically list all users in your Azure AD tenant by calling Microsoft Graph. Using the Microsoft Identity platform, which OAuth 2.0 grant type and Microsoft Graph permission type should you implement to meet the requirement?

  • Authorization code grant with a delegated permission such as User.Read.All

  • Device code grant with a delegated permission such as Group.Read.All

  • Client credentials grant with an application permission such as User.Read.All

  • Implicit grant with an application permission such as Directory.Read.All

Question 15 of 20

Your ASP.NET Core web app is instrumented with Application Insights and appears in Application Map. Users report pages that call an on-premises SQL Server run slowly. In the map you see a thick dependency line between the Web App node and the SQL node. What action inside Application Map lets you open a complete end-to-end trace of one of those slow calls to locate the delay?

  • Configure a Standard availability test and review its results in the Failures blade.

  • Start Live Metrics Stream from the toolbar to observe real-time server counters for the web app.

  • Select the dependency link and choose "Go to details" to open End-to-end transaction details.

  • Switch to the Usage workspace and open the Users view to evaluate average session duration.

Question 16 of 20

You are developing an ASP.NET Core Web API protected with the Microsoft Identity platform (v2 endpoint). Client apps will call the API either on behalf of a signed-in user (delegated flow) or as a daemon service (client-credentials flow). The API must programmatically verify the permission conveyed in the token. Which claim should the API evaluate in each scenario?

  • Delegated flow - check the scp (scope) claim; client-credentials flow - check the roles claim.

  • Delegated flow - check the roles claim; client-credentials flow - check the scp (scope) claim.

  • Delegated flow - check the groups claim; client-credentials flow - check the scope claim.

  • Delegated flow - check the aud claim; client-credentials flow - check the appid claim.
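Inside an ASP.NET Core handler the two flows surface as different claims on the same principal. A sketch — the claim-type URI, scope name, and app-role name are assumptions:

```csharp
using System.Linq;

// Delegated (user) tokens carry space-separated scopes in the scp claim.
var scp = User.FindFirst("http://schemas.microsoft.com/identity/claims/scope")?.Value;
bool delegatedOk = scp?.Split(' ').Contains("access_as_user") == true;

// App-only (client-credentials) tokens carry app roles in the roles claim.
bool appOnlyOk = User.FindAll("roles").Any(c => c.Value == "Orders.Read.All");
```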

Question 17 of 20

You are creating an Azure Functions app for a production workload. The function must avoid cold starts by keeping at least one instance warm, scale out automatically to thousands of concurrent executions when traffic spikes, integrate with an Azure Virtual Network for secure data access, and incur no per-execution billing charges. When configuring the hosting for the function app in the Azure portal, which hosting plan should you select?

  • App Service (Dedicated) plan

  • Consumption plan

  • Isolated (App Service Environment) plan

  • Premium plan

Question 18 of 20

You are configuring an alert in Azure Monitor for an Application Insights resource. The alert must trigger as close to real time as possible and must not generate additional Log Analytics ingestion or query charges. Which signal type meets both requirements?

  • A scheduled query alert that runs the same KQL query every minute.

  • A log-based metric created from a KQL query on the Exceptions table.

  • A custom metric emitted by an Azure Function after reading telemetry from Logs.

  • The built-in pre-aggregated Application Insights metric.

Question 19 of 20

Your team is building an ASP.NET Core 6 Web API that will be secured by the Microsoft Identity platform. The API must respond with HTTP 401 when no bearer token is present and with HTTP 403 when the token does not contain the access_as_user scope. Which Program.cs configuration meets these requirements?

  • Call AddAuthentication(JwtBearerDefaults.AuthenticationScheme).AddJwtBearer(options => { }); do not add additional authorization policies.

  • Call AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme).AddMicrosoftIdentityWebApp(configuration.GetSection("AzureAd")) and enable PKCE.

  • Call AddMicrosoftIdentityWebApiAuthentication(configuration, "AzureAd") and set JwtBearerOptions.SuppressMapInboundClaims = true without configuring extra policies.

  • Call AddAuthentication(JwtBearerDefaults.AuthenticationScheme).AddMicrosoftIdentityWebApi(configuration.GetSection("AzureAd")); then add an authorization policy that requires the access_as_user scope.
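The split of responsibilities: the JWT bearer handler returns 401 when no valid token is presented, and an authorization policy returns 403 when the token lacks the required scope. A Program.cs sketch using Microsoft.Identity.Web — the policy name is an assumption:

```csharp
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.Identity.Web;

var builder = WebApplication.CreateBuilder(args);

// Validates incoming bearer tokens against the AzureAd config section (401 on failure).
builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddMicrosoftIdentityWebApi(builder.Configuration.GetSection("AzureAd"));

// Requires the access_as_user scope on authorized endpoints (403 when missing).
builder.Services.AddAuthorization(options =>
{
    options.AddPolicy("AccessAsUser", policy => policy.RequireScope("access_as_user"));
});

var app = builder.Build();
app.UseAuthentication();
app.UseAuthorization();
```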

Question 20 of 20

You run a build pipeline on an Azure VM that acts as a self-hosted Azure DevOps agent. The pipeline builds Docker images and must push them to a Standard tier Azure Container Registry (ACR) named contosoacr. You must use the least-privileged Azure-native authentication method available. What should you configure?

  • Create an Azure AD application and assign it the Owner role at the subscription scope; authenticate with its client secret.

  • Enable the admin user on contosoacr and use its username and password in the pipeline.

  • Enable anonymous pull access on contosoacr and use docker push without authentication.

  • Grant the VM's system-assigned managed identity the AcrPush role on contosoacr.