
Microsoft Azure Developer Associate Practice Test (AZ-204)

Use the form below to configure your Microsoft Azure Developer Associate Practice Test (AZ-204). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.


Microsoft Azure Developer Associate AZ-204 Information

The Microsoft Azure Developer Associate (AZ-204) certification is a crucial credential for cloud developers specializing in the Microsoft Azure ecosystem. This exam is designed for professionals who are responsible for all phases of the development lifecycle, including gathering requirements, design, development, deployment, security, maintenance, performance tuning, and monitoring. Candidates should have 1-2 years of professional development experience, including hands-on experience with Microsoft Azure. The exam validates a developer's proficiency in leveraging Azure's tools, SDKs, and APIs to build and maintain cloud applications and services.

The AZ-204 exam assesses a broad set of skills across five primary domains: developing Azure compute solutions (25-30%); developing for Azure storage (15-20%); implementing Azure security (15-20%); monitoring, troubleshooting, and optimizing Azure solutions (5-10%); and connecting to and consuming Azure services and third-party services (20-25%). The exam itself consists of 40-60 questions and has a duration of about 100 minutes. Question formats vary and include multiple-choice, scenario-based, and drag-and-drop items.

The Value of Practice Exams in Preparation

A critical component of a successful study plan for the AZ-204 exam is the use of practice tests. Taking practice exams offers several key benefits that go beyond simply memorizing facts. They help you become familiar with the style, wording, and difficulty of the questions you are likely to encounter on the actual exam. This familiarity can help reduce anxiety and improve time management skills during the test.

Furthermore, practice exams are an excellent tool for self-assessment. They allow you to gauge your readiness, identify areas of weakness in your knowledge, and focus your study efforts accordingly. By reviewing your answers, especially the incorrect ones, you can gain a deeper understanding of how different Azure services work together to solve real-world problems. Many candidates find that simulating exam conditions with timed practice tests helps build the confidence needed to think clearly and methodically under pressure. Microsoft itself provides a practice assessment to help candidates prepare and fill knowledge gaps, increasing the likelihood of passing the exam.

Question 1 of 20

You are importing an OpenAPI 3.0 document to create a new API in Azure API Management. The specification contains operation-level examples, tags, summary text, and an externalDocs object. After the API is created, you will publish it through the built-in developer portal. Which statement accurately describes how the information from the specification is used in the portal?

  • externalDocs links are automatically transformed into policy snippets rendered in the portal's test console.

  • Operation examples defined in the specification are automatically displayed as sample requests and responses for each operation in the developer portal.

  • Tags on each operation are converted into separate API versions visible in the portal's version selector.

  • The summary and description elements are ignored during import and must be added manually in API Management before they appear in the portal.

Question 2 of 20

Your team added a URL ping availability test to Application Insights for an Azure web app. Whenever the metric Failed locations crosses the threshold of 0, the on-call engineer must get an SMS and the help-desk distribution list must get an email. Which configuration should you implement to satisfy this requirement without changing application code?

  • Configure an Azure Event Grid subscription on the Failed locations metric and trigger a Logic App that sends the SMS and email notifications.

  • In the availability test settings, list the engineer's phone number and the help-desk email address in the Alert recipients section.

  • Add the engineer and help-desk distribution list as owners of the Application Insights resource so they receive automatic alert emails.

  • Create an action group containing two Email/SMS/Push/Voice receivers (one SMS, one email) and associate that action group with a metric alert on the Failed locations metric.

Question 3 of 20

You are deploying a new Azure API Management (APIM) instance by using an ARM template. The template contains the following snippet:

"sku": {
  "name": "Consumption",
  "capacity": 2
}

When you run the deployment, template validation fails with the error "Property capacity is not allowed".

To complete the deployment successfully while keeping the billing model that charges per execution and scales automatically, what should you do?

  • Remove the capacity property (or set it to 0) and redeploy the Consumption tier.

  • Change the sku.name value to Developer and keep capacity set to 1.

  • Keep the sku settings and add the property "autoScale": "enabled" to the template.

  • Change the sku.name value to Basic and keep capacity set to 2.
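
For context, the Consumption tier is serverless and billed per execution, so its sku object does not accept a positive capacity value. A sketch of a sku block in the documented ARM shape (fragment only, not a complete template):

```json
"sku": {
  "name": "Consumption",
  "capacity": 0
}
```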

Question 4 of 20

You run an on-demand data-processing task in Azure Container Instances. The task completes in roughly 15 minutes. After a successful run (exit code 0), the container should stop so that billing ends. If the task fails (non-zero exit code), the container must restart automatically. Which restart policy should you set when creating the container group?

  • UnlessStopped

  • Always

  • OnFailure

  • Never
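
The restart-policy semantics can be illustrated with a small model. This is an illustrative sketch of how the documented ACI policies (Always, OnFailure, Never) decide whether to restart an exited container, not platform code; note that UnlessStopped is a Docker restart-policy name, not an ACI one.

```python
# Illustrative model of Azure Container Instances restart policies.
# ACI supports three values: Always, OnFailure, and Never.
def should_restart(policy: str, exit_code: int) -> bool:
    if policy == "Always":
        return True            # restart regardless of how the container exited
    if policy == "OnFailure":
        return exit_code != 0  # restart only when the run failed
    if policy == "Never":
        return False           # run once, then stop (and stop billing)
    raise ValueError(f"{policy!r} is not an ACI restart policy")

print(should_restart("OnFailure", 0))    # successful run -> no restart
print(should_restart("OnFailure", 137))  # failed run -> restart
```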

Question 5 of 20

You develop an Azure Function that uploads telemetry files to the incoming container of an Azure Storage account. The function runs daily between 02:00 and 03:00 UTC. Security policy mandates a service SAS that lets the function create or overwrite blobs without listing the container and is valid only during the execution window. Which SAS token meets these requirements?

  • sp=rl&st=2024-10-15T00:00Z&se=2024-10-15T23:59Z

  • sp=cw&st=2024-10-15T02:00Z&se=2024-10-15T03:00Z

  • sp=cwdl&st=2024-10-15T02:00Z&se=2024-10-15T03:00Z

  • sp=wl&st=2024-10-15T02:00Z&se=2024-10-15T03:00Z
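
As background for reading the options: in a service SAS, sp lists the granted permission letters (r = read, c = create, w = write, d = delete, l = list), and st/se bound the validity window. A minimal sketch that assembles only the unsigned query portion (a usable SAS also carries sv, sr, and an HMAC signature, which are omitted here):

```python
from urllib.parse import urlencode

def build_sas_query(permissions: str, start: str, expiry: str) -> str:
    """Assemble the permission/validity parameters of a service SAS.

    Sketch only: a real SAS also needs sv, sr, and a sig value computed
    with the storage account key or a user delegation key.
    """
    return urlencode({"sp": permissions, "st": start, "se": expiry})

# "cw" grants create + write (create or overwrite blobs) without read,
# delete, or list; st/se pin the token to the 02:00-03:00 UTC window.
print(build_sas_query("cw", "2024-10-15T02:00Z", "2024-10-15T03:00Z"))
```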

Question 6 of 20

You are preparing to deploy a new Azure API Management instance for a production workload. The instance must join an Azure virtual network in internal mode so it can reach on-premises systems, allow you to add additional capacity units later without redeployment, and use custom hostnames for the gateway and developer portal. Which pricing tier should you choose when you create the instance?

  • Standard tier

  • Developer tier

  • Basic tier

  • Premium tier

Question 7 of 20

You plan to create a new Azure API Management (APIM) instance in the Azure portal. During the first page of the creation wizard, you must enter several values. After the resource is provisioned, which setting cannot be modified without recreating the APIM service?

  • The organization (publisher) name shown in the developer portal

  • The pricing tier (SKU) assigned to the instance

  • The service name that forms the *.azure-api.net public endpoint

  • The administrator email address used for notifications

Question 8 of 20

You need to provision a new Azure App Service Web App named "contoso-api" for a .NET 7.0 application. The web app must run on Linux and be placed in the existing resource group "RG1" and the existing App Service plan "asp-linux" (located in the same resource group). Which Azure CLI command meets the requirements?

  • az webapp create --resource-group RG1 --plan asp-linux --name contoso-api --runtime "DOTNET:7.0" --os-type Linux

  • az webapp create --resource-group RG1 --name contoso-api --runtime "DOTNETCORE|7.0" --os-type Linux

  • az webapp create --resource-group RG1 --plan asp-linux --name contoso-api --runtime "DOTNETCORE|7.0" --os-type Linux

  • az webapp create --resource-group RG1 --plan asp-linux --name contoso-api --runtime "DOTNETCORE|7.0" --os-type Windows

Question 9 of 20

You are building a .NET application that uses the Azure.Storage.Blobs v12 SDK. Before processing an existing block blob you must determine its size (ContentLength) and read the value of a custom metadata key named "department." You must accomplish this without downloading the blob's content and while making as few service calls as possible. Which SDK action should you take?

  • Call BlobClient.GetTagsAsync and read the Tags dictionary.

  • Call BlobClient.GetPropertiesAsync and read the returned BlobProperties values.

  • Call BlobClient.DownloadContentAsync and inspect the BlobDownloadResult.

  • Call BlobClient.SetMetadataAsync with an empty dictionary, then read the response headers.

Question 10 of 20

You manage an Azure API Management instance that exposes several APIs to tenants who authenticate with a subscription key. The solution must block a tenant that makes more than 50 requests within any 60-second window, but never throttle requests originating from the corporate VNet (10.0.0.0/16). Which policy configuration meets the requirement?

  • Configure a validate-subscription policy in the outbound section and use a header transform rule to return HTTP 429 after 50 calls.

  • Apply a rate-limit policy with calls set to 50 and renewal-period set to 60 seconds in the backend section of the API.

  • Add an ip-filter policy that allows 10.0.0.0/16, followed by a rate-limit-by-key policy with calls set to 50, renewal-period set to 60 seconds, and counter-key set to the subscription key.

  • Add an ip-filter policy that allows 10.0.0.0/16, followed by a quota-by-key policy with calls set to 50, renewal-period set to 60 seconds, and counter-key set to the subscription key.
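
The per-tenant throttling behavior described in the scenario can be modeled locally. This toy fixed-window counter mimics the semantics of rate-limit-by-key: each counter-key (here, a subscription key) gets its own budget of calls per renewal-period, so one tenant's burst cannot throttle another. The class and its internals are illustrative, not APIM's implementation.

```python
import time
from collections import defaultdict

class FixedWindowLimiter:
    """Toy model of rate-limit-by-key semantics: at most `calls`
    requests per `renewal_period` seconds, counted separately per key."""

    def __init__(self, calls, renewal_period):
        self.calls = calls
        self.renewal_period = renewal_period
        self.windows = defaultdict(lambda: [0.0, 0])  # key -> [window_start, count]

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        window = self.windows[key]
        if now - window[0] >= self.renewal_period:
            window[0], window[1] = now, 0   # window elapsed: reset the counter
        if window[1] >= self.calls:
            return False                    # over budget: APIM would return 429
        window[1] += 1
        return True

limiter = FixedWindowLimiter(calls=50, renewal_period=60)
results = [limiter.allow("tenant-a", now=0.0) for _ in range(51)]
print(results.count(True))                 # 50 calls pass, the 51st is throttled
print(limiter.allow("tenant-b", now=0.0))  # other keys keep their own budget
```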

Question 11 of 20

A container named Transactions already contains several years of data. A background .NET service must forward only new inserts and updates to Azure Service Bus. The solution must 1) ignore existing items, 2) resume from the last processed change after restarts, and 3) allow multiple service instances without duplicate processing. Which implementation meets the requirements?

  • Use the Azure Cosmos DB Change Feed Processor SDK, set ChangeFeedStartFrom.Now, and store leases in a dedicated lease container.

  • Create a change-feed iterator with ChangeFeedStartFrom.Beginning and keep continuation tokens in memory.

  • Configure a 1-second TTL on the container and process deleted documents from the change feed.

  • Enable full-fidelity change feed mode and poll the container every minute for the latest _lsn values.
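
The "start from now, checkpoint durably" behavior the requirements describe can be sketched in plain Python. The lease dictionary stands in for a lease container; the real Change Feed Processor in the Cosmos DB SDK also handles partition distribution across instances, so treat this purely as an illustration of the semantics.

```python
def make_processor(feed, lease_store, key="partition-0"):
    """Toy change-feed reader: skips items that existed before the first run
    and resumes from the persisted checkpoint after a restart."""
    if key not in lease_store:
        lease_store[key] = len(feed)   # "start from now": ignore existing items

    def poll():
        start = lease_store[key]
        batch = feed[start:]           # only changes since the last checkpoint
        lease_store[key] = len(feed)   # checkpoint after processing
        return batch

    return poll

feed = ["old-1", "old-2"]             # data that predates the service
leases = {}                           # stands in for the lease container
poll = make_processor(feed, leases)
print(poll())                         # [] - historical items are ignored
feed.append("new-1")
print(poll())                         # ['new-1']
poll2 = make_processor(feed, leases)  # simulated restart: the lease survives
feed.append("new-2")
print(poll2())                        # ['new-2'] - resumes, no duplicates
```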

Question 12 of 20

When using the Azure.Storage.Blobs v12 .NET SDK, you need to read the custom metadata you previously added to a block blob without downloading any of the blob data. Which BlobClient method should you call to obtain the metadata in the response?

  • GetPropertiesAsync

  • DownloadContentAsync

  • SetMetadataAsync

  • GetTagsAsync

Question 13 of 20

You are developing a background service in C# that must react to inserts and updates in an Azure Cosmos DB container named Orders. The service must guarantee that each document is processed exactly once, even after the process restarts, and it must automatically redistribute the workload when you scale the service out to multiple instances. Which implementation should you use to meet both requirements?

  • Create a ChangeFeedProcessor instance, assign a separate lease container, and register the processor to handle changes.

  • Add a SQL API pre-trigger that calls an Azure Function on every insert and update to Orders.

  • Enable analytical store on the Orders container and periodically query items where the _ts value is greater than the last processed timestamp.

  • Use GetChangeFeedIterator starting from the beginning of time and keep the continuation token in application memory.

Question 14 of 20

An ASP.NET Core web app is hosted in Azure App Service. The app reads its settings from Azure App Configuration. Several settings are stored as Key Vault references, for example:

ConnectionStrings--Sql = @Microsoft.KeyVault(SecretUri=https://contosokv.vault.azure.net/secrets/SqlConnection)

When the application starts, it throws a Forbidden (403) error while trying to resolve the reference. You must fix the issue without storing any credentials in code or in App Configuration.

What should you do?

  • Enable diagnostic logging for the Key Vault and send the logs to Application Insights.

  • Enable a system-assigned managed identity for the App Service resource and assign it the Key Vault Secrets User role on the vault.

  • Grant the App Service resource the App Configuration Data Reader role on the Key Vault.

  • Store an Azure AD application client secret in App Configuration and have the code read that secret to access Key Vault.

Question 15 of 20

An Azure Storage account named contosoimages hosts a container for customer uploads. You are developing an ASP.NET Core Web API that must return a six-hour SAS URL so clients can upload one blob. Security requires Azure AD authentication only, no storage account keys, and the token must be revocable by disabling the API's identity. Which approach should you implement?

  • Generate a user delegation SAS by calling GetUserDelegationKey and building the SAS with BlobSasBuilder.

  • Set the container access level to Blob and return the blob URL without a SAS.

  • Generate a service SAS by signing BlobSasBuilder with the storage account key.

  • Generate an account SAS in the Azure portal and store it in Azure Key Vault.

Question 16 of 20

You need to deploy a Linux-based Azure Container Instance (ACI) that will pull the image api-backend:1.0 from the private Azure Container Registry myacr.azurecr.io. Company policy forbids storing registry usernames or passwords in deployment scripts, so the ACI must authenticate to the registry by using a previously created user-assigned managed identity. When you run az container create, which additional parameter satisfies this requirement?

  • --assign-identity /subscriptions//resourceGroups/rg1/providers/Microsoft.ManagedIdentity/userAssignedIdentities/aci-pull-id

  • --use-acr-credential

  • --registry-username $(ACR_USER) --registry-password $(ACR_PASS)

  • --secure-context-type acr

Question 17 of 20

You are designing a microservice that stores user shopping carts in an Azure Cosmos DB account configured with multiple read regions. Each user must always see their most recent updates immediately after they are written, but global read latency must be as low as possible and cross-user strong consistency is unnecessary. Which consistency level meets these requirements?

  • Bounded staleness

  • Consistent prefix

  • Strong

  • Session

Question 18 of 20

You manage an Azure API Management instance that protects its operations with Azure AD-issued JWT bearer tokens. Compliance requires that every tenant, identified by the tenantId claim inside each token, be limited to at most 1,000 calls per one-hour period across the entire API. Other tenants must not be affected by a busy tenant's traffic. Which inbound policy should you implement, and how should you configure it to meet the requirement?

  • Insert a quota-by-key policy with calls="1000", renewal-period="3600", counter-key="@(context.Request.Headers["tenantId"])", applied at the API scope.

  • Insert a rate-limit-by-key policy with calls="1000", renewal-period="3600", counter-key="@(context.Principal.Claims["tenantId"].Value)", applied at the API scope.

  • Declare a set-variable policy that stores tenantId, followed by a quota policy referencing that variable to cap requests at 1,000 per hour.

  • Insert a rate-limit policy with calls="1000", renewal-period="3600" at the product scope; no key is needed because the policy counts per caller automatically.

Question 19 of 20

You register a single-tenant web API named ContosoApi in Microsoft Entra ID. A separate daemon application will call the API by using the client-credentials grant. The API must authorize calls only when the incoming access token contains the role Orders.ReadWrite and there is no user context. Which configuration should you perform for ContosoApi in the Azure portal?

  • Create an Azure RBAC role assignment granting the client application Contributor access to the ContosoApi App Service.

  • Create an application role named Orders.ReadWrite in ContosoApi and assign that role to the client application's service principal.

  • Define a delegated permission scope named Orders.ReadWrite in ContosoApi and require admin consent for the client application.

  • Add optional JWT claims for roles in ContosoApi and mark the claim as essential.

Question 20 of 20

An Azure App Service Web App you manage must comply with security standards that prohibit TLS 1.0 and TLS 1.1. You have already uploaded a valid TLS certificate and bound it to your custom domain. Which portal configuration will ensure the app only accepts connections that negotiate TLS 1.2 or later?

  • Add the sslFlags attribute to the web.config file to require SSL negotiation.

  • Create a new HTTPS listener in Azure Application Gateway that allows only the TLS1_2 protocol.

  • Set Minimum TLS Version to 1.2 in the TLS/SSL settings blade of the App Service.

  • Add an application setting named WEBSITE_DISABLE_TLS12 with a value of true.