Microsoft Azure Solutions Architect Expert Practice Test (AZ-305)

Use the form below to configure your Microsoft Azure Solutions Architect Expert Practice Test (AZ-305). The practice test can be configured to include only certain exam objectives and domains. You can choose between 5 and 100 questions and set a time limit.

Questions
Number of questions in the practice test
Free users are limited to 20 questions; upgrade for unlimited questions
Seconds Per Question
Determines how long you have to finish the practice test
Exam Objectives
Which exam objectives should be included in the practice test

Microsoft Azure Solutions Architect Expert AZ-305 Information

The Microsoft Azure Solutions Architect Expert AZ-305 exam is a pivotal certification for professionals who design and implement solutions on Microsoft's cloud platform. This exam validates a candidate's expertise in translating business requirements into secure, scalable, and reliable Azure solutions. Aimed at individuals with advanced experience in IT operations, including networking, virtualization, and security, the AZ-305 certification demonstrates subject matter expertise in designing cloud and hybrid solutions. Success in this exam signifies that a professional can advise stakeholders and architect solutions that align with the Azure Well-Architected Framework and the Cloud Adoption Framework for Azure.

The AZ-305 exam evaluates a candidate's proficiency across four primary domains. These core areas include designing solutions for identity, governance, and monitoring, which accounts for 25-30% of the exam. Another significant portion, 30-35%, is dedicated to designing infrastructure solutions. The exam also assesses the ability to design data storage solutions (20-25%) and business continuity solutions (15-20%). This structure ensures that certified architects possess a comprehensive understanding of creating holistic cloud environments that address everything from identity management and data storage to disaster recovery and infrastructure deployment.

The Strategic Advantage of Practice Exams

A crucial component of preparing for the AZ-305 exam is leveraging practice tests. Taking practice exams offers a realistic simulation of the actual test environment, helping candidates become familiar with the question formats, which can include multiple-choice, multi-response, and scenario-based questions. This familiarity helps in developing effective time management skills, a critical factor for success during the timed exam. Furthermore, practice tests are an excellent tool for identifying knowledge gaps. By reviewing incorrect answers and understanding the reasoning behind the correct ones, candidates can focus their study efforts more effectively on weaker areas.

The benefits of using practice exams extend beyond technical preparation. Successfully navigating these tests can significantly boost a candidate's confidence. As performance improves with each practice test, anxiety about the actual exam can be reduced. Many platforms offer practice exams that replicate the look and feel of the real test, providing detailed explanations for both correct and incorrect answers. This active engagement with the material is more effective than passive reading and is a strategic approach to ensuring readiness for the complexities of the AZ-305 exam.

  • Free Microsoft Azure Solutions Architect Expert AZ-305 Practice Test
  • 20 Questions
  • Unlimited
  • Design identity, governance, and monitoring solutions
  • Design data storage solutions
  • Design business continuity solutions
  • Design infrastructure solutions
Question 1 of 20

You are designing a centralized logging solution for a company that runs several hundred Azure virtual machines, Azure Kubernetes Service clusters, and Azure SQL databases distributed across 20 subscriptions. Security policy requires that:

  • All platform and workload logs are queryable within five minutes of creation.
  • Log data must be retained for at least seven years to satisfy regulatory audits.
  • Administrators will use Kusto Query Language (KQL) to troubleshoot and create alert rules.

Which approach meets the requirements while keeping administrative overhead low?

  • Send all diagnostic and activity logs to a single Log Analytics workspace configured for short-term retention, and enable Azure Monitor data export to an Azure Storage account that uses lifecycle policies for seven-year archival.

  • Deploy an Azure HDInsight cluster with Kafka to collect logs and store them in Azure Data Lake Storage, using Hive queries for reporting.

  • Create diagnostic settings on every resource to stream logs to Azure Event Hubs, then ingest the data into a third-party SIEM for storage and analysis.

  • Write logs directly to an Azure Storage account in the Cool tier and use Azure Synapse serverless SQL pool to query the data when needed.
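
For context, the troubleshooting workflow this scenario describes, KQL against a centralized Log Analytics workspace, can be scripted. Below is a minimal sketch using the azure-monitor-query Python package; the workspace GUID and the query itself are illustrative assumptions, not values from the question.

```python
# Minimal sketch: run a KQL troubleshooting query against a centralized
# Log Analytics workspace. The workspace GUID and query are placeholders.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Last heartbeat per VM within the five-minute freshness window the
# security policy requires.
query = """
Heartbeat
| where TimeGenerated > ago(5m)
| summarize LastSeen = max(TimeGenerated) by Computer
| order by LastSeen desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-guid>",  # assumed
    query=query,
    timespan=timedelta(minutes=10),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```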

Question 2 of 20

Contoso hosts a business-critical Azure SQL Database in the General Purpose tier. Compliance requires all full database backups to be kept for seven years. Administrators must also be able to restore the database to any point in time within the most recent 30 days with minimal management overhead. You need to recommend the most cost-effective native Azure solution that satisfies both requirements. Which approach should you recommend?

  • Configure a long-term retention (LTR) policy on the Azure SQL Database to store weekly full backups for seven years and use the service's automatic backups for 30-day point-in-time restore.

  • Create a Recovery Services vault and use Azure Backup to protect the Azure SQL Database with a seven-year retention policy.

  • Migrate the database to an Azure SQL Managed Instance in the Business Critical tier, enable auto-failover groups, and take weekly snapshots saved to Blob storage for seven years.

  • Schedule an Azure Automation runbook to export the database as weekly BACPAC files to Azure Storage and apply lifecycle rules to retain them for seven years.
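
A long-term retention policy like the one in the first option can be applied programmatically. Here is a hedged sketch using the azure-mgmt-sql package; the resource names are placeholder assumptions.

```python
# Sketch: keep weekly full backups for seven years via an LTR policy.
# Resource group, server, and database names are assumed placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.long_term_retention_policies.begin_create_or_update(
    resource_group_name="rg-data",     # assumed
    server_name="contoso-sql",         # assumed
    database_name="contosodb",         # assumed
    policy_name="default",
    parameters={
        "weekly_retention": "P7Y",     # retain each weekly full backup 7 years
        "monthly_retention": "PT0S",   # unused here
        "yearly_retention": "PT0S",
        "week_of_year": 0,
    },
)
print(poller.result().weekly_retention)

# The 30-day point-in-time window is governed separately by the service's
# automatic backups, via client.backup_short_term_retention_policies.
```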

Question 3 of 20

Contoso runs an Azure SQL Managed Instance in the Business Critical tier. Compliance requires the company to be able to restore the database to any point in time within the last seven days and to keep one full backup taken on the first day of every month for five years. The solution must minimize cost and administrative effort. Which approach should you recommend?

  • Enable Azure Site Recovery to replicate the managed instance to a secondary region and rely on recovery points for both short-term and long-term restores.

  • Use the built-in automated backups for Azure SQL Managed Instance, set point-in-time retention to seven days, and configure a long-term backup retention policy that stores monthly full backups in Azure Blob storage for 60 months.

  • Create an Azure Backup Recovery Services vault and apply a SQL Server in Azure VM backup policy to protect the managed instance with daily and monthly backups retained for five years.

  • Develop an Azure Automation runbook that exports a BACPAC file of the database to Azure Storage each month for five-year retention and disable the service's automated backups.

Question 4 of 20

Tailwind Traders operates in the Americas, EMEA, and APAC and will create hundreds of Azure subscriptions per region. Compliance requires that deployments stay within region-specific Azure locations and that every resource automatically inherits a costCenter tag from its subscription. Regional IT staff must be able to add further policies and control costs for their own subscriptions without affecting other regions. You need to recommend a governance design that minimizes ongoing administrative effort. What should you recommend?

  • Enable resource locks on each subscription and use Azure RBAC deny assignments to restrict regions, while requiring governance scripts to add the costCenter tag manually.

  • Apply Azure Blueprints at the resource-group level inside each subscription to enforce allowed regions and tagging, without using management groups.

  • Create a root management group, then a management group for each region (Americas, EMEA, APAC). Assign an Azure Policy initiative containing Allowed locations and Inherit costCenter tag to each regional management group and place regional subscriptions beneath them.

  • Create a single management group that holds all subscriptions and assign the Allowed locations and Inherit costCenter tag policies separately to every subscription with region-specific parameters.
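
Assigning an initiative at a regional management group, as the third option describes, is a single API call per region. Below is a hedged sketch with the azure-mgmt-resource package; the management-group names and the initiative ID are assumptions.

```python
# Sketch: assign a governance initiative to the EMEA management group.
# The initiative is assumed to bundle the built-in "Allowed locations"
# and "Inherit a tag from the subscription" policy definitions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import PolicyClient

client = PolicyClient(DefaultAzureCredential(), "<subscription-id>")

scope = "/providers/Microsoft.Management/managementGroups/mg-emea"  # assumed

assignment = client.policy_assignments.create(
    scope=scope,
    policy_assignment_name="emea-governance",
    parameters={
        "display_name": "EMEA regional governance",
        "policy_definition_id": (
            "/providers/Microsoft.Management/managementGroups/mg-root"
            "/providers/Microsoft.Authorization/policySetDefinitions"
            "/regional-governance"                       # assumed initiative
        ),
        "parameters": {
            "listOfAllowedLocations": {"value": ["westeurope", "northeurope"]},
            "tagName": {"value": "costCenter"},
        },
    },
)
print(assignment.id)
# Note: the inherit-tag policy uses the Modify effect, so a real assignment
# also needs a location and a managed identity (omitted in this sketch).
```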

Question 5 of 20

A company runs two Windows Server 2019 Azure virtual machines that host a stateless REST API. The VMs are deployed in a single availability set in the West Europe region and are front-ended by an Azure Load Balancer. A recent datacenter-wide outage in West Europe caused both VMs to become unavailable, violating the business requirement of a 99.99 percent uptime SLA for the compute tier. The solution must continue to use only the West Europe region and must not require changes to the application code. Which change should you recommend to meet the availability requirement?

  • Redeploy the application to an Azure Virtual Machine Scale Set configured for zone redundancy across three Availability Zones in West Europe.

  • Increase the fault domain count of the existing availability set from 2 to 3 to improve fault tolerance.

  • Enable Azure Site Recovery to replicate the two VMs to another region and configure automatic failover.

  • Add an Azure Standard Load Balancer in front of the existing availability set to distribute traffic evenly across the two VMs.
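
To make the zone-redundant scale-set option concrete, here is a hedged sketch using azure-mgmt-compute. All names, the password, and the subnet ID are placeholder assumptions, not values from the scenario.

```python
# Sketch: a Virtual Machine Scale Set spread across three Availability
# Zones in West Europe. Names and the subnet ID are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.virtual_machine_scale_sets.begin_create_or_update(
    "rg-api",           # assumed resource group
    "vmss-restapi",     # assumed scale set name
    {
        "location": "westeurope",
        "zones": ["1", "2", "3"],  # spread instances across three zones
        "sku": {"name": "Standard_D2s_v5", "tier": "Standard", "capacity": 3},
        "upgrade_policy": {"mode": "Automatic"},
        "virtual_machine_profile": {
            "os_profile": {
                "computer_name_prefix": "api",
                "admin_username": "azureuser",
                "admin_password": "<strong-password>",
            },
            "storage_profile": {
                "os_disk": {"create_option": "FromImage"},
                "image_reference": {
                    "publisher": "MicrosoftWindowsServer",
                    "offer": "WindowsServer",
                    "sku": "2019-datacenter-gensecond",
                    "version": "latest",
                },
            },
            "network_profile": {
                "network_interface_configurations": [{
                    "name": "nic",
                    "primary": True,
                    "ip_configurations": [{
                        "name": "ipcfg",
                        "subnet": {"id": "<subnet-resource-id>"},
                    }],
                }],
            },
        },
    },
)
poller.result()  # blocks until the scale set is provisioned
```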

Question 6 of 20

Your company ingests two data streams from a consumer IoT product. Devices send about 5 GB/hour of JSON telemetry; dashboards must query the last seven days of it with sub-100 ms latency, and the store must allow flexible schema changes. Devices also upload 2 MB JPEG images that are accessed frequently for 30 days and seldom afterward, but must be retained for five years. To meet these requirements at the lowest cost and administrative effort, which Azure storage combination should you recommend?

  • Azure Data Lake Storage Gen2 to store both telemetry and images in a single storage account with hierarchical namespace enabled

  • Azure Cosmos DB (NoSQL) with autoscale throughput for telemetry, and Azure Blob Storage with lifecycle rules to move images from the Hot tier to Cool after 30 days and to Archive after 180 days

  • Azure SQL Database Hyperscale for telemetry and Azure Files with the Cool access tier for images

  • Azure Cache for Redis to store telemetry and zone-redundant Premium SSD managed disks for images
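
The lifecycle rule named in the second option can be defined once on the storage account. A sketch with azure-mgmt-storage follows; the account name and container prefix are placeholders, and the delete-after-five-years step is an assumption added to match the stated retention requirement.

```python
# Sketch: tier images Hot -> Cool at 30 days, Cool -> Archive at 180 days,
# and delete after five years (~1825 days). Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

policy = client.management_policies.create_or_update(
    resource_group_name="rg-iot",      # assumed
    account_name="iotimagesstore",     # assumed
    management_policy_name="default",
    properties={
        "policy": {
            "rules": [{
                "enabled": True,
                "name": "tier-then-archive-images",
                "type": "Lifecycle",
                "definition": {
                    "filters": {
                        "blob_types": ["blockBlob"],
                        "prefix_match": ["images/"],   # assumed container path
                    },
                    "actions": {
                        "base_blob": {
                            "tier_to_cool": {"days_after_modification_greater_than": 30},
                            "tier_to_archive": {"days_after_modification_greater_than": 180},
                            "delete": {"days_after_modification_greater_than": 1825},
                        }
                    },
                },
            }]
        }
    },
)
print(policy.id)
```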

Question 7 of 20

You are preparing a migration plan for 200 on-premises VMware virtual machines that host a three-tier .NET application and a back-end SQL Server farm. You must determine which virtual machines can move to Azure with no remediation, obtain performance-based Azure VM size recommendations, visualize traffic dependencies to create migration groups, and generate an estimated monthly cost for running the workloads in Azure. Which Azure service or tool should you use to gather all of this information?

  • Create an Azure Migrate project and run the Discovery and assessment tooling with dependency visualization enabled.

  • Run the Azure Total Cost of Ownership (TCO) Calculator for the environment and map inter-VM traffic by using Service Map.

  • Configure Azure Site Recovery for the virtual machines and use Azure Cost Management + Billing to project expenses.

  • Export performance counters from vCenter to Azure Monitor and analyze them by using Azure Monitor Workbooks.

Question 8 of 20

A factory ingests 5 million JSON telemetry events per hour (~15 GB), each with 50-100 varying properties. Engineers need ad hoc queries over the latest seven days that return results in under two seconds when filtering by deviceId and any property. Data older than 30 days moves to Azure Data Lake Storage. For the hot data, select a storage service that 1) automatically indexes all JSON properties, 2) offers single-digit-millisecond read/write latency globally, backed by an SLA, and 3) provides elastic throughput with minimal administration. Which service meets these needs?

  • Azure Data Lake Storage Gen2 with hierarchical namespace enabled

  • Azure Table storage with partition and row key based on deviceId and timestamp

  • Azure SQL Database Hyperscale tier with JSON columns and secondary indexes

  • Azure Cosmos DB using the Core (SQL) API with provisioned or autoscale throughput
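
As a concrete illustration of the Cosmos DB option, the sketch below creates an autoscale container partitioned on /deviceId and runs the kind of property filter the engineers need. The endpoint, key, names, and throughput figure are placeholder assumptions.

```python
# Sketch: autoscale Cosmos DB (NoSQL) container keyed on /deviceId.
# Every JSON property is indexed by default, so ad hoc filters need no
# secondary-index management. Endpoint/key/names are placeholders.
from azure.cosmos import CosmosClient, PartitionKey, ThroughputProperties

client = CosmosClient("<account-endpoint>", credential="<account-key>")
db = client.create_database_if_not_exists("telemetry")

container = db.create_container_if_not_exists(
    id="events",
    partition_key=PartitionKey(path="/deviceId"),
    offer_throughput=ThroughputProperties(auto_scale_max_throughput=20000),
)

results = container.query_items(
    query="SELECT * FROM c WHERE c.deviceId = @id AND c.temperature > @t",
    parameters=[{"name": "@id", "value": "sensor-042"},
                {"name": "@t", "value": 75}],
    partition_key="sensor-042",   # single-partition read path
)
for item in results:
    print(item)
```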

Question 9 of 20

Your organization keeps hourly clickstream JSON files in an Azure Data Lake Storage Gen2 container. Data engineers must ingest the files, flatten nested JSON arrays, perform several joins, and load the results into Azure SQL Database every hour. They want a managed, code-free service that can orchestrate the workflow, provide visual data lineage, automatically spin up and shut down compute clusters, and integrate with Azure Monitor, all while minimizing operational overhead. Which Azure service best meets these requirements?

  • An Azure Stream Analytics job that processes the JSON files with SQL-style queries

  • Azure Functions triggered by Event Grid that call stored procedures to load data

  • Azure Data Factory Mapping Data Flows executed on managed Spark clusters

  • An Azure Logic Apps workflow that moves files from ADLS Gen2 to Azure SQL Database

Question 10 of 20

Your company plans to onboard several new Azure subscriptions for different business units. The security team must ensure that each subscription meets NIST SP 800-53 controls, bundles the required Azure Policy assignments with specific role assignments and a standardized resource hierarchy, supports versioning so changes can flow through dev, test, and production management groups, and can be assigned in a single step. Which Azure service should you recommend?

  • Azure Advisor recommendations

  • An Azure Policy initiative assigned at the root management group

  • Microsoft Defender for Cloud regulatory compliance dashboard

  • Azure Blueprints

Question 11 of 20

A retail company is modernizing its Azure-hosted e-commerce platform. When a new customer account is created, three independent microservices (email confirmation, resource provisioning, and analytics) must react to the event. The solution must: provide loose coupling with fan-out to multiple subscribers; support attribute-based filtering; guarantee at-least-once delivery; allow easy addition of future Azure service event sources such as Blob Storage; and require minimal infrastructure management. Which Azure service should you recommend as the core event distribution mechanism?

  • Azure Notification Hubs

  • Azure Service Bus topic with multiple subscriptions

  • Azure Event Hubs Standard namespace

  • Azure Event Grid using a custom topic
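
To make the fan-out pattern concrete, here is a minimal sketch of publishing the account-created event to an Event Grid custom topic with the azure-eventgrid package; the endpoint, key, and event shape are assumptions.

```python
# Sketch: publish a custom event; each of the three subscribers (email,
# provisioning, analytics) receives its own copy and can filter on
# attributes such as event_type. Endpoint and key are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridEvent, EventGridPublisherClient

client = EventGridPublisherClient(
    "https://<topic-name>.<region>-1.eventgrid.azure.net/api/events",  # assumed
    AzureKeyCredential("<topic-access-key>"),
)

event = EventGridEvent(
    subject="accounts/42",
    event_type="Contoso.Accounts.AccountCreated",  # used for filtering
    data={"accountId": "42", "tier": "retail"},
    data_version="1.0",
)
client.send(event)  # Event Grid provides at-least-once delivery
```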

Question 12 of 20

You administer two Windows Server 2022 Azure virtual machines that run a mission-critical workload in an availability set. The business dictates that, in the event of data corruption, you must be able to recover either the whole VM or individual files. The recovery point objective (RPO) for workload data must not exceed 15 minutes, and the solution must rely only on native Azure capabilities with minimal operational overhead. Which approach should you recommend?

  • Create an Azure Automation runbook that triggers managed disk incremental snapshots every 15 minutes and copies them to Azure Storage.

  • Configure Azure Site Recovery (ASR) to continuously replicate the virtual machines to a secondary Azure region.

  • Enable Azure Backup on the virtual machines with the shortest (hourly) backup schedule.

  • Enable Azure Site Recovery for the virtual machines and also enable Azure Backup with a daily backup policy.
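
For scale, the snapshot-based option boils down to one API call per disk. A sketch using azure-mgmt-compute follows; the disk and resource-group names are assumed.

```python
# Sketch: create an incremental snapshot of a VM's managed OS disk.
# Incremental snapshots store only the delta since the previous snapshot.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

client = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

disk = client.disks.get("rg-workload", "vm1-osdisk")   # assumed disk

poller = client.snapshots.begin_create_or_update(
    "rg-workload",
    "vm1-osdisk-snap-001",
    {
        "location": disk.location,
        "incremental": True,
        "creation_data": {"create_option": "Copy", "source_resource_id": disk.id},
    },
)
print(poller.result().provisioning_state)
```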

Question 13 of 20

A company hosts a media-streaming web app in the East US 2 region. User-uploaded videos are stored in an Azure general-purpose v2 storage account that currently uses locally redundant storage (LRS). After a recent region-wide outage, the CTO asks you to redesign the storage layer to achieve the following goals:

  • Videos must remain available for read access if the entire East US 2 region becomes unavailable.
  • Write operations must continue to be directed only to East US 2 to keep latency low.
  • The solution should meet or exceed a 99.99 percent SLA for read availability while keeping costs lower than options that add both zone and geo redundancy.

Which Azure Storage redundancy option should you recommend?

  • Zone-redundant storage (ZRS) in the East US 2 region

  • Read-access geo-redundant storage (RA-GRS)

  • Geo-zone-redundant storage (GZRS)

  • Geo-redundant storage (GRS)
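
Changing the account's redundancy is a SKU update on the existing account. A hedged sketch with azure-mgmt-storage; names are placeholders, and note that some redundancy changes run as a managed conversion rather than an instant update.

```python
# Sketch: move the account from LRS to read-access geo-redundant storage.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

account = client.storage_accounts.update(
    resource_group_name="rg-media",      # assumed
    account_name="mediavideostore",      # assumed
    parameters={"sku": {"name": "Standard_RAGRS"}},
)
# Writes continue to go to the primary (East US 2); the paired secondary
# exposes a read-only endpoint such as <account>-secondary.blob.core.windows.net.
print(account.sku.name)
```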

Question 14 of 20

Your company runs a mission-critical line-of-business application that uses an Azure SQL Database single database in the West Europe region. Management requires the database tier to provide at least 99.995 percent availability within the region and to fail over automatically with zero data loss during any planned or unplanned outage. Operational overhead must remain minimal, and the application connection string must not change. Which solution should you recommend?

  • Migrate to SQL Server on Azure virtual machines in an availability set with a SQL Server Failover Cluster Instance.

  • Switch to the General Purpose service tier and configure an auto-failover group between West Europe and North Europe.

  • Move the database to the Business Critical service tier and enable zone-redundant configuration.

  • Enable active geo-replication to a secondary database in North Europe.
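
The zone-redundant Business Critical configuration in the third option is a property on the database resource. A sketch with azure-mgmt-sql; the names and SKU size are assumptions.

```python
# Sketch: Business Critical database with zone-redundant replicas.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.databases.begin_create_or_update(
    resource_group_name="rg-lob",        # assumed
    server_name="contoso-weu-sql",       # assumed
    database_name="lobdb",               # assumed
    parameters={
        "location": "westeurope",
        "sku": {"name": "BC_Gen5", "tier": "BusinessCritical", "capacity": 4},
        "zone_redundant": True,  # replicas placed in separate availability zones
    },
)
print(poller.result().zone_redundant)
```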

Question 15 of 20

A company is modernizing an on-premises microservices application and plans to deploy containers to Azure. Mandatory requirements are:

  • Each microservice must automatically scale from 0 to hundreds of instances on HTTP or queue events.
  • Platform management of the underlying infrastructure must be minimized; developers must not administer Kubernetes clusters.
  • Built-in support is needed for Dapr service invocation and pub/sub between microservices.

Which Azure service should you recommend to host the containers?

  • Azure Container Apps

  • Azure Kubernetes Service (AKS) with the cluster autoscaler and KEDA add-on

  • Azure Container Instances orchestrated by Azure Logic Apps

  • Azure App Service for Containers running in the Premium v3 tier with autoscale rules

Question 16 of 20

Your company maintains 10,000 user accounts in several on-premises Active Directory forests and has multiple Azure subscriptions. The organization wants every user to have a single cloud identity while keeping their existing on-premises credentials. You must retire the current AD FS farm, avoid opening any additional inbound firewall ports, and enable Azure AD Conditional Access and multifactor authentication for Microsoft 365 and custom SaaS applications. Which identity management approach should you recommend?

  • Maintain AD FS federation and publish the AD FS farm through Azure AD Application Proxy.

  • Deploy Azure AD Domain Services and join all Azure virtual machines to the managed domain.

  • Configure Azure AD Password Hash Synchronization and disable on-premises authentication.

  • Implement Azure AD Pass-through Authentication with Seamless Single Sign-On and retire the AD FS infrastructure.

Question 17 of 20

A financial-services company runs its core workload in a single Azure SQL Database in the General Purpose tier. Administrators must be able to restore the database to any point in time within the last 30 days and also recover a full backup from any week during the next 10 years. The solution must require minimal ongoing management and incur the lowest possible storage cost. Which Azure SQL capability should you recommend?

  • Use Azure Backup to protect the database by installing the Azure Backup agent and scheduling weekly backups for 10 years.

  • Create an Automation runbook that exports the database as a BACPAC file to Azure Storage every week and deletes files older than 10 years.

  • Enable a long-term backup retention (LTR) policy on the Azure SQL Database to store weekly full backups in Azure Blob Storage for up to 10 years.

  • Configure active geo-replication to a secondary database in another region and keep the replica online for 10 years.

Question 18 of 20

A media company must migrate 800 TB of unstructured video files that currently reside on an on-premises NAS hosted in a secure, offline network segment. The WAN link to Azure is limited to 200 Mbps and is shared with other workloads, so the company wants to avoid saturating it. The migration must be completed within four weeks, and after the data is transferred there will be no further synchronization requirements. Which Azure service should you recommend to perform the migration?

  • Use AzCopy to copy the data over the existing 200 Mbps ExpressRoute circuit directly into Blob Storage.

  • Order an Azure Data Box Heavy appliance for an offline bulk transfer to Azure Blob Storage.

  • Implement Azure File Sync to replicate the NAS shares to an Azure file share and then tier cold data to the cloud.

  • Deploy Azure Data Box Gateway as a virtual appliance onsite and let it stream data continuously to Blob Storage.
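
A quick back-of-the-envelope calculation shows why an online copy cannot meet the four-week deadline even if the 200 Mbps link were fully dedicated:

```python
# 800 TB over a fully dedicated 200 Mbps link (ideal, no protocol overhead).
data_bits = 800 * 10**12 * 8    # 800 TB in bits
link_bps = 200 * 10**6          # 200 Mbps
days = data_bits / link_bps / 86_400
print(f"{days:.0f} days")       # ~370 days, versus a 4-week (28-day) window
```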

Question 19 of 20

Your company has a tenant with a production and a development subscription. Security requirements mandate that all production storage accounts use only General Purpose v2 with customer-managed keys, while development is exempt. You need a solution that enforces the requirement automatically across any current or future production subscriptions and provides a single compliance report. What should you recommend?

  • Assign an Azure Policy initiative to the production management group and a separate initiative to the development management group.

  • Create an Azure Blueprint that includes the policy and apply the blueprint to each subscription.

  • Create a custom RBAC role that denies creation of non-compliant storage accounts and assign it at the subscription level.

  • Assign individual policy definitions to each storage resource group in every subscription.

Question 20 of 20

You administer an on-premises SQL Server 2016 Always On availability group that hosts a 4-TB transactional database generating about 50 GB of new data daily. The business has approved migrating this workload to Azure SQL Managed Instance. The migration must allow thorough pre-cutover testing, keep the source database online during data movement, and limit the final cutover outage to less than 30 minutes. You want to minimize custom scripting and use Microsoft-provided tooling. Which migration approach should you recommend?

  • Perform an online migration with Azure Database Migration Service, enabling continuous replication until the planned cutover.

  • Implement log shipping to a SQL Server VM in Azure, then upgrade the VM in place to Managed Instance using the Azure portal.

  • Export a BACPAC using Azure Data Migration Assistant, then import the file into the managed instance.

  • Configure on-premises SQL Server as a publisher and Azure SQL Managed Instance as a transactional replication subscriber, then switch applications over.