AWS Certified Solutions Architect Professional SAP-C02 Practice Question
A financial services company is migrating its on-premises data center to AWS. The migration includes a 500 TB on-premises NAS that stores critical financial analytics data. The company has an existing 1 Gbps AWS Direct Connect connection, which is currently utilized at 40% capacity for other business operations. The project timeline requires the initial 500 TB data transfer to be completed within 30 days. After the initial transfer, a subset of the data, approximately 50 TB, will continue to be updated on-premises and requires ongoing synchronization with the target Amazon S3 bucket until the final application cutover in three months.
The company's security policy mandates end-to-end encryption for all data in transit. A solutions architect needs to design the most efficient and cost-effective migration strategy that meets these requirements.
Which approach should the architect recommend?
Use AWS Snowball Edge Storage Optimized devices for the initial 500 TB transfer. Then, use an AWS DataSync agent on-premises to perform ongoing synchronization over the Direct Connect link.
Deploy an AWS Storage Gateway in File Gateway mode on-premises. Use AWS DataSync to migrate the entire 500 TB of data from the NAS to the File Gateway to be uploaded to Amazon S3.
Use AWS Snowball Edge Storage Optimized devices for the initial bulk transfer. For ongoing synchronization, configure an AWS Transfer Family SFTP endpoint and use a scheduled script to sync changes.
Use AWS DataSync to transfer the entire 500 TB dataset over the Direct Connect connection. Schedule the DataSync task to run continuously until the migration is complete.
The correct answer is to use AWS Snowball Edge for the initial bulk transfer and AWS DataSync for the ongoing synchronization.

A 1 Gbps connection has a theoretical maximum throughput of about 10.8 TB per day. With only 60% of the bandwidth available (600 Mbps), the effective throughput is roughly 6.5 TB per day, so transferring 500 TB would take about 77 days, far exceeding the 30-day window. An offline transfer method is therefore required for the initial bulk migration, and AWS Snowball Edge Storage Optimized devices are designed for exactly this kind of large-scale data movement; multiple devices can be ordered and loaded in parallel to cover the full 500 TB.

For the ongoing synchronization of the 50 TB active dataset, AWS DataSync is the ideal managed service. It operates over the existing Direct Connect link, automates incremental transfers, encrypts data in transit with TLS, and validates data integrity after each transfer, making it more efficient and less operationally complex than a custom-scripted SFTP solution.
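The bandwidth arithmetic above can be verified in a few lines (the constants mirror the scenario's numbers; decimal units are used throughout, i.e. 1 TB = 1000 GB):

```python
# Sanity-check the transfer-time estimate from the explanation.

LINK_GBPS = 1.0           # Direct Connect link speed in gigabits per second
AVAILABLE_FRACTION = 0.6  # 40% of the link is already in use
DATASET_TB = 500
SECONDS_PER_DAY = 86_400

# 1 Gbps = 0.125 GB/s; full-link throughput per day, converted to TB
full_link_tb_per_day = LINK_GBPS / 8 * SECONDS_PER_DAY / 1000
effective_tb_per_day = full_link_tb_per_day * AVAILABLE_FRACTION
days_needed = DATASET_TB / effective_tb_per_day

print(round(full_link_tb_per_day, 1))  # 10.8
print(round(effective_tb_per_day, 1))  # 6.5
print(round(days_needed))              # 77
```

Seventy-seven days against a 30-day deadline is what rules out the online-only DataSync option, independent of any protocol overhead (which would only make the picture worse).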
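As a rough sketch of what the ongoing-sync piece could look like with boto3, the helper below assembles parameters for a scheduled DataSync task that copies only changed files and verifies integrity after each run. The ARNs are placeholders, not real resources, and the actual `create_task` call is left commented out because it requires pre-created DataSync locations and AWS credentials:

```python
# Hypothetical location ARNs -- placeholders for an NFS source location
# (the on-premises NAS) and an S3 destination location.
SRC_ARN = "arn:aws:datasync:us-east-1:111122223333:location/loc-source"
DST_ARN = "arn:aws:datasync:us-east-1:111122223333:location/loc-dest"

def build_task_request(source_arn: str, dest_arn: str) -> dict:
    """Assemble create_task parameters for an hourly incremental sync."""
    return {
        "SourceLocationArn": source_arn,
        "DestinationLocationArn": dest_arn,
        "Options": {
            # DataSync encrypts data in transit with TLS by default;
            # VerifyMode adds an integrity check after each transfer.
            "VerifyMode": "POINT_IN_TIME_CONSISTENT",
            "TransferMode": "CHANGED",  # copy only files that changed
        },
        "Schedule": {"ScheduleExpression": "rate(1 hour)"},
    }

# import boto3
# client = boto3.client("datasync")
# client.create_task(**build_task_request(SRC_ARN, DST_ARN))
```

The `CHANGED` transfer mode is what makes the recurring 50 TB sync cheap: after the first pass, each scheduled run moves only the delta rather than rescanning-and-copying the whole dataset.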
Exam domain: Accelerate Workload Migration and Modernization