A medical research team uses short-lived container instances to process large genomic files. Any data a container generates is destroyed when the container is redeployed. Which method will help the team retain important results after each container is replaced?
Use environment variables to store each container's results
Place the results in memory caches using each container's network settings
Include the results in the container image so they persist through restarts
Attach a file system that is maintained beyond container termination
Attaching a file system that exists outside the container life cycle ensures the data is not destroyed when a container is replaced. Environment variables are suited to small configuration values, not large result sets. Embedding data in the container image would require rebuilding the image after every processing run, which is impractical. Memory-based solutions lose their contents when processes restart, making them unreliable for retaining critical data.
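As a minimal sketch of this pattern, the config fragment below assumes a Kubernetes environment (all names and sizes are illustrative): a PersistentVolumeClaim provides storage that outlives any individual pod, and the container mounts it so results written there survive replacement.

```yaml
# Illustrative Kubernetes manifest; names, image, and sizes are hypothetical.
# The PersistentVolumeClaim is independent of the pod life cycle, so data
# written under /results persists when the container is replaced.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: genomics-results
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 100Gi
---
apiVersion: v1
kind: Pod
metadata:
  name: genomics-worker
spec:
  containers:
    - name: processor
      image: genomics-pipeline:latest   # hypothetical image
      volumeMounts:
        - name: results
          mountPath: /results           # container writes outputs here
  volumes:
    - name: results
      persistentVolumeClaim:
        claimName: genomics-results
```

The same idea applies outside Kubernetes, for example a Docker named volume (`docker run -v results:/results …`): the storage is managed separately from the container, so redeployment does not erase it.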