During a peer review, auditors discover that they cannot rebuild a credit-risk model from the project's repository because the specific versions of Python libraries and system packages used during training are unknown. The current repository already contains a detailed model card, a comprehensive data dictionary, well-commented training notebooks, and a versioned change log. Which additional documentation artifact would most directly resolve the auditors' reproducibility concern?
A statistical quality-control chart that tracks prediction latency trends for the deployed API.
A model interpretability report summarizing SHAP value distributions for the most important features.
A system architecture diagram illustrating container orchestration and network endpoints in production.
A dependency lockfile or environment specification that captures exact library and system package versions.
Reproducing a model end-to-end requires an exact snapshot of the software environment in which it was trained. A dependency lockfile or environment specification (such as a pinned requirements.txt, a conda environment.yml, or a Dockerfile) enumerates every library, and in the Dockerfile's case every system package, at the exact version needed to recreate that environment, eliminating dependency drift. Interpretability reports, architecture diagrams, and latency dashboards all convey valuable information, but none of them record the precise package versions required to rebuild and rerun the training pipeline, so they do not resolve the auditors' core concern.
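For illustration, a minimal conda environment specification might look like the sketch below. The package names and version pins are hypothetical examples, not the versions used by the credit-risk model in the scenario; a pinned requirements.txt, a lockfile produced by a tool such as pip-tools or Poetry, or a Dockerfile could serve the same purpose.

```yaml
# Illustrative environment.yml (hypothetical packages and versions).
# Committed alongside the model card and change log, it lets auditors
# recreate the training environment exactly.
name: credit-risk-model
channels:
  - conda-forge
dependencies:
  - python=3.10.13
  - numpy=1.26.4
  - pandas=2.1.4
  - scikit-learn=1.3.2
  - pip
  - pip:
      - shap==0.44.0
```

With such a file in the repository, auditors can rebuild the environment with a single command, for example `conda env create -f environment.yml`, rather than guessing which versions were installed at training time.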