An organization obtains suspicious logs from multiple external data providers and wishes to combine them with local sources for deeper analysis. Which method is BEST for maintaining a scalable, adaptable process for adding new providers swiftly?
Archive all event logs in a shared folder system to be searched on demand
Adopt a unified platform that structures logs from external and internal sources under one flexible data model
Implement a scanning tool that queries each external data provider separately for suspicious indicators
Keep logs in separate repositories for each source and merge them manually when necessary
Using a single platform that merges external and local logs under one flexible data model supports advanced correlation and rapid integration of new sources: each new provider only requires mapping its feed into the shared schema. Storing each feed in its own repository, searching a shared folder on demand, or querying each provider separately introduces inefficiencies and inconsistent views of the data, limiting the ability to correlate events effectively.
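As a rough illustration of why a unified model scales, the sketch below normalizes two hypothetical feeds (an external "provider_a" and a local log source — all field names are invented for this example) into one shared schema. Onboarding a new provider then means writing one small adapter and registering it, rather than building a new pipeline; correlation becomes a simple group-by on the shared indicator field.

```python
from datetime import datetime, timezone

# Hypothetical adapters: each feed is mapped into one shared schema
# (timestamp, source, indicator) so new providers need only a small
# normalizer function, not a separate storage or search pipeline.

def normalize_provider_a(event: dict) -> dict:
    # Assume provider A reports epoch seconds ("ts") and an "ioc" field.
    return {
        "timestamp": datetime.fromtimestamp(event["ts"], tz=timezone.utc).isoformat(),
        "source": "provider_a",
        "indicator": event["ioc"],
    }

def normalize_local(event: dict) -> dict:
    # Assume local logs already carry ISO-8601 timestamps.
    return {
        "timestamp": event["time"],
        "source": "local",
        "indicator": event["suspicious_ip"],
    }

# Registry: adding a provider is one entry plus one adapter.
NORMALIZERS = {"provider_a": normalize_provider_a, "local": normalize_local}

def ingest(feed_name: str, events: list) -> list:
    return [NORMALIZERS[feed_name](e) for e in events]

unified = (
    ingest("provider_a", [{"ts": 1700000000, "ioc": "203.0.113.7"}])
    + ingest("local", [{"time": "2023-11-14T22:13:20+00:00",
                        "suspicious_ip": "203.0.113.7"}])
)

# With one schema, cross-source correlation is a simple group-by
# on the shared "indicator" field.
by_indicator = {}
for e in unified:
    by_indicator.setdefault(e["indicator"], []).append(e["source"])
```

Here the same indicator (`203.0.113.7`) seen in both an external feed and local logs ends up grouped under one key, which is exactly the correlation that siloed repositories make difficult.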