A company is rolling out sensor-based monitoring in facilities that have spotty connectivity. They implement edge computing to reduce delays when processing critical sensor readings. Which approach best addresses frequent data slowdowns to ensure operations continue without disruption?
Create a single advanced data center for all processing tasks
Use a cloud-based caching layer for sensor output while transmitting data to one central platform
Deploy multiple central regions with identical application stacks
Position local processing modules at each site to handle sensor output close to the source
Placing compute resources where the data is generated ensures fast analysis and removes dependence on a distant facility's link. A single central data center, however advanced, still funnels every reading through a congested or unreliable network path. A cloud-based caching layer smooths delivery but cannot perform time-critical analysis when the link to the cloud is slow or down. Replicating identical stacks across multiple central regions still leaves remote sensors exposed to the same connection delays. Local processing modules keep handling sensor output continuously even over inconsistent links, which illustrates the defining advantage of edge computing: placing resources close to the data source.
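The pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the `EdgeNode` class, its threshold, and its buffering scheme are assumptions, not any specific product's API): critical readings are analyzed and acted on locally the moment they arrive, while results queue up for upload until the central link returns.

```python
from collections import deque

class EdgeNode:
    """Hypothetical edge node: analyzes sensor readings locally and
    buffers results for upload when the central link is down."""

    def __init__(self, threshold: float):
        self.threshold = threshold
        self.pending = deque()  # results awaiting upload to the central platform
        self.alerts = []        # critical readings acted on locally

    def process(self, reading: float, link_up: bool) -> None:
        # Analysis happens at the source: no round trip to a data center.
        if reading > self.threshold:
            self.alerts.append(reading)  # act immediately, even offline
        self.pending.append(reading)
        if link_up:
            self.flush()

    def flush(self) -> list:
        # Sync the buffered backlog once connectivity returns.
        uploaded = list(self.pending)
        self.pending.clear()
        return uploaded

node = EdgeNode(threshold=75.0)
node.process(80.0, link_up=False)  # critical reading handled despite the outage
node.process(60.0, link_up=False)
uploaded = node.flush()            # link restored: backlog synced in order
```

The key point the sketch makes is that the alert at 80.0 is raised during the outage; the central platform only matters for eventual synchronization, not for the time-critical decision.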