A systems administrator is troubleshooting a performance issue with a critical, multi-threaded database application running on a 32-core server. Users report that complex queries are taking much longer than usual to complete. The administrator observes that overall CPU utilization is below 20%, but two specific CPU cores are constantly at 100% utilization whenever the application is running a task. Other applications on the server are performing normally. What is the MOST likely cause of this performance bottleneck?
A memory leak within the application is consuming system resources.
An incompatible device driver was recently installed on the server.
The server's CPU is thermal throttling due to a cooling failure.
The application's CPU affinity has been improperly configured.
The correct answer is that the application's CPU affinity has been improperly configured. CPU affinity, or processor pinning, restricts a process to a specified set of CPU cores. In this scenario, the multi-threaded database application, which is designed to spread its work across many cores, is being forced to run on only two. Those two cores become saturated at 100% utilization and form a bottleneck, while the other 30 cores sit idle, which is why overall CPU utilization stays below 20%. This directly matches the symptoms described.
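On a Linux server, the administrator could confirm this diagnosis programmatically. The sketch below, assuming a Linux system (the `os.sched_getaffinity`/`os.sched_setaffinity` calls are Linux-specific), inspects how many cores the current process is allowed to run on; a pinned process would report a small set despite the machine having 32 cores:

```python
import os

# Linux-specific check: which cores may this process run on?
# A misconfigured multi-threaded app would show only a couple of
# cores here even though os.cpu_count() reports many more.
if hasattr(os, "sched_getaffinity"):
    allowed = os.sched_getaffinity(0)  # 0 = the current process
    total = os.cpu_count()
    print(f"Allowed on {len(allowed)} of {total} cores: {sorted(allowed)}")

    if len(allowed) < total:
        print("Process affinity is restricted - possible pinning misconfiguration.")

    # Hypothetical remediation (requires appropriate privileges):
    # widen the affinity mask back to all available cores.
    # os.sched_setaffinity(0, range(total))
```

The same check can be done from a shell with `taskset -p <pid>`, which prints the process's current affinity mask.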
A memory leak would manifest as steadily increasing RAM usage, eventually leading to system-wide sluggishness or crashes, not high utilization isolated to two specific CPU cores.
CPU thermal throttling is a protective measure in which the CPU slows itself down to prevent overheating. This would degrade performance across all cores, not drive exactly two cores to 100% while the rest sit idle.
An incompatible device driver is more likely to cause system instability, errors, or crashes (like a Blue Screen of Death) rather than this specific CPU utilization pattern.