Question
I'm speaking specifically about a C# .NET 4 program running on Windows XP or higher, but general answers are also acceptable.
Assume an already optimized and efficient program. The problem here is entirely down to effects of high CPU usage on hardware, and whether a high-usage program should be throttled to reduce wear, not on whether my implementation is efficient.
A colleague today suggested that I should not aim for 100% CPU utilization on my data load processes because "modern CPUs are cheap and will degrade quickly at 100% CPU".
Is this true? And if so, why? I was previously under the impression that 100% CPU usage was preferable for an intensive or long operation, and I couldn't find any respectable sources on the subject either way.
Explanation / Answer
If cooling is insufficient, the CPU might overheat. But modern PC CPUs all feature thermal protection mechanisms that will throttle the clock speed or, as a last resort, shut the machine down.
So yes, on a dusty laptop, 100% CPU load could cause temporary problems such as thermal throttling, but nothing will break or "degrade" (whatever that means).
For CPU-bound problems, 100% CPU load is the right way to go.
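As a minimal sketch of what that looks like in .NET 4, a CPU-bound data load can simply be spread across all cores with Parallel.ForEach; the Transform method and the input array below are hypothetical stand-ins for whatever per-record work the real process does:

    using System;
    using System.Linq;
    using System.Threading.Tasks;

    class DataLoadSketch
    {
        // Hypothetical per-record, CPU-bound work standing in for the real transformation.
        static long Transform(int record)
        {
            long checksum = 0;
            for (int i = 0; i < 10000; i++)
                checksum += (record * 31L + i) % 97;
            return checksum;
        }

        static void Main()
        {
            int[] records = Enumerable.Range(0, 100000).ToArray();
            long[] results = new long[records.Length];

            // Parallel.ForEach (available since .NET 4) runs the body on all cores,
            // so a purely CPU-bound load will naturally sit near 100% CPU.
            Parallel.ForEach(records, record =>
            {
                results[record] = Transform(record);
            });

            Console.WriteLine("Processed {0} records.", results.Length);
        }
    }

Letting the scheduler saturate the cores this way finishes the job sooner; the thermal protection described above is what keeps the hardware safe, not artificial throttling in your code.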
As for application (UI) responsiveness, that's a separate concept from CPU utilization. It's entirely possible to have an unresponsive application that uses 1% CPU, or a responsive application that uses 100% CPU. UI responsiveness boils down to the amount of work done on the UI thread, and the priority of the UI thread relative to other threads.
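Here is a minimal sketch of that separation (the busy-loop workload and the console "UI" loop are illustrative only, not taken from the question): the heavy work runs on a background thread at BelowNormal priority, so the foreground thread stays responsive even while overall CPU usage can still reach 100%:

    using System;
    using System.Threading;

    class ResponsivenessSketch
    {
        static void Main()
        {
            // Heavy work runs on a dedicated worker thread, not the "UI" thread.
            // Lowering its priority lets interactive threads get the CPU first,
            // while the worker still soaks up any idle cycles.
            var worker = new Thread(DoHeavyWork)
            {
                IsBackground = true,
                Priority = ThreadPriority.BelowNormal
            };
            worker.Start();

            // Stand-in for a UI/message loop: it stays responsive because it
            // does almost no work of its own.
            for (int i = 0; i < 5; i++)
            {
                Console.WriteLine("UI thread is still responsive...");
                Thread.Sleep(500);
            }
        }

        static void DoHeavyWork()
        {
            long counter = 0;
            while (true)
                counter++; // purely CPU-bound busy work, for illustration only
        }
    }

In a real WinForms or WPF application the same idea applies, typically via BackgroundWorker or Task, with results marshalled back to the UI thread when the work completes.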