Java – why does switching from an infinite loop to TimerTask cause CPU utilization to decrease?
I wrote a daemon with the following structure:

    while (true) {
        // do some stuff
        Thread.sleep(1000);
    }
I noticed that it used a lot of CPU – up to 100%. I have had a similar daemon running on my production server for a few months with the same CPU problem.
Yesterday I rewrote the code to use TimerTask, and I immediately noticed that the CPU utilization on my development box had decreased. So I decided to deploy it to production and use Munin to double-check. Here is the chart:
[Munin CPU utilization chart]
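For reference, the TimerTask-based version is roughly of this shape (a minimal sketch; the class name, the empty task body, and the fixed 1-second period are assumptions, since the question only shows the loop skeleton):

    import java.util.Timer;
    import java.util.TimerTask;

    public class WorkerDaemon {
        public static void main(String[] args) {
            // A non-daemon Timer keeps the JVM alive and runs all scheduled
            // tasks on a single background thread.
            Timer timer = new Timer("worker");

            timer.scheduleAtFixedRate(new TimerTask() {
                @Override
                public void run() {
                    // do some stuff (same work as the body of the old while-loop)
                }
            }, 0L, 1000L); // start immediately, repeat every 1000 ms
        }
    }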
There is absolutely nothing else running on the production server except the JVM.
> No other application threads are running.
> The old code was definitely executing at the correct periodic intervals – I write to the log every time the thread executes.
So why is Thread.sleep() less efficient than TimerTask?
Solution
There are three possibilities I can think of:
> You have lots of threads doing this, and they are context-switching all the time. Using a Timer means only one thread; on the other hand, it also means you can only run one task at a time (see the sketch after this list).
> You have a busy-wait somewhere in your loop before the sleep, so even if the main body of the loop doesn't do much work, it still burns CPU. It's hard to say without seeing more of your code.
> You have a broken JVM/OS combination. That seems unlikely, admittedly.
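To illustrate the first point: a single Timer runs any number of scheduled tasks on one shared thread, one at a time. A minimal sketch (the class name and task bodies are made up):

    import java.util.Timer;
    import java.util.TimerTask;

    public class SingleTimerManyTasks {
        public static void main(String[] args) {
            // One Timer = one background thread, regardless of how many tasks it runs.
            Timer timer = new Timer("shared-worker");

            for (int i = 0; i < 5; i++) {
                final int id = i;
                timer.schedule(new TimerTask() {
                    @Override
                    public void run() {
                        // All five tasks execute sequentially on the same thread,
                        // so one slow task delays the others.
                        System.out.println("task " + id + " on " + Thread.currentThread().getName());
                    }
                }, 0L, 1000L);
            }
        }
    }

Running it shows every line printed from the same thread, which is where the saving in context switches comes from.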
A thread that just executes a simple sleep(1000) loop should be very cheap – and that should also be easy to verify.
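One easy way to verify it (hypothetical class name; run it and watch the process in top, Task Manager, or jconsole):

    public class SleepLoopCheck {
        public static void main(String[] args) throws InterruptedException {
            // A bare sleep(1000) loop: the process should sit near 0% CPU.
            while (true) {
                // do some stuff (intentionally left empty here)
                Thread.sleep(1000);
            }
        }
    }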