Java – What is the worst resolution one can expect from System.nanoTime()?

I'm writing software that requires timestamps with microsecond resolution or better.

I'm planning to use a combination of System.currentTimeMillis() and System.nanoTime(), although this is only a rough code sketch:

// Wall-clock time at class initialization, converted from milliseconds to nanoseconds.
private static final long absoluteTime = (System.currentTimeMillis() * 1000 * 1000);
// Monotonic nanoTime() reference taken at (approximately) the same moment.
private static final long relativeTime = System.nanoTime();

// Returns an absolute timestamp in nanoseconds since the epoch.
public long getTime()
{
    final long delta = System.nanoTime() - relativeTime;
    if (delta < 0) throw new IllegalStateException("time delta is negative");
    return absoluteTime + delta;
}
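
One refinement I'm considering (a sketch of my own; the MicroClock name is made up): capture the nanoTime() baseline by bracketing the currentTimeMillis() call and taking the midpoint, so that the unknown offset between the two baselines is roughly halved:

public final class MicroClock
{
    // Wall-clock time at construction, converted to nanoseconds since the epoch.
    private final long absoluteNanos;
    // nanoTime() reading estimated for the same instant as the wall-clock sample.
    private final long relativeNanos;

    public MicroClock()
    {
        final long before = System.nanoTime();
        final long wallMillis = System.currentTimeMillis();
        final long after = System.nanoTime();

        this.absoluteNanos = wallMillis * 1000 * 1000;
        // Midpoint estimate of when currentTimeMillis() was actually sampled.
        this.relativeNanos = before + (after - before) / 2;
    }

    public long getTime()
    {
        return absoluteNanos + (System.nanoTime() - relativeNanos);
    }
}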

The nanoTime() documentation, in effect, only promises that the resolution is at least as good as that of currentTimeMillis(), so there is no guarantee of a resolution better than milliseconds.

Further, looking under the hood of nanoTime() (which, predictably, is a native method):

> Windows uses the QueryPerformanceCounter API, which promises a resolution of less than one microsecond, which is great.
> Linux uses clock_gettime with a flag to ensure that the value is monotonic, but makes no promises about its resolution.
> Solaris is similar to Linux. The source doesn't mention how OS X or other Unix-based operating systems handle it.

(source)

I have seen some vague hints that it will "usually" have microsecond resolution, such as this answer to another question:

But without a source, the word "usually" is very subjective.

Question: under what circumstances might nanoTime() return a value whose resolution is worse than microseconds? For example, perhaps a major operating system version doesn't support it, or a particular hardware feature is required that may be absent. Please try to provide sources if you can.

I'm using Java 1.6, but there's a small chance I could upgrade if it would substantially help with this problem.

Solution

Asking for a list of all situations in which this constraint may hold seems a bit much, since nobody knows which environments your software will run in. But to prove that it can happen, see this blog post by Aleksey Shipilëv, where he describes a case in which nanoTime() becomes less accurate (in terms of its own latency) than a microsecond on Windows machines, due to contention.
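
As a quick, non-rigorous sanity check of the per-call cost on a particular machine, a crude loop like the sketch below (my own, with an arbitrary iteration count) gives a ballpark figure; a proper measurement would use a benchmarking harness such as JMH:

public class NanoTimeLatency
{
    public static void main(String[] args)
    {
        final int iterations = 10000000;
        long sink = 0; // accumulate the results so the JIT cannot remove the calls

        final long start = System.nanoTime();
        for (int i = 0; i < iterations; i++)
        {
            sink += System.nanoTime();
        }
        final long elapsed = System.nanoTime() - start;

        System.out.println("Average nanoTime() cost: "
                + (elapsed / (double) iterations) + " ns (ignore: " + sink + ")");
    }
}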

Another case would be software running under a virtual machine that emulates the hardware clock in a very coarse way.

The specification is left deliberately vague precisely because of platform- and hardware-specific behaviour.

Once you have verified that the hardware and operating system you use provide what you need, and that the VM passes through the necessary features, you can "reasonably expect" microsecond precision.
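
One rough way to check what a particular environment actually delivers is to spin on nanoTime() until the returned value changes and record the smallest observed step. This is only a sketch of my own, and the observed step is bounded below by the call latency as well, so treat it as an indication rather than the documented resolution:

public class NanoTimeGranularity
{
    public static void main(String[] args)
    {
        long smallestStep = Long.MAX_VALUE;
        for (int i = 0; i < 1000; i++)
        {
            final long start = System.nanoTime();
            long next;
            do
            {
                next = System.nanoTime();
            }
            while (next == start); // spin until the clock visibly ticks

            smallestStep = Math.min(smallestStep, next - start);
        }
        System.out.println("Smallest observed nanoTime() step: " + smallestStep + " ns");
    }
}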
