
What Is The Highest Scale Of Time Precision That Can Be Reached Via Python?

Consider a very simple timer:

    import time

    start = time.time()
    end = time.time() - start
    while end < 5:
        end = time.time() - start
    print(end)

How precise is this timer?

Solution 1:

This is entirely platform dependent. Use the timeit.default_timer() function; it returns the most precise timer available for your platform.

From the documentation:

Define a default timer, in a platform-specific manner. On Windows, time.clock() has microsecond granularity, but time.time()'s granularity is 1/60th of a second. On Unix, time.clock() has 1/100th of a second granularity, and time.time() is much more precise.

So on Windows you get microsecond granularity; on Unix you get whatever precision the platform can provide, which is usually (much) better than 1/100th of a second.
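
As a sketch, the question's busy-wait loop can be rewritten around timeit.default_timer(); the 5-second threshold is carried over from the question, everything else is standard library. On Python 3.3+, timeit.default_timer is an alias for time.perf_counter, the highest-resolution clock available:

    import timeit

    start = timeit.default_timer()
    elapsed = timeit.default_timer() - start
    # Busy-wait until five seconds have elapsed, as in the original snippet.
    while elapsed < 5:
        elapsed = timeit.default_timer() - start
    print(elapsed)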

Solution 2:

This depends entirely on the system you are running it on; there is no guarantee that Python can track time at any particular precision at all.

That said, it's pretty safe to assume you will get millisecond accuracy on modern systems; beyond that, it really is highly dependent on the system. To quote the docs:

Although this module is always available, not all functions are available on all platforms. Most of the functions defined in this module call platform C library functions with the same name. It may sometimes be helpful to consult the platform documentation, because the semantics of these functions varies among platforms.

And:

The precision of the various real-time functions may be less than suggested by the units in which their value or argument is expressed. E.g. on most Unix systems, the clock “ticks” only 50 or 100 times a second.
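
To see what your own platform actually provides, a small sketch (assuming Python 3.3+, where time.get_clock_info() was added) can report each standard clock's advertised resolution and empirically probe the smallest observable step of time.time():

    import time

    # Print the advertised resolution (in seconds) and the underlying
    # C implementation of each standard clock on this platform.
    for name in ("time", "monotonic", "perf_counter", "process_time"):
        info = time.get_clock_info(name)
        print(name, info.resolution, info.implementation)

    # Empirically probe the smallest step time.time() actually reports
    # by spinning until the returned value changes.
    t0 = time.time()
    t1 = time.time()
    while t1 == t0:
        t1 = time.time()
    print("observed time.time() step:", t1 - t0)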
