time.clock() has 13 decimal places on Windows but only two on Linux.
time.time() has 17 decimal places on Linux and 16 on Windows, but the actual precision is different.
This is described here:
http://docs.python.org/library/time.html
I don't agree with the documentation that time.clock() should be used for benchmarking on Unix/Linux. It is not precise enough.
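One way to check this for yourself is to estimate the effective tick of each timer by calling it until the returned value changes. Here is a rough sketch (measure_resolution is a hypothetical helper, and this assumes Python 2.x, where time.clock() still exists):

import time

def measure_resolution(timer, samples=100):
    # Estimate the effective tick of 'timer': take two readings,
    # spin until they differ, and record the smallest observed step.
    deltas = []
    for _ in range(samples):
        t0 = timer()
        t1 = timer()
        while t1 == t0:  # busy-wait until the clock ticks
            t1 = timer()
        deltas.append(t1 - t0)
    return min(deltas)

print('time.time() tick: %g s' % measure_resolution(time.time))
print('time.clock() tick: %g s' % measure_resolution(time.clock))

On Linux this should report a much coarser tick for time.clock() than for time.time().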
So which timer to use depends on the operating system.
On Linux, the resolution of time.time() is high:
>>> time.time(), time.time()
(1281384913.4374139, 1281384913.4374161)
On Windows, however, the time function seems to reuse the value from the last call:
>>> time.time()-int(time.time()), time.time()-int(time.time()), time.time()-time.time()
(0.9570000171661377, 0.9570000171661377, 0.0)
Even when I put the calls on separate lines on Windows, it still returns the same value, so the real precision is lower there.
So for serious measurements, a platform check (import platform; platform.system()) has to be done in order to determine whether to use time.clock() or time.time(), as sketched below.
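A minimal sketch of such a check (again assuming Python 2.x / early 3.x, where time.clock() is still available):

import platform
import time

# Pick the higher-resolution wall-clock timer for this platform:
# time.clock() on Windows, time.time() on Unix/Linux.
if platform.system() == 'Windows':
    default_timer = time.clock
else:
    default_timer = time.time

start = default_timer()
# ... code being benchmarked ...
elapsed = default_timer() - start

Note that the standard library's timeit module makes essentially the same choice (it tests sys.platform == "win32"), so timeit.default_timer() is a convenient cross-platform timer.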
(Tested on Windows 7 and Ubuntu 9.10 with Python 2.6 and 3.1.)