
I am a Python and OpenCV newbie. While reading about OpenCV optimization, I found the "Measuring Performance with OpenCV" tutorial page. I saw cv2.getTickCount and cv2.getTickFrequency and tried them in a bare video-capture loop:

import cv2

cap = cv2.VideoCapture(0)
time1 = 0

while True:
    e1 = cv2.getTickCount()
    ret, frame = cap.read()
    cv2.imshow("cam", frame)
    e2 = cv2.getTickCount()
    time1 = (e2 - e1) / cv2.getTickFrequency() + time1
    print time1

    k = cv2.waitKey(1) & 0xFF

    if k == ord('q'):
        break

At the same time, I tried time.time() for measuring performance:

import cv2
import time

cap = cv2.VideoCapture(0)

t_start = time.time()
time1 = 0

while True:
    e1 = cv2.getTickCount()
    ret, frame = cap.read()
    cv2.imshow("cam", frame)
    e2 = cv2.getTickCount()
    time1 = (e2 - e1) / cv2.getTickFrequency() + time1
    elapsedTime = time.time()-t_start
    print [time1, elapsedTime]

    k = cv2.waitKey(1) & 0xFF

    if k == ord('q'):
        break

There is a huge difference between elapsedTime and time1, like:

[23.544186313842033, 29.413000106811523]
[23.588920849343307, 29.460999965667725]
[23.636793986833897, 29.51200008392334]
[23.669538024648435, 29.558000087738037]
[23.701628712445952, 29.605000019073486]
[23.737225731551163, 29.65499997138977]
[23.775527056696312, 29.703999996185303]
[23.82555789141547, 29.765000104904175]
[23.864218735017026, 29.813999891281128]
[23.901782255564854, 29.861000061035156]

I checked both outputs against my phone's stopwatch, and it agrees with time.time().

My questions are:

  1. Why do I have this difference? How and why do cv2.getTickCount and cv2.getTickFrequency differ from time.time()?
  2. Which one should I use for performance measurement?

2 Answers


I'm not really proficient with cv2 but I do see something rather iffy with your timing technique here.

Look at what part of the code you are measuring with cv2.getTickCount():

e1 = cv2.getTickCount()  # start
ret, frame = cap.read()
cv2.imshow("cam", frame)
e2 = cv2.getTickCount()  # stop

Now, look at what you are measuring using time.time():

t_start = time.time()  # start
time1 = 0

while True:
    e1 = cv2.getTickCount()
    ret, frame = cap.read()
    cv2.imshow("cam", frame)
    e2 = cv2.getTickCount()
    time1 = (e2 - e1) / cv2.getTickFrequency() + time1
    elapsedTime = time.time() - t_start  # stop

You are obviously mistreating poor ol' time over here by timing different things: your getTickCount() pair adds up only the read/imshow part of each iteration, while time.time() measures the total wall-clock time since before the loop started, including every waitKey() call and print. Unfortunately, I cannot verify the actual runtimes because I do not have OpenCV, but you might want to put your time.time() calls on par with getTickCount(). In short, use something similar to this:

time1 = 0

while True:
    t_start = time.time()  # start
    ret, frame = cap.read()
    cv2.imshow("cam", frame)
    elapsedTime = time.time() - t_start  # stop

Then re-evaluate your results. It's quite possible getTickCount() will give more accurate results than time.time() because of the way it is implemented, but I can't say for sure.
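If you want to see where the missing time goes, something along these lines should do it (I cannot test it here, since I do not have OpenCV, and it assumes a webcam at index 0). It also times the part of each iteration that your original loop never measures, mainly the waitKey() call:

import cv2
import time

cap = cv2.VideoCapture(0)
timed_part = 0.0   # what your time1 accumulates: read + imshow only
rest_part = 0.0    # what it misses: mainly the waitKey() call
t_start = time.time()

while True:
    e1 = cv2.getTickCount()
    ret, frame = cap.read()
    cv2.imshow("cam", frame)
    e2 = cv2.getTickCount()
    timed_part += (e2 - e1) / cv2.getTickFrequency()

    e3 = cv2.getTickCount()
    k = cv2.waitKey(1) & 0xFF
    e4 = cv2.getTickCount()
    rest_part += (e4 - e3) / cv2.getTickFrequency()

    # timed_part alone lags behind the wall clock; adding the part the
    # original loop never measured should roughly close the gap.
    print([timed_part, timed_part + rest_part, time.time() - t_start])

    if k == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()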


As for which one you should use: cv2's own timing functions.

Why? Because, without making outlandish claims, they are tested facilities that were probably written by people more proficient at this than us. Rolling your own timing can be tricky and leaves a lot of room for little mistakes, especially when you're first starting out.

So, in short, go with getTickCount(); it is there for a reason.
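If you end up timing several different blocks, you can keep the tick arithmetic in one place with a small helper; this is only a sketch, and the tick_timer name is my own:

import cv2
from contextlib import contextmanager

@contextmanager
def tick_timer(label):
    # Time the enclosed block with OpenCV's tick counter and report seconds.
    start = cv2.getTickCount()
    try:
        yield
    finally:
        elapsed = (cv2.getTickCount() - start) / cv2.getTickFrequency()
        print('%s: %f seconds' % (label, elapsed))

# usage:
# with tick_timer('read + imshow'):
#     ret, frame = cap.read()
#     cv2.imshow("cam", frame)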


1 Comment

I changed the code as you suggested and I got better results: they are nearly the same, e.g. [41.4705076101, 41.2309985161]. I also changed the code to use 1000 samples for both time.time() and time1. If I print the result on every loop iteration, the difference between the two keeps growing; if I only print the final result at the end, I get much more meaningful numbers.

This should produce consistent results:

import cv2
import time

cap = cv2.VideoCapture(0)

t_start = time.time()
e1 = cv2.getTickCount()

while True:
    ret, frame = cap.read()
    cv2.imshow("cam", frame)
    e2 = cv2.getTickCount()
    current_time = time.time()
    # Both clocks are started once, at the same moment, before the loop.
    time1 = (e2 - e1) / cv2.getTickFrequency()
    elapsedTime = current_time - t_start
    print [time1, elapsedTime]

    # imshow needs waitKey to refresh the window; 'q' exits the loop.
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

Actually, I cannot reproduce your problem:

import time
import cv2

# "new": both clocks start once, before the loop.
t_start = time.time()
e1 = cv2.getTickCount()

for x in range(10):
    time.sleep(0.1)
    e2 = cv2.getTickCount()
    time1 = (e2 - e1) / cv2.getTickFrequency()
    elapsedTime = time.time() - t_start
    print 'new:', [time1, elapsedTime]

# "old": the accumulation approach from the question.
t_start = time.time()
time1 = 0

for x in range(10):
    e1 = cv2.getTickCount()
    time.sleep(0.1)
    e2 = cv2.getTickCount()
    time1 = (e2 - e1) / cv2.getTickFrequency() + time1
    elapsedTime = time.time() - t_start
    print 'old:', [time1, elapsedTime]

Prints rather consistent values:

new: [0.101121973, 0.10170483589172363]
new: [0.202431593, 0.20299196243286133]
new: [0.303379863, 0.3039419651031494]
new: [0.404390379, 0.40494799613952637]
new: [0.505434471, 0.5059988498687744]
new: [0.606657266, 0.6072208881378174]
new: [0.70785313, 0.7083899974822998]
new: [0.808160676, 0.808696985244751]
new: [0.90930442, 0.9098358154296875]
new: [1.010264773, 1.0107979774475098]
old: [0.101093147, 0.10112500190734863]
old: [0.201946665, 0.20209193229675293]
old: [0.302889158, 0.3032569885253906]
old: [0.40368225999999996, 0.40418291091918945]
old: [0.504733644, 0.5053050518035889]
old: [0.605701044, 0.6063768863677979]
old: [0.7066711290000001, 0.707442045211792]
old: [0.807242644, 0.8081610202789307]
old: [0.908066402, 0.9092309474945068]
old: [1.008913762, 1.0102128982543945]

Are you sure the printed numbers correspond to the code you show in your question?

