I am a Python and OpenCV newbie. I was looking into OpenCV optimization and found the "Measuring Performance with OpenCV" tutorial. I saw cv2.getTickCount and cv2.getTickFrequency and tried them in a bare video-capture loop:
    import cv2

    cap = cv2.VideoCapture(0)
    time1 = 0
    while True:
        e1 = cv2.getTickCount()
        ret, frame = cap.read()
        cv2.imshow("cam", frame)
        e2 = cv2.getTickCount()
        # accumulate the time spent on read + imshow
        time1 = (e2 - e1) / cv2.getTickFrequency() + time1
        print(time1)
        k = cv2.waitKey(1) & 0xFF
        if k == ord('q'):
            break
In the same loop, I also measured elapsed time with time.time():
    import cv2
    import time

    cap = cv2.VideoCapture(0)
    t_start = time.time()
    time1 = 0
    while True:
        e1 = cv2.getTickCount()
        ret, frame = cap.read()
        cv2.imshow("cam", frame)
        e2 = cv2.getTickCount()
        # accumulate the time spent on read + imshow
        time1 = (e2 - e1) / cv2.getTickFrequency() + time1
        # wall-clock time since the loop started
        elapsedTime = time.time() - t_start
        print([time1, elapsedTime])
        k = cv2.waitKey(1) & 0xFF
        if k == ord('q'):
            break
There is a huge difference between elapsedTime and time1, like:
[23.544186313842033, 29.413000106811523]
[23.588920849343307, 29.460999965667725]
[23.636793986833897, 29.51200008392334]
[23.669538024648435, 29.558000087738037]
[23.701628712445952, 29.605000019073486]
[23.737225731551163, 29.65499997138977]
[23.775527056696312, 29.703999996185303]
[23.82555789141547, 29.765000104904175]
[23.864218735017026, 29.813999891281128]
[23.901782255564854, 29.861000061035156]
I checked both outputs against my phone's stopwatch, and it agrees with time.time().
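To rule out the clocks themselves, here is a minimal sketch that times one identical span with both (the time.sleep call is just a stand-in for the per-frame work, not my actual capture code):

    import time

    import cv2

    # Time one identical span with both clocks.  time.sleep() stands in
    # for the per-frame work (cap.read() + cv2.imshow()) in my loop.
    e1 = cv2.getTickCount()
    t1 = time.time()
    time.sleep(0.5)
    e2 = cv2.getTickCount()
    t2 = time.time()

    tick_elapsed = (e2 - e1) / cv2.getTickFrequency()  # seconds
    wall_elapsed = t2 - t1                             # seconds
    print(tick_elapsed, wall_elapsed)

On my machine the two values come out essentially identical, so the discrepancy in my loop does not seem to be the clocks disagreeing.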
My questions are:
- Why do I have this difference? How and why do cv2.getTickCount and cv2.getTickFrequency differ from time.time()?
- Which one should I use for performance measurement?