I'm attempting to implement a high-resolution class for timing the execution of functions on Windows. After consulting this documentation, I have come up with the following:
#include <iostream>
#include <windows.h>

class Timer {
private:
    LARGE_INTEGER start_count;
    LARGE_INTEGER stop_count;
    LARGE_INTEGER tick_freq;

public:
    // Query the tick frequency once, upon instantiation of the class
    Timer() {
        QueryPerformanceFrequency(&tick_freq);
    }

    // Query the performance counter at the start of a block of code
    void start() {
        QueryPerformanceCounter(&start_count);
    }

    // Query the performance counter at the end of a block of code,
    // then return the elapsed time in seconds
    double stop() {
        QueryPerformanceCounter(&stop_count);
        return (double)(stop_count.QuadPart - start_count.QuadPart) / tick_freq.QuadPart;
    }
};
int main() {
    Timer t;
    t.start();
    // code to be timed
    std::cout << t.stop() << '\n'; // elapsed time in seconds
}
I understand that each measurement carries an error of plus or minus one tick period (1 / tick_freq, e.g. 100 ns if the counter runs at 10 MHz). However, various functions I have timed give unexpected results. For example, a function that finds the greatest common divisor using Euclid's algorithm consistently takes 100 or 200 ns, but appending a single int to a vector takes 1600 ns. This leads me to the question: how accurate is this timing class, and what can I do to improve it?
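For reference, here is a minimal sketch of the kind of measurement I'm making with the class above; the gcd implementation and the arguments are placeholders rather than my exact test code:

#include <iostream>
#include <vector>
#include <windows.h>
// ... Timer class as defined above ...

// Placeholder: an iterative Euclid's algorithm standing in for my real test function
unsigned gcd(unsigned a, unsigned b) {
    while (b != 0) {
        unsigned t = b;
        b = a % b;
        a = t;
    }
    return a;
}

int main() {
    Timer t;

    t.start();
    volatile unsigned g = gcd(1071, 462); // volatile keeps the store; the call itself could still be constant-folded
    std::cout << t.stop() << " s for gcd\n";
    (void)g;

    std::vector<int> v;
    t.start();
    v.push_back(42); // the first push_back usually allocates, which may dominate this timing
    std::cout << t.stop() << " s for one push_back\n";
}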