First post on Code Review SE, so here goes.
The Background
I'm working on a real-time, embedded image processing application for my undergraduate group engineering capstone. I'm receiving frames at 60 FPS and have to isolate and detect the location of a flying object in each frame, if one exists, before the next frame arrives. This gives me about 15 ms to perform the entire image processing algorithm.
One important step in the process is denoising the image. The input to the denoising function is an M x N image, obtained by background subtraction/frame differencing. Each pixel is represented by a single bit. (Basically we take the frame at time t+1, subtract it from the frame at time t, and if the absolute difference is above some threshold, we set the bit equal to 1.) This means that we store the entire image in M*N/8 bytes.
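For concreteness, here is a minimal sketch of the frame differencing / bit packing step I'm describing. The names (`frame_difference`, `THRESHOLD`), the threshold value, and the bit ordering within each byte are just illustrative placeholders, not my actual code:

```c
#include <stdint.h>
#include <stdlib.h>

/* Illustrative values only; my real frames are 320x240 as noted below. */
#define WIDTH     320
#define HEIGHT    240
#define THRESHOLD 25

/* prev and curr are WIDTH*HEIGHT bytes of 8-bit grayscale;
 * out is a packed 1-bit image of WIDTH*HEIGHT/8 bytes. */
static void frame_difference(const uint8_t *prev, const uint8_t *curr,
                             uint8_t *out)
{
    for (size_t i = 0; i < (size_t)WIDTH * HEIGHT; ++i) {
        int diff = abs((int)curr[i] - (int)prev[i]);
        if (diff > THRESHOLD)
            out[i / 8] |= (uint8_t)(1u << (i % 8));   /* set pixel bit   */
        else
            out[i / 8] &= (uint8_t)~(1u << (i % 8));  /* clear pixel bit */
    }
}

int main(void)
{
    uint8_t *prev = calloc((size_t)WIDTH * HEIGHT, 1);
    uint8_t *curr = calloc((size_t)WIDTH * HEIGHT, 1);
    uint8_t *diff = calloc((size_t)WIDTH * HEIGHT / 8, 1);
    if (!prev || !curr || !diff)
        return 1;

    curr[123] = 200;                  /* simulate one changed pixel */
    frame_difference(prev, curr, diff);

    free(prev);
    free(curr);
    free(diff);
    return 0;
}
```

The packed buffer produced this way is what gets handed to the denoising function.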
The Problem
It's too slow, by about an order of magnitude. It runs in roughly 109 ms on my hardware for a given image (320x240 in my case), when it needs to run in around 10-12 ms.
My Main Questions
- Is the slowness I'm experiencing due to the nature of the job I'm trying to do, or to my implementation?
- How can I speed it up?
Thanks for the help.