You can find my original post here on SO. I was quite surprised that I couldn't find a ready-made function that computes the mean squared error between a signal (array A, of size a) and a pattern (array B, of size b < a) in a sliding-window fashion, so I wrote one myself... and it obviously needs a lot of improvement. In particular, I am not sure how to vectorize the implementation properly.
The goal is to take the pattern, slide it across the signal one step at a time, and at each position compute the mean squared error between the pattern and the current window of the signal.
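In other words, for an offset k the quantity I am after is the following (as far as I can tell, this is exactly what sklearn's mean_squared_error returns with default arguments; mse_at_offset is only an illustrative helper, not part of my actual code):

import numpy as np

def mse_at_offset(signal, pattern, k):
    # Window of the signal aligned with the pattern at offset k
    window = signal[k:k + pattern.size]
    # Plain mean squared error of that window against the pattern
    return np.mean((window - pattern) ** 2)

My current, loop-based implementation: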
from sklearn.metrics import mean_squared_error
import numpy as np


def correlate_edge_pattern(signal, pattern):
    ms_errors = list()
    k = 0
    while True:
        # Window of the signal aligned with the pattern at offset k
        data = signal[k:k + pattern.size]
        # Stop once the window no longer fits inside the signal
        if data.size != pattern.size:
            break
        err = mean_squared_error(pattern, data)
        ms_errors.append(err)
        k += 1
    # Offset with the smallest error, i.e. the best match position
    m = np.argmin(ms_errors)
    return m
As you can see below, it works as expected. Here is a small code sample to test it:
signal = -np.ones(1000) * 20
signal[:100] = 0
signal[900:] = 0
noise = np.random.normal(0, 1, size=1000)
noisy_signal = signal + noise

pattern = np.zeros(50)
pattern[:25] = -20

# The pattern (25 samples at -20 followed by 25 samples at 0) matches the
# falling edge of the signal, so the returned offset should be close to 875.
print(correlate_edge_pattern(noisy_signal, pattern))
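
For what it's worth, the only vectorization idea I have come up with so far relies on np.lib.stride_tricks.sliding_window_view (NumPy >= 1.20). I believe it computes the same errors as the loop above, but I am not confident it is correct or idiomatic, so please treat it as a rough sketch:

def correlate_edge_pattern_vectorized(signal, pattern):
    # (n_windows, pattern.size) view of the signal, one row per offset, no copy
    windows = np.lib.stride_tricks.sliding_window_view(signal, pattern.size)
    # Mean squared error of every window against the pattern, all offsets at once
    ms_errors = np.mean((windows - pattern) ** 2, axis=1)
    return np.argmin(ms_errors)

Any feedback on whether this is the right direction, or on a cleaner way to do it, would be appreciated.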
