### Re: Correlation of the 'error' signal in predictor and entropy of the signal?

Date: Mon Aug 19 15:22:39 2013
Posted By: Chris Seaman, Staff, Electrical Engineering, Alcoa Technical Center
Area of science: Engineering
ID: 1376603137.Eg
Message:

Your question has multiple parts, so I'll try to answer each of them.

Is a predictor like a pattern detector? -- A pattern detector is just one type of predictor. Any model can be a predictor; Newton's second law, F = m*a, can be used (together with integration) to predict the position and velocity of a ball you dropped. There isn't really a "pattern" in predicting the ball's position and velocity. However, you could have a pattern, such as a sine wave, that you are trying to detect in a noisy signal. In this case, your prediction equation might be y = A*sin(wt + phi), and your goal is to determine the amplitude and phase of a sine wave of known frequency in a noisy signal.
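As a rough sketch of that second case (not part of the original answer): when the frequency w is known and the samples span a whole number of periods, the least-squares estimates of A and phi come from projecting the noisy signal onto sin(wt) and cos(wt). The amplitude, phase, and noise level below are made-up demo values.

```python
import math
import random

def fit_sine(samples, w, dt):
    """Least-squares estimate of A and phi in y = A*sin(w*t + phi),
    given uniform samples spanning a whole number of periods."""
    n = len(samples)
    # Project the signal onto the sine and cosine parts of the known frequency.
    a = 2.0 / n * sum(y * math.sin(w * k * dt) for k, y in enumerate(samples))
    b = 2.0 / n * sum(y * math.cos(w * k * dt) for k, y in enumerate(samples))
    # Since y = A*cos(phi)*sin(wt) + A*sin(phi)*cos(wt),
    # a estimates A*cos(phi) and b estimates A*sin(phi).
    return math.hypot(a, b), math.atan2(b, a)

# Demo: a 5 Hz sine (A = 1.5, phi = 0.7) buried in Gaussian noise.
random.seed(1)
w, dt, n = 2 * math.pi * 5, 1.0 / 1000, 1000   # exactly 5 periods of samples
samples = [1.5 * math.sin(w * k * dt + 0.7) + random.gauss(0, 0.5)
           for k in range(n)]
amplitude, phase = fit_sine(samples, w, dt)
```

Even with noise comparable to the sine itself, the projection averages the noise away and the estimates land close to the true A and phi.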

Would the error signal of a predictor be a measure of the entropy of the signal? -- No. The error signal is only a measure of what is not being predicted. Whenever I develop a predictor, I look at the resulting error signal. If there is structure in the error signal, that represents something you didn't capture in your predictor. An ideal example of this is demodulating an AM radio signal. Inside a radio, the demodulating circuit "predicts" the amplitude and phase of the carrier signal. Once the carrier is removed, the remaining "error signal" is actually the information you are trying to hear.
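A minimal numerical sketch of that AM example (my own illustration, not a circuit from the answer): mix the received signal with the "predicted" carrier, then low-pass filter with a one-carrier-period moving average; what remains is the message. The carrier, message, and sample rates are assumed demo values.

```python
import math

def demodulate_am(signal, wc, dt, period):
    """Coherent AM demodulation: mix with the known ('predicted') carrier,
    then low-pass with a moving average one carrier period long."""
    # 2*(1+m)*cos^2(wc*t) = (1+m)*(1 + cos(2*wc*t)); the average kills the
    # double-frequency term and leaves the baseband 1 + m(t).
    mixed = [2.0 * s * math.cos(wc * k * dt) for k, s in enumerate(signal)]
    return [sum(mixed[k:k + period]) / period
            for k in range(len(mixed) - period + 1)]

# Demo: a 2 Hz message on a 100 Hz carrier, sampled at 10 kHz for 1 s.
fs, fc, fm = 10000, 100, 2
dt, wc = 1.0 / fs, 2 * math.pi * fc
message = [0.5 * math.sin(2 * math.pi * fm * k * dt) for k in range(fs)]
signal = [(1.0 + m) * math.cos(wc * k * dt) for k, m in enumerate(message)]
baseband = demodulate_am(signal, wc, dt, period=fs // fc)
```

The recovered baseband swings between 0.5 and 1.5, i.e. 1 + m(t): once the carrier is accounted for, the leftover is exactly the information being carried.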

Entropy is a measure of the information content in a signal. You might use this as a measure of the effectiveness of your predictor: calculate the entropy of the error signal. If the entropy of the error signal is high, the residual looks like random noise and there is likely little predictable structure remaining. If the entropy is low, there could be more information to be extracted.
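The check above can be sketched with a simple histogram-based entropy estimate (an illustration under assumed demo residuals, not a prescribed method): a pure-noise residual fills the histogram nearly uniformly and scores near the maximum, while a residual with leftover structure, such as an unmodeled sine, scores lower.

```python
import math
import random

def entropy_bits(samples, bins=32):
    """Shannon entropy (in bits) of a histogram of the samples."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0      # guard against a constant signal
    counts = [0] * bins
    for x in samples:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# Two candidate "error signals": pure noise vs. an unmodeled sine.
random.seed(2)
noise_residual = [random.uniform(-1, 1) for _ in range(20000)]
sine_residual = [math.sin(0.01 * k) for k in range(20000)]
```

Here `entropy_bits(noise_residual)` comes out higher than `entropy_bits(sine_residual)`: the noise-only residual suggests the predictor has captured the structure, while the lower-entropy sine residual flags something left to model.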

Chris Seaman
Alcoa Technical Center
