Convolution¶
Introduction¶
In signal processing, convolution is a mathematical operation on two functions f and g that produces a third function f*g, defined as the integral of the product of the two functions after one of them is reflected about the y-axis and shifted in time.
Convolution is fundamental to the analysis of sampled time-series data. As described earlier, any linear network is completely characterized by its impulse response h[n] or h(t), and its response to any input is given by the convolution of the input x(t) with the impulse response h(t).
Convolution integral¶
The following integral (called the convolution integral) defines the convolution of two continuous-time signals x(t) and h(t):
$$y(t)=x(t)*h(t)=\int_{-\infty}^{\infty}x(\tau{})h(t-\tau{})d\tau{}$$
The output of any continuous-time LTI system is the convolution of the input x(t) with the impulse response h(t) of the system.
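In sampled (discrete-time) form the integral becomes a sum, which can be computed directly. The sketch below, assuming NumPy and made-up input and impulse-response sequences, compares `np.convolve` with the sum written out from the definition.

```python
import numpy as np

# a minimal discrete-time sketch: the system output is the input convolved with h
x = np.array([1.0, 2.0, 0.5, -1.0])   # input samples (made up for illustration)
h = np.array([0.5, 0.25, 0.125])      # impulse response (made up for illustration)

y = np.convolve(x, h)                 # y[n] = sum_k x[k] * h[n - k]

# the same result, written out explicitly from the definition
y_manual = np.zeros(len(x) + len(h) - 1)
for n in range(len(y_manual)):
    for k in range(len(x)):
        if 0 <= n - k < len(h):
            y_manual[n] += x[k] * h[n - k]

print(np.allclose(y, y_manual))       # True
```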
Properties of the convolution integral¶
The convolution integral has the following properties:
Commutative¶
The order in which two signals are convolved makes no difference; the results are identical:
$$x(t)*h(t)=h(t)*x(t)$$
Associative¶
$$\{x(t)*h_1(t)\}*h_2(t)=x(t)*\{h_1(t)*h_2(t)\}$$
Distributive¶
$$x(t)*\{h_1(t)+h_2(t)\}=x(t)*h_1(t)+x(t)*h_2(t)$$
Derivative¶
If,
$$y(t)=x(t)*h(t)$$
Then,
$$y'(t)=x'(t)*h(t)=x(t)*h'(t)$$
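These algebraic properties carry over to discrete convolution and can be checked numerically. The sketch below, assuming NumPy and arbitrary random sequences, verifies the commutative, associative, and distributive properties; the derivative property is stated for continuous time and is not checked here.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(50)
h1 = rng.standard_normal(20)
h2 = rng.standard_normal(20)

# commutative: x * h1 == h1 * x
print(np.allclose(np.convolve(x, h1), np.convolve(h1, x)))

# associative: (x * h1) * h2 == x * (h1 * h2)
print(np.allclose(np.convolve(np.convolve(x, h1), h2),
                  np.convolve(x, np.convolve(h1, h2))))

# distributive: x * (h1 + h2) == x * h1 + x * h2   (h1 and h2 of equal length)
print(np.allclose(np.convolve(x, h1 + h2),
                  np.convolve(x, h1) + np.convolve(x, h2)))
```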
Frequency domain¶
The convolution theorem states that convolution in the time domain corresponds to multiplication in the frequency domain. That is, if f(t) has the Fourier transform F(ω) and x(t) has the Fourier transform X(ω), then the convolution f(t)*x(t) has the Fourier transform F(ω)X(ω).
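A quick numerical check of the convolution theorem, assuming NumPy: zero-pad both signals to the full output length, multiply their FFTs, and compare the inverse transform with direct convolution.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(128)
h = rng.standard_normal(32)

# direct (time-domain) convolution
y_time = np.convolve(x, h)

# frequency domain: multiply FFTs, zero-padded to the full output length
n = len(x) + len(h) - 1
y_freq = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(h, n), n)

print(np.allclose(y_time, y_freq))   # True
```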
Central Limit Theorem¶
The central limit theorem mathematically explains why the Gaussian probability distribution is observed so commonly in nature.
Examples of Gaussian distributions in nature:
- Human height
- Amplitude of thermal noise
- Cross-sectional intensity of a laser beam
- Pattern of holes around a dart board bull's eye
In its most basic form, the central limit theorem says that when a variable is formed by adding many random processes, the result tends to follow a Gaussian distribution. Even if the individual processes themselves are not Gaussian, their sum will approach a Gaussian form.
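The link to convolution is that the probability density of a sum of independent random variables is the convolution of their individual densities, so adding terms corresponds to repeated convolution. The sketch below, assuming NumPy and a uniform density on [0, 1) as the (non-Gaussian) starting point, convolves the density with itself twelve times and compares the result with a Gaussian of the same mean and variance.

```python
import numpy as np

dx = 0.001
grid = np.arange(0.0, 1.0, dx)
f = np.ones_like(grid)                 # uniform density on [0, 1)

n_terms = 12
pdf = f.copy()
for _ in range(n_terms - 1):
    pdf = np.convolve(pdf, f) * dx     # density of the sum after adding one more term

t = np.arange(len(pdf)) * dx           # support of the sum: roughly [0, n_terms]
mu, sigma = n_terms * 0.5, np.sqrt(n_terms / 12.0)
gauss = np.exp(-((t - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

print(np.max(np.abs(pdf - gauss)))     # small: the repeated convolution is nearly Gaussian
```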
Correlation¶
In signal processing, correlation is a measure of similarity between two signals, revealing how alike they are as one is shifted in time (lag). It helps detect patterns, find time delays (like echo/radar), identify periodicities, and locate signals hidden in noise. A high correlation value indicates strong similarity at a specific time shift.
Note
Correlation is a mathematical operation that is very similar to convolution. Just as with convolution, correlation uses two signals to produce a third signal. In convolution, one of the signals is mirrored about the y-axis and time-shifted; in correlation there is no mirroring. Convolution describes how a system’s input, output, and impulse response are related, whereas correlation is used to identify a known signal hidden within noise. The fact that their mathematics looks similar is essentially a coincidence; the two operations serve different purposes.
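The relationship can be seen numerically: correlating two sequences gives the same result as convolving one of them with the time-reversed other. A minimal sketch, assuming NumPy and short made-up sequences:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 1.0, 0.5])

# correlation equals convolution with the second signal time-reversed
corr = np.correlate(x, y, mode="full")
conv = np.convolve(x, y[::-1], mode="full")
print(np.allclose(corr, conv))         # True
```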
Cross-correlation¶
Cross-correlation in signal processing measures the similarity between two different signals as a function of time shift (lag), essentially a sliding dot product that reveals how related one signal is to a delayed version of another.
$$R_{xy}(\tau{})=\int_{-\infty{}}^{\infty{}}x(t)y(t-\tau{})dt$$
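Cross-correlation is commonly used to estimate the delay between a signal and a shifted, noisy copy of it: the lag at which the cross-correlation peaks is the delay estimate. A sketch, assuming NumPy, a made-up random reference signal, and a hypothetical 25-sample delay:

```python
import numpy as np

rng = np.random.default_rng(0)

# a random reference signal and a delayed, noisy copy of it
delay = 25                              # hypothetical delay in samples
x = rng.standard_normal(500)
y = np.concatenate([np.zeros(delay), x]) + 0.2 * rng.standard_normal(500 + delay)

# full cross-correlation; the lag axis runs from -(len(x)-1) to len(y)-1
r = np.correlate(y, x, mode="full")
lags = np.arange(-(len(x) - 1), len(y))
print("estimated delay:", lags[np.argmax(r)])   # expected: 25
```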
Auto-correlation¶
Autocorrelation in signal processing measures a signal's similarity with a delayed (shifted) copy of itself. It helps identify periodicities and repeating patterns within a signal.
$$R_{xx}(\tau{})=\int_{-\infty{}}^{\infty{}}x(t)x(t-\tau{})dt$$
- For pure white noise, the autocorrelation function (ACF) is zero everywhere except at zero lag, forming a Dirac delta. Colored (correlated) noise instead shows peaks or a gradual decay at non-zero lags, revealing hidden structure and helping distinguish structured noise from pure randomness (see the sketch after this list).
- It is useful for pitch detection in speech processing by finding the fundamental period of voiced segments.
- In image processing, applying the ACF to a noisy image can show whether the noise is purely random (a single bright pixel at the center of the ACF) or has structure (streaks, textures).
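A sketch of the first bullet above, assuming NumPy: the sample autocorrelation of white noise is essentially an impulse at zero lag, while a noisy periodic signal shows a clear peak again at the lag equal to its period (64 samples in this made-up example).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024

def acf(x):
    """Biased sample autocorrelation for non-negative lags, normalized so acf[0] == 1."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:]   # keep lags 0 .. len(x)-1
    return r / r[0]

white = rng.standard_normal(n)
periodic = np.sin(2 * np.pi * np.arange(n) / 64) + 0.5 * rng.standard_normal(n)

print(np.round(acf(white)[:5], 2))               # ~[1, 0, 0, 0, 0]: delta-like ACF
print(np.round(acf(periodic)[[0, 64, 128]], 2))  # large values recur at multiples of the period
```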