# Signal-to-noise ratio

**Signal-to-noise ratio** generically means the dimensionless ratio of the signal power to the noise power contained in a recording.
Abbreviated **SNR** by engineers and scientists, the signal-to-noise ratio parameterizes the performance of optimal signal processing systems when the noise is Gaussian.

## Basics

The signal \(s(t)\) may or may not have a stochastic description; the noise \(N(t)\) always does. When the signal is deterministic, its power \(P_s\) is defined to be \[P_s = \frac{1}{T} \int_{0}^{T}\!\! s^2(t)\,dt\] where \(T\) is the duration of an observation interval, which could be infinite (in which case a limit needs to be evaluated).

Special terminology is used for periodic signals.
In this case, the interval \(T\) equals the signal's period and the signal's **root mean square** (rms) value equals the square root of its power.
For example, the sinusoid \(A \sin 2\pi f_0 t\) has an rms value equal to \(A/\sqrt{2}\) and power \(A^2/2\ .\)
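As a quick numerical check (a sketch, not part of the original text), the power and rms value of a sampled sinusoid can be computed directly; the amplitude and frequency below are arbitrary choices:

```python
# Verify numerically that A*sin(2*pi*f0*t) has power A^2/2 and rms A/sqrt(2)
# over one period. Amplitude and frequency are arbitrary illustrative values.
import math

A, f0 = 3.0, 5.0          # amplitude and frequency (assumed for illustration)
T = 1.0 / f0              # one period
n = 100_000               # number of samples in the Riemann sum
dt = T / n

# Midpoint-rule approximation of (1/T) * integral of s^2(t) dt over [0, T]
power = sum((A * math.sin(2 * math.pi * f0 * (k + 0.5) * dt)) ** 2
            for k in range(n)) * dt / T
rms = math.sqrt(power)

print(power, A ** 2 / 2)        # the two power values agree closely
print(rms, A / math.sqrt(2))    # the two rms values agree closely
```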

When the signal is a stationary stochastic process, its power is defined to be the value of its correlation function \(R_s(\tau)\) at the origin.
\[R_s(\tau) \equiv \mathsf{E}[s(t)s(t+\tau)];\quad P_s = R_s(0)\]
Here, \(\mathsf{E}[\cdot]\) denotes expected value.
The noise power \(P_N\) is similarly related to its correlation function: \[P_N = R_N(0)\ .\]
The signal-to-noise ratio is typically written as **SNR** and equals
\[\mathrm{SNR}=\frac{P_s}{P_N}\ .\]
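The ratio \(P_s/P_N\) can be estimated from sampled waveforms by averaging squared values. The sketch below uses an assumed sinusoidal signal and unit-variance Gaussian noise; none of the specific values come from the article:

```python
# Estimate SNR = P_s / P_N from samples, using the average of squared
# samples as the power estimate. Signal and noise parameters are assumptions.
import math
import random

random.seed(0)
n = 200_000
A = 2.0                    # sinusoid amplitude, so P_s = A^2/2 = 2

# Deterministic sinusoid sampled over many full periods
signal = [A * math.sin(2 * math.pi * 0.01 * k) for k in range(n)]
# Zero-mean Gaussian noise with variance 1, so P_N is about 1
noise = [random.gauss(0.0, 1.0) for _ in range(n)]

P_s = sum(x * x for x in signal) / n   # approximately A^2/2 = 2
P_N = sum(x * x for x in noise) / n    # approximately 1
snr = P_s / P_N
print(snr)                             # near 2
```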

Signal-to-noise ratio is also defined for random variables in one of two ways.

- \(X = s+N\ ,\) where \(s\ ,\) the signal, is a constant and \(N\) is a random variable having an expected value equal to zero. The SNR equals \(s^2/\sigma^2_N\ ,\) with \(\sigma^2_N\) the variance of \(N\ .\)
- \(X = S+N\ ,\) where both \(S\) and \(N\) are random variables.

In the second definition, a random variable's power equals its mean-squared value: the signal power thus equals \(\mathsf{E}[S^2]\ .\) Usually, the noise has zero mean, which makes its power equal to its variance \(\sigma^2_N\ .\) Thus, the SNR equals \(\mathsf{E}[S^2]/\sigma^2_N\ .\)
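The first definition can be illustrated with a short Monte Carlo sketch; the constant \(s\) and the noise standard deviation below are assumptions chosen for illustration:

```python
# First definition: X = s + N with constant signal s and zero-mean noise N.
# The SNR is s^2 / var(N); here both quantities are estimated from samples.
import random

random.seed(1)
s = 3.0                    # constant signal (assumed)
sigma = 1.5                # noise standard deviation (assumed)
samples = [s + random.gauss(0.0, sigma) for _ in range(100_000)]

mean = sum(samples) / len(samples)                           # estimates s
var = sum((x - mean) ** 2 for x in samples) / len(samples)   # estimates sigma^2
snr_est = mean ** 2 / var

print(snr_est, s ** 2 / sigma ** 2)    # both near 4
```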

## White Noise

When we have white noise, the noise correlation function equals \(N_0/2\cdot\delta(\tau)\ ,\) where \(\delta(\tau)\) is known both as Dirac's delta function and as an impulse.
The quantity \(N_0/2\) is the **spectral height** of the white noise and corresponds to the (constant) value of the noise power spectrum at all frequencies.
White noise has infinite power, so the SNR as defined above equals zero.
White noise cannot physically exist because of its infinite power, but engineers frequently use it to describe noise that has a power spectrum that extends well beyond the signal's bandwidth.
When white noise is assumed present, optimal signal processing systems can sometimes take it into account and their performance typically depends on a modified definition of signal-to-noise ratio.
When the signal is deterministic, the SNR is taken to be
\[\mathrm{SNR}=\frac{\int\!\! s^2(t)\,dt}{N_0/2}\ .\]
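A minimal sketch of this modified definition, assuming a rectangular pulse and an arbitrarily chosen spectral height \(N_0/2\) (none of these values come from the article):

```python
# Modified SNR for a deterministic signal in white noise: the numerator is
# the signal energy (integral of s^2), not its power. All values assumed.

N0 = 2e-3                  # noise spectral density, so N0/2 = 1e-3 (assumed)
A, T = 1.0, 0.01           # rectangular pulse: amplitude A over duration T

energy = A ** 2 * T        # integral of s^2(t) dt for the pulse
snr = energy / (N0 / 2)
print(snr)                 # energy / (N0/2) = 0.01 / 0.001 = 10
```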

## Peak Signal-to-Noise Ratio (PSNR)

In image processing, signal-to-noise ratio is defined differently.
Here, the numerator is the square of the peak value the signal *could* have and the denominator equals the noise power (noise variance).
For example, an 8-bit image has values ranging between 0 and 255.
For PSNR calculations, the numerator is \(255^2\) in all cases.
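A sketch of the PSNR computation for an 8-bit image, using a made-up 3×3 reference image and a noisy version of it; the mean squared error between the two serves as the noise-power estimate:

```python
# PSNR for an 8-bit image: numerator is the squared peak value 255^2,
# denominator is the noise power, estimated here as the mean squared error
# between a noisy image and its reference. The pixel values are made up.
import math

reference = [[52, 55, 61], [62, 59, 55], [63, 65, 66]]   # toy 3x3 image
noisy = [[54, 55, 60], [61, 60, 57], [63, 64, 66]]

n = sum(len(row) for row in reference)
mse = sum((a - b) ** 2
          for ra, rb in zip(reference, noisy)
          for a, b in zip(ra, rb)) / n

psnr_db = 10 * math.log10(255 ** 2 / mse)
print(psnr_db)             # roughly 47 dB for this toy example
```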

## Expressing Signal-to-Noise Ratios in Decibels

Engineers frequently express SNR in decibels as \[\mathrm{SNR} (\mathrm{dB}) = 10 \log_{10}\frac{P_s}{P_N}\ .\] Engineers consider an SNR of 2 (3 dB) to be the boundary between low and high SNRs. In image processing, the PSNR must be greater than about 20 dB for a picture to be considered high quality.
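The conversion to decibels can be sketched as follows; the helper function name is an illustrative choice, not standard terminology:

```python
# Convert a power ratio to decibels: SNR(dB) = 10 * log10(P_s / P_N).
import math

def snr_db(p_signal, p_noise):
    """Express the signal-to-noise power ratio in decibels."""
    return 10 * math.log10(p_signal / p_noise)

print(snr_db(2.0, 1.0))    # about 3.01 dB, the low/high SNR boundary
print(snr_db(100.0, 1.0))  # 20 dB
```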

## Interference

These definitions implicitly assume that the signal and the noise are statistically unrelated and arise from different sources.
In many applications, some part of what is not signal arises from man-made sources and can be statistically related to the signal.
For example, a cellular telephone's signal can be corrupted by other telephone signals as well as noise.
Such non-signals are termed **interference** and a **signal-to-interference ratio**, abbreviated SIR, can be defined accordingly.
However, when both interference and noise are present, neither the SIR nor the SNR characterizes the performance of signal processing systems.

## See Also

Entropy, Gaussian Process, Mutual Information, Noise, Signal to Noise Ratio in Neuroscience