Maximum likelihood sequence estimation

Maximum likelihood sequence estimation (MLSE) is a mathematical algorithm that extracts useful data from a noisy data stream by choosing the sequence of transmitted values that is most likely to have produced the observed signal.

Theory

For an optimal detector for digital signals, the priority is not to reconstruct the transmitted waveform but to estimate the transmitted data with the fewest possible errors. The receiver emulates the distorted channel: all possible transmitted data streams are fed into this channel model, and the receiver compares the resulting time responses with the actual received signal to determine the most likely transmitted sequence. In the computationally simplest cases, the root mean square deviation between candidate responses and the received signal can be used as the decision criterion[1] for the lowest error probability.
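
As a concrete illustration of this exhaustive comparison, the sketch below (Python) detects a short block of BPSK symbols sent through an assumed three-tap channel model; the channel taps, noise level, block length, and all names in the code are illustrative assumptions rather than anything specified in the article. A practical receiver would compute the same metric with an efficient trellis search (the Viterbi algorithm) instead of enumerating every candidate sequence.

```python
# Minimal sketch of brute-force MLSE for a short block of binary symbols
# sent through an assumed, known FIR channel model with additive noise.
# Channel taps, alphabet, and block length are illustrative assumptions.
import itertools
import numpy as np

CHANNEL = np.array([1.0, 0.5, 0.2])   # assumed channel impulse response (known to the receiver)
ALPHABET = (-1.0, +1.0)               # BPSK symbols
BLOCK_LEN = 6                         # kept tiny: the exhaustive search grows exponentially

def channel_response(symbols):
    """Emulate the distorted channel for one candidate symbol sequence."""
    return np.convolve(symbols, CHANNEL)[:len(symbols)]

def mlse_detect(received):
    """Return the candidate sequence whose modeled channel response is
    closest to the received signal in the squared-error sense."""
    best_seq, best_metric = None, np.inf
    for candidate in itertools.product(ALPHABET, repeat=BLOCK_LEN):
        candidate = np.array(candidate)
        metric = np.sum((received - channel_response(candidate)) ** 2)
        if metric < best_metric:
            best_seq, best_metric = candidate, metric
    return best_seq

# Usage: transmit a random block, distort it, add Gaussian noise, then detect.
rng = np.random.default_rng(0)
tx = rng.choice(ALPHABET, size=BLOCK_LEN)
rx = channel_response(tx) + 0.3 * rng.standard_normal(BLOCK_LEN)
print("sent     :", tx)
print("detected :", mlse_detect(rx))
```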

Background

Suppose that there is an underlying signal {x(t)}, of which an observed signal {r(t)} is available. The observed signal r is related to x via a transformation that may be nonlinear and may involve attenuation, and would usually involve the incorporation of random noise. The statistical parameters of this transformation are assumed to be known. The problem to be solved is to use the observations {r(t)} to create a good estimate of {x(t)}.

Maximum likelihood sequence estimation is formally the application of maximum likelihood to this problem. That is, the estimate of {x(t)} is defined to be the sequence of values which maximizes the functional

    L(x) = p(r | x),

where p(r | x) denotes the conditional joint probability density function of the observed series {r(t)} given that the underlying series has the values {x(t)}.

In contrast, the related method of maximum a posteriori estimation is formally the application of the maximum a posteriori (MAP) estimation approach. This is more complex than maximum likelihood sequence estimation and requires a known distribution (in Bayesian terms, a prior distribution) for the underlying signal. In this case the estimate of {x(t)} is defined to be the sequence of values which maximizes the functional

    L(x) = p(x | r),

where p(x | r) denotes the conditional joint probability density function of the underlying series {x(t)} given that the observed series has taken the values {r(t)}. Bayes' theorem implies that

    L(x) = p(x | r) = p(r | x) p(x) / p(r).

Since p(r) does not depend on x, maximizing p(x | r) amounts to maximizing p(r | x) p(x), so MAP estimation differs from maximum likelihood sequence estimation only through the prior p(x).

In cases where the contribution of random noise is additive and has a multivariate normal distribution, the problem of maximum likelihood sequence estimation can be reduced to that of a least squares minimization.
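
To make this reduction explicit, one can write the received signal as r = s(x) + n, where s(x) is the noiseless channel response to the candidate sequence x and n is zero-mean Gaussian noise with covariance Σ; this notation is an illustrative assumption rather than the article's. Then:

```latex
% Additive Gaussian noise model: r = s(x) + n, with n ~ N(0, \Sigma)  (amsmath assumed)
\begin{align*}
p(r \mid x) &\propto \exp\!\Big(-\tfrac{1}{2}\,(r - s(x))^{\mathsf T}\Sigma^{-1}(r - s(x))\Big) \\
\hat{x} &= \arg\max_{x}\, p(r \mid x)
         = \arg\min_{x}\, (r - s(x))^{\mathsf T}\Sigma^{-1}(r - s(x)) \\
        &= \arg\min_{x}\, \lVert r - s(x) \rVert^{2}
           \qquad \text{when } \Sigma = \sigma^{2} I \text{ (white noise).}
\end{align*}
```

Maximizing the exponential is equivalent to minimizing the quadratic form in the exponent, which in the white-noise case is exactly the root-mean-square criterion mentioned in the Theory section.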


References

  1. G. Bosco, P. Poggiolini, and M. Visintin, "Performance Analysis of MLSE Receivers Based on the Square-Root Metric," J. Lightwave Technol. 26, 2098–2109 (2008)

Further reading

  • Andrea Goldsmith (2005). "Maximum Likelihood Sequence Estimation". Wireless Communications. Cambridge University Press. pp. 362–364. ISBN 9780521837163.
  • Philip Golden; Hervé Dedieu & Krista S. Jacobsen (2006). Fundamentals of DSL Technology. CRC Press. pp. 319–321. ISBN 9780849319136.
  • Crivelli, D. E.; Carrer, H. S.; Hueda, M. R. (2005) "Performance evaluation of maximum likelihood sequence estimation receivers in lightwave systems with optical amplifiers", Latin American Applied Research, 35 (2), 95–98.
  • Katz, G.; Sadot, D.; Mahlab, U.; Levy, A. (2008) "Channel estimators for maximum-likelihood sequence estimation in direct-detection optical communications", Optical Engineering 47 (4), 045003. doi:10.1117/1.2904827