Chebyshev's inequality

(ˈtʃɛbɪˌʃɒfs)
n
(Statistics) the fundamental theorem that the probability that a random variable differs from its mean by more than k standard deviations is less than or equal to 1/k²
[named after P. L. Chebyshev (1821–94), Russian mathematician]
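
In symbols, for a random variable X with finite mean μ and standard deviation σ, the definition above is the standard bound (requiring only that the variance exists):

\[
P\bigl(\lvert X - \mu \rvert \ge k\sigma\bigr) \;\le\; \frac{1}{k^{2}}, \qquad k > 0 .
\]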
References in periodicals archive
What is interesting about (2) in Theorem 8 is the possibility of combining it with Chebyshev's inequality to obtain rates of convergence.
To provide confidence in the risk estimation, an upper bound of the probability is determined using Chebyshev's inequality.
It is stressed that Chebyshev's inequality is rarely used to set confidence intervals for mean values (in our case, the average fuel efficiency for both the experimental and the calculated data).
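
As a rough sketch of the kind of Chebyshev-based tail bound and confidence interval mentioned in the two excerpts above (the data values, threshold, and variable names here are illustrative assumptions, not taken from the cited studies):

import statistics

# Hypothetical sample of measurements (illustrative values only).
data = [2.1, 1.8, 2.4, 2.0, 1.9, 2.6, 2.2, 1.7, 2.3, 2.5]

# Treat the sample mean and standard deviation as the true values,
# purely for illustration.
mean = statistics.mean(data)
std = statistics.stdev(data)

# Chebyshev bound: P(|X - mean| >= k*std) <= 1/k**2 for any k > 0.
threshold = 1.0                     # hypothetical deviation of interest
k = threshold / std
tail_bound = min(1.0, 1.0 / k**2)

# Distribution-free "95% interval": at least 95% of the probability
# mass lies within k95 = sqrt(1/0.05) standard deviations of the mean.
k95 = (1 / 0.05) ** 0.5
interval = (mean - k95 * std, mean + k95 * std)

print(f"P(|X - {mean:.2f}| >= {threshold}) <= {tail_bound:.3f}")
print(f"Chebyshev 95% interval: ({interval[0]:.2f}, {interval[1]:.2f})")

Because the bound assumes nothing about the distribution beyond a finite variance, the resulting intervals are much wider than, say, normal-theory intervals, which is why the excerpt above notes that Chebyshev's inequality is rarely used for this purpose.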
Using Chebyshev's inequality for sin A/2, sin B/2, sin C/2 and cos A/2, cos C/2, we get
Recall that Chebyshev's inequality is a useful tool for proving that a random variable is sharply concentrated about its mean value.
To prove the law of large numbers, use Chebyshev's inequality.
Berry's improvement on Peddada's sufficient condition was derived using Chebyshev's inequality.
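
The weak law of large numbers mentioned above follows from Chebyshev's inequality in a few lines. Assuming X_1, ..., X_n are independent and identically distributed with mean μ and finite variance σ², the sample mean \bar{X}_n has variance σ²/n, so for any ε > 0:

\[
P\bigl(\lvert \bar{X}_n - \mu \rvert \ge \varepsilon\bigr)
  \;\le\; \frac{\operatorname{Var}(\bar{X}_n)}{\varepsilon^{2}}
  \;=\; \frac{\sigma^{2}}{n\varepsilon^{2}}
  \;\longrightarrow\; 0 \quad \text{as } n \to \infty,
\]

so \bar{X}_n converges to μ in probability.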