Stein’s paradox

Stein’s example, perhaps better known as Stein’s paradox, is a classic result in statistics demonstrating that shrinkage can reduce the mean squared error ($$L_2$$-risk) of a multivariate estimator relative to classical (unbiased) estimators such as the maximum likelihood estimator.
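A brief Monte-Carlo sketch can make the effect concrete. The snippet below compares the squared-error risk of the MLE (the observation itself) with the basic James–Stein estimator, which shrinks the observation toward the origin; the dimension, true mean vector, and random seed are illustrative choices, not anything prescribed by the paradox itself.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10                 # dimension; James-Stein dominates the MLE for d >= 3
theta = np.ones(d)     # true mean vector (hypothetical choice for the demo)
n_trials = 10_000      # number of simulated observations

# One observation X ~ N(theta, I_d) per trial, unit variance per coordinate.
X = rng.normal(theta, 1.0, size=(n_trials, d))

# MLE of theta is simply X. The James-Stein estimator shrinks X toward 0
# by the data-dependent factor (1 - (d - 2) / ||X||^2).
norm_sq = np.sum(X**2, axis=1, keepdims=True)
js = (1.0 - (d - 2) / norm_sq) * X

# Average squared-error loss over trials (estimates the L2-risk).
mse_mle = np.mean(np.sum((X - theta) ** 2, axis=1))
mse_js = np.mean(np.sum((js - theta) ** 2, axis=1))

print(f"MLE risk  ≈ {mse_mle:.3f}")   # theoretical value is exactly d
print(f"J-S risk  ≈ {mse_js:.3f}")    # strictly smaller for d >= 3
```

The MLE's risk here is exactly $$d$$, while the James–Stein estimator's risk is strictly smaller for every value of the true mean once $$d \ge 3$$, which is the content of the paradox: each coordinate is estimated worse on its own, yet the joint estimate improves.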