
 

Bootstrap and Jackknife

 

Bootstrap and Jackknife algorithms don't really give you something for nothing. They give you something you previously ignored.

The jackknife is an algorithm for re-sampling from an existing sample to estimate the behavior of that single sample's statistics. For a sample of size n, omit the 1st, 2nd, 3rd, ..., nth observation in turn and compute the n averages of the remaining (n-1) observations. These leave-one-out averages, with their sum of squared deviations about their own mean multiplied by (n-1)/n, give the jackknife estimate of the variance of the original sample mean.
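
As a concrete illustration (not part of the original page), here is a minimal sketch in Python with NumPy, using a hypothetical normally distributed sample; for the sample mean, the jackknife formula reproduces the familiar s^2/n exactly:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(loc=10.0, scale=2.0, size=30)   # hypothetical sample
    n = len(x)

    # Leave-one-out averages: omit observation i, average the remaining n-1.
    loo_means = np.array([np.delete(x, i).mean() for i in range(n)])

    # Jackknife estimate of the variance of the sample mean:
    # (n-1)/n times the sum of squared deviations of the leave-one-out means.
    var_jack = (n - 1) / n * np.sum((loo_means - loo_means.mean()) ** 2)

    # For the mean, this equals the usual s^2/n exactly.
    print(var_jack, x.var(ddof=1) / n)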

The bootstrap is a generalization of the jackknife that re-samples from the original sample, with replacement, some number of times (say 1000), computing the statistic of interest from each re-sample; the spread of these re-sampled statistics estimates the sampling variability of the original statistic. (For example, the 95th percentile is estimated by the 950th of the 1000 ordered re-sampled values.) The name, of course, comes from the method's apparent ability to pull itself up by its own bootstraps. (In Rudolf Erich Raspe's tale, Baron Munchausen had fallen to the bottom of a deep lake, and just as he was about to succumb to his fate, he thought to pull himself up by his own bootstraps.)
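
In the same hedged spirit, a minimal Python sketch of this percentile bootstrap, assuming a hypothetical sample and 1000 re-samples of the mean:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(loc=10.0, scale=2.0, size=30)   # hypothetical sample
    B = 1000                                       # number of re-samples

    # Re-sample with replacement B times; compute the statistic (here, the mean).
    boot_means = np.array([rng.choice(x, size=len(x), replace=True).mean()
                           for _ in range(B)])

    boot_means.sort()
    se_boot = boot_means.std(ddof=1)          # bootstrap standard error of the mean
    pctl_95 = boot_means[int(0.95 * B) - 1]   # 95th percentile = 950th ordered value
    print(se_boot, pctl_95)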

_____________

Note:

While this simple algorithm works quite well, it is termed a "naive" bootstrap because easily implemented improvements can be made to reduce potential bias.
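
One such easily implemented improvement is the bootstrap bias correction: subtract the bootstrap estimate of bias (the mean of the re-sampled statistics minus the original statistic) from the original statistic. A minimal sketch, again with a hypothetical sample, using the deliberately biased maximum-likelihood variance as the statistic of interest:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(loc=10.0, scale=2.0, size=30)   # hypothetical sample
    B = 1000

    # A statistic with known small-sample bias: the maximum-likelihood
    # variance (divisor n rather than n-1).
    theta_hat = x.var(ddof=0)
    boot = np.array([rng.choice(x, size=len(x), replace=True).var(ddof=0)
                     for _ in range(B)])

    # Bootstrap bias estimate: mean of re-sampled statistics minus the
    # original statistic; subtracting it gives the corrected estimate.
    bias = boot.mean() - theta_hat
    theta_corrected = theta_hat - bias        # = 2*theta_hat - boot.mean()
    print(theta_hat, bias, theta_corrected)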

Reference:

Efron, Bradley, and Robert J. Tibshirani. An Introduction to the Bootstrap. Chapman and Hall, 1993.

 


Copyright © 1998-2008 Charles Annis, P.E.
Last modified: June 08, 2014