The Central Limit Theorem assures that Maximum Likelihood Estimators are, asymptotically, multivariate normally distributed. As a consequence of that normal behavior, twice the negative log likelihood ratio has, asymptotically, a chi-square distribution. That means we can use the log likelihood ratio as a measure of closeness in the neighborhood of the MLE itself, and thus we have a criterion for constructing confidence bounds.
(It's kinda useful.)
The normal distribution is the parent density for many other distributions, and has a close familial relationship with many more. For example, the sum of squares of n independent samples from the standard normal distribution has a chi-square distribution with n degrees of freedom, χ²(n).
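A quick simulation makes the relationship concrete. This is a sketch using numpy and scipy (not from the original text): sums of squares of k standard normals are compared against the χ²(k) distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k = 5           # number of standard normals in each sum of squares
n = 100_000     # number of simulated sums

# Each row: k standard normal draws, squared and summed.
sums_of_squares = (rng.standard_normal((n, k)) ** 2).sum(axis=1)

# A chi-square(k) variable has mean k, and its empirical 95th
# percentile should match the theoretical chi2 quantile.
print("sample mean:", round(sums_of_squares.mean(), 2))
print("empirical 95th pct:", round(np.quantile(sums_of_squares, 0.95), 2))
print("chi2(0.95; k):      ", round(stats.chi2.ppf(0.95, df=k), 2))
```

The empirical mean lands near k and the empirical quantile near the theoretical χ² quantile, as the parent-distribution relationship predicts.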
Now, if something has a probability density, then we can evaluate the probability that it takes on values as extreme as, or more extreme than, the value we are interested in. In our case we have the distribution of the logarithm of the ratio of the likelihood at a Weibull parameter pair (η, β) to its maximum value, the likelihood at the MLEs. We can move the (η, β) pair away from its maximum likelihood values and see the effect on the Weibull model. If we don't move too far, the resulting model will still be plausible, but not optimal (given the data).
How far is too far? If we choose a 95% confidence neighborhood, then the allowed drop in log likelihood is χ²(0.95; 2)/2 (evaluated at 2 degrees-of-freedom, since we have two model parameters)(1).
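That threshold is a single number, easy to compute. A minimal sketch using scipy (an assumed tool, not named in the original):

```python
from scipy.stats import chi2

# 95% confidence, 2 model parameters -> 2 degrees of freedom.
# A parameter pair stays in the confidence neighborhood as long as its
# log likelihood is within chi2.ppf(0.95, 2) / 2 of the maximum.
threshold = chi2.ppf(0.95, df=2) / 2
print(round(threshold, 4))   # ~2.9957
```

So any model whose log likelihood is within about 3 of the maximum is still plausible at the 95% level.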
We now have everything we need to construct a confidence neighborhood around the MLEs for the (η, β) pair. Every point on the boundary of that neighborhood will correspond to an (η, β) pair, and each pair represents a Weibull model. Construct all the models: the locus of their extremes is the corresponding 95% confidence bound on the Weibull model.
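The whole recipe can be sketched in a few lines. This is an illustrative sketch, not the author's code: the failure-time data are synthetic, the grid ranges are arbitrary choices, and scipy's `weibull_min` parameterization (shape `c` = β, `scale` = η) is assumed.

```python
import numpy as np
from scipy import stats

# Hypothetical failure-time data; in practice substitute your own sample.
data = stats.weibull_min.rvs(c=2.0, scale=100.0, size=30,
                             random_state=np.random.default_rng(1))

# MLEs for the shape (beta) and scale (eta); location fixed at zero.
beta_hat, _, eta_hat = stats.weibull_min.fit(data, floc=0)

def loglik(beta, eta):
    """Weibull log likelihood of the sample at shape beta, scale eta."""
    return stats.weibull_min.logpdf(data, c=beta, scale=eta).sum()

# Pairs whose log likelihood is within chi2(0.95; 2)/2 of the maximum
# form the 95% joint confidence neighborhood around the MLEs.
cutoff = loglik(beta_hat, eta_hat) - stats.chi2.ppf(0.95, df=2) / 2

betas = np.linspace(0.5 * beta_hat, 1.8 * beta_hat, 120)
etas = np.linspace(0.7 * eta_hat, 1.4 * eta_hat, 120)
inside = [(b, e) for b in betas for e in etas if loglik(b, e) >= cutoff]

# Each boundary pair defines a Weibull CDF; sweeping over all of them
# and taking the extremes at each time gives the confidence bound.
print(f"MLEs: beta = {beta_hat:.3f}, eta = {eta_hat:.3f}")
print(f"{len(inside)} grid points fall inside the 95% neighborhood")
```

In a real analysis you would trace the boundary of `inside`, evaluate each boundary model's CDF over a range of times, and take the pointwise extremes as the confidence bound on the fitted Weibull.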