Abstract:
Most work on the asymptotic properties of least absolute deviations (LAD) estimators makes
use of the assumption that the common distribution of the disturbances has a density which is
finite and positive at zero. We consider the implications of weakening this assumption in a
regression setting. We find that the results are similar in flavor to those obtained in a
least squares context when the disturbance variance is allowed to be infinite: both the shape
of the limiting distribution and the rate of convergence to it are affected in reasonably simple
and intuitive ways. In addition to conventional regression models, we outline results for some
simple autoregressive models that may have a unit root and/or infinite error variance.