Regularity Conditions in Estimation

As is common in statistics, many core ideas in asymptotic analysis are intuitive and simple. For example, the proofs of the Delta Method and of the asymptotic normality of M-estimators rely primarily on Taylor expansions. However, to make these proofs rigorous, numerous “regularity conditions” are needed. These conditions share several characteristics:

  • Not the main focus of the analysis
  • Make functions or variables behave well, thereby validating the proof
  • Most examples of interest satisfy these conditions
  • Not tight: some conditions may be relaxed, but doing so may complicate the statement or proof
  • Not unique: there can be different sets of regularity conditions that work for the same proof
  • Different sets of regularity conditions may yield results with varying levels of generality or applicability

Some general regularity conditions used in estimation include:

  1. Uniform support: The support of the PDF $f(x;\theta)$ does not depend on $\theta$ (see the counterexample after this list, where this condition fails).
  2. Identifiability: $\theta_1 \neq \theta_2 \implies f(\cdot\,;\theta_1) \neq f(\cdot\,;\theta_2)$.
  3. Interior: The parameter space $\Theta$ is finite or an open interval; the true parameter $\theta_0$ is not on the boundary of $\Theta$.
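As a concrete illustration of why (uniform support) matters, here is the classical $\mathrm{Uniform}(0,\theta)$ example (a sketch added for illustration, not part of the original conditions), where the support depends on $\theta$ and the usual $\sqrt{n}$-normal asymptotics fail:

```latex
% Uniform(0, theta): the support [0, theta] depends on theta, so
% (uniform support) fails and the MLE is not asymptotically normal.
\[
  f(x;\theta) = \tfrac{1}{\theta}\,\mathbf{1}\{0 \le x \le \theta\},
  \qquad
  \hat\theta_n = \max_{1 \le i \le n} X_i .
\]
% The MLE converges at rate n with an exponential (not normal) limit:
\[
  \Pr\!\big(n(\theta_0 - \hat\theta_n) > t\big)
  = \Big(1 - \tfrac{t}{n\theta_0}\Big)^{n}
  \;\longrightarrow\; e^{-t/\theta_0}.
\]
```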

For Maximum Likelihood Estimation

  • 💡 MLE is a special M-Estimator. One can also review the regularity conditions for M-Estimators. Here we discuss the regularity conditions for MLE specifically.

Given i.i.d. data $X_1,\dots,X_n \sim P$, MLE uses the log-likelihood as its objective function: $\hat\theta_n = \arg\max_{\theta \in \Theta} \ell_n(\theta)$, where $\ell_n(\theta) = \frac{1}{n}\sum_{i=1}^n \log f(X_i;\theta)$.
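A minimal numerical sketch of this objective (not from the original note; the exponential-rate model, seed, and variable names are illustrative assumptions): maximize the average log-likelihood numerically and compare with the closed-form MLE.

```python
# Minimal sketch: the MLE maximizes the average log-likelihood.
# Assumed model: Exponential(rate = theta), so log f(x; theta) = log(theta) - theta*x
# and the closed-form MLE is 1 / mean(x).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
theta0 = 2.0
x = rng.exponential(scale=1.0 / theta0, size=1000)

def neg_avg_loglik(theta):
    # -(1/n) * sum_i log f(x_i; theta)
    return -np.mean(np.log(theta) - theta * x)

res = minimize_scalar(neg_avg_loglik, bounds=(1e-6, 100.0), method="bounded")
print(res.x, 1.0 / x.mean())  # numerical MLE vs. closed-form MLE
```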

Consistency

As with M-Estimators, we need a stronger identifiability condition and a stronger LLN condition:

  1. Separation: For any $\epsilon > 0$, $\sup_{\theta:\,\|\theta - \theta_0\| \ge \epsilon} \ell(\theta) < \ell(\theta_0)$, where $\ell(\theta) = \mathbb{E}_P[\log f(X;\theta)]$.
  2. Uniform convergence: $\sup_{\theta \in \Theta} |\ell_n(\theta) - \ell(\theta)| \xrightarrow{p} 0$.
  • ❗️ Throughout the note, we do not need well-specification. The expectation is taken under the data-generating distribution $P$, and $\theta_0 = \arg\max_{\theta \in \Theta} \mathbb{E}_P[\log f(X;\theta)]$.
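Equivalently (a standard identity, assuming $P$ has a density $p$), this $\theta_0$ is the KL projection of $P$ onto the model:

```latex
% theta_0 maximizes the expected log-likelihood iff it minimizes the
% KL divergence from P to the model, since E_P[log p(X)] does not depend on theta:
\[
  \arg\max_{\theta \in \Theta} \mathbb{E}_P\big[\log f(X;\theta)\big]
  = \arg\min_{\theta \in \Theta}
    \Big( \mathbb{E}_P[\log p(X)] - \mathbb{E}_P[\log f(X;\theta)] \Big)
  = \arg\min_{\theta \in \Theta} \mathrm{KL}\big(P \,\|\, f(\cdot\,;\theta)\big).
\]
```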

Under (separation) and (uniform convergence), the MLE is consistent: $\hat\theta_n \xrightarrow{p} \theta_0$.
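A quick simulation sketch of consistency (illustrative only; the exponential-rate model, seed, and sample sizes are assumptions, not from the note):

```python
# The exponential-model MLE, 1/mean(x), approaches theta0 as n grows.
import numpy as np

rng = np.random.default_rng(1)
theta0 = 2.0
for n in [100, 1_000, 10_000, 100_000]:
    x = rng.exponential(scale=1.0 / theta0, size=n)
    theta_hat = 1.0 / x.mean()          # closed-form MLE for the rate
    print(n, abs(theta_hat - theta0))   # error shrinks roughly like 1/sqrt(n)
```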


We list a set of sufficient conditions for (separation) and (uniform convergence); they are checked on a concrete example after the list:

  1. Compactness: $\Theta$ is compact.
  2. Continuity: $\log f(x;\theta)$ and $\ell(\theta)$ are continuous in $\theta$.
  3. Lower bound: There exists an integrable function $g$, i.e. $\mathbb{E}_P|g(X)| < \infty$, such that $\log f(x;\theta) \ge g(x)$ for all $\theta \in \Theta$.
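As a quick check of these three conditions in the form stated above (an illustrative sketch; the Gaussian location model and the constant $B$ are assumptions introduced here), take $f(x;\theta) = \mathcal{N}(\theta, 1)$ with $\Theta = [-B, B]$:

```latex
% Gaussian location model on a compact parameter set Theta = [-B, B]:
\[
  \log f(x;\theta) = -\tfrac{1}{2}(x-\theta)^2 - \tfrac{1}{2}\log(2\pi).
\]
% (compactness)  Theta = [-B, B] is compact;
% (continuity)   theta -> log f(x;theta) is continuous for every x;
% (lower bound)  since (x - theta)^2 <= (|x| + B)^2 for all theta in [-B, B],
\[
  \log f(x;\theta) \;\ge\; -\tfrac{1}{2}(|x| + B)^2 - \tfrac{1}{2}\log(2\pi) \;=:\; g(x),
  \qquad \mathbb{E}_P|g(X)| < \infty
\]
% whenever X has a finite second moment under P.
```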

Asymptotic Normality

For asymptotic normality, we aim for the stronger result

$$
\sqrt{n}\,(\hat\theta_n - \theta_0) \xrightarrow{d} \mathcal{N}\big(0,\; I(\theta_0)^{-1}\big),
$$

where $I(\theta_0)$ is the Fisher Information matrix at $\theta_0$. For this stronger result to hold, we generally require strong continuity conditions that allow the exchange of integration and differentiation, which is useful in the calculation of the Fisher Information.
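Here is a sketch (for a scalar parameter, and only heuristic) of the Taylor-expansion argument that the conditions listed below make rigorous:

```latex
% Heuristic: since theta_0 is interior, the MLE solves the score equation
% ell_n'(hat theta_n) = 0.  Expanding around theta_0, for some tilde theta_n
% between hat theta_n and theta_0,
\[
  0 = \ell_n'(\hat\theta_n)
    = \ell_n'(\theta_0) + \ell_n''(\tilde\theta_n)\,(\hat\theta_n - \theta_0),
\]
\[
  \sqrt{n}\,(\hat\theta_n - \theta_0)
  = \big[-\ell_n''(\tilde\theta_n)\big]^{-1} \sqrt{n}\,\ell_n'(\theta_0).
\]
% CLT: sqrt(n) ell_n'(theta_0) is asymptotically normal with variance I(theta_0)
% (under the information identity); LLN + consistency + the smoothness bound give
% -ell_n''(tilde theta_n) -> I(theta_0).  Slutsky then yields N(0, I(theta_0)^{-1}).
```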

Beyond (identifiability), (uniform support), (interior), and the Consistency property, we need:

  1. Smoothness: $\log f(x;\theta)$ is thrice differentiable w.r.t. $\theta$, and $\big|\partial^3_\theta \log f(x;\theta)\big| \le M(x)$ on some neighborhood of $\theta_0$ for some integrable function $M$.
  2. Exchangeability: We can interchange differentiation w.r.t. $\theta$ up to second order and integration over $x$.
  3. Invertibility: The Fisher Information matrix $I(\theta)$ is invertible in a neighborhood of $\theta_0$.
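A simulation sketch of the limit (illustrative assumptions: exponential-rate model, for which $I(\theta) = 1/\theta^2$; seed and sizes chosen arbitrarily): the empirical variance of $\sqrt{n}(\hat\theta_n - \theta_0)$ should be close to $I(\theta_0)^{-1} = \theta_0^2$.

```python
# For the exponential-rate model, I(theta) = 1/theta^2, so
# sqrt(n)*(theta_hat - theta0) should have variance close to theta0^2.
import numpy as np

rng = np.random.default_rng(2)
theta0, n, reps = 2.0, 2_000, 5_000
x = rng.exponential(scale=1.0 / theta0, size=(reps, n))
theta_hat = 1.0 / x.mean(axis=1)             # MLE in each replication
z = np.sqrt(n) * (theta_hat - theta0)
print(z.var(), theta0 ** 2)                  # empirical vs. theoretical variance
```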

The above set of regularity conditions is sufficient but not necessary. For example, the (smoothness) condition can be relaxed to first-order continuous differentiability with some additional conditions on the mapping $\theta \mapsto \log f(x;\theta)$. Moreover, (exchangeability) can be implied by (smoothness) together with some additional conditions. See M-Estimator for a more general set of regularity conditions.