Risk

Risk is a measure of the performance of a statistical procedure. It quantifies the expected loss incurred by using a particular procedure to make decisions based on data. Formally, given a Statistical Model parameterized by $\theta \in \Theta$ and a loss function

$$L : \Theta \times \Theta \to [0, \infty),$$

the risk of a statistical procedure $\hat{\theta}$ is defined as

$$R(\theta, \hat{\theta}) = \mathbb{E}_\theta\!\left[ L\big(\theta, \hat{\theta}(X)\big) \right].$$

Loss Function and Risk Function

An example of a loss function is the squared error loss $L(\theta, \hat{\theta}) = (\theta - \hat{\theta})^2$; the risk function for this loss is the Mean Squared Error, $\mathrm{MSE}(\hat{\theta}) = \mathbb{E}_\theta\big[(\hat{\theta}(X) - \theta)^2\big]$.
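As a minimal sketch of this definition, the MSE risk can be approximated by Monte Carlo: repeatedly draw data, apply the procedure, and average the squared error. The function name `mse_risk` and the choice of an $N(\theta, 1)$ model with the sample mean as estimator are assumptions for the example, not part of the note.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_risk(theta, n=20, trials=100_000):
    """Monte Carlo approximation of R(theta, mean) = E_theta[(mean(X) - theta)^2].

    Illustrative model (an assumption): X_1, ..., X_n i.i.d. N(theta, 1),
    with the sample mean as the estimator.
    """
    samples = rng.normal(theta, 1.0, size=(trials, n))  # `trials` datasets of size n
    estimates = samples.mean(axis=1)                    # apply the procedure to each dataset
    return float(np.mean((estimates - theta) ** 2))     # average squared loss

# For N(theta, 1) the exact risk of the sample mean is 1/n, here 1/20 = 0.05.
print(mse_risk(theta=2.0, n=20))
```

Note that the simulation requires fixing a value of $\theta$; this is exactly why the risk, as a function of the unknown parameter, cannot simply be minimized in practice.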

Unfortunately, the risk cannot in general be minimized directly, since it depends on the unknown parameter $\theta$ itself (if we knew the actual value of $\theta$, we wouldn't need to estimate it). Therefore, additional criteria are required for defining an estimator that is optimal in some sense:

  • Bayes Risk: $r(Q, \hat{\theta}) = \int_\Theta R(\theta, \hat{\theta}) \, dQ(\theta)$, where $Q$ is a prior distribution on $\Theta$.
- Can incorporate prior knowledge about the parameter $\theta$.
- Can be understood as the *average* or weighted risk, with $Q$ being the weight.
  • Minimax Risk: $\sup_{\theta \in \Theta} R(\theta, \hat{\theta})$.
- Suitable for deterministic parameters.
- Can be understood as the *worst-case* risk.

A Bayes/minimax (optimal) estimator is a procedure that minimizes the Bayes/minimax risk, respectively. See also Bayes Optimality.
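The worst-case criterion can be made concrete with a small sketch, under assumptions not in the original note: estimating a Bernoulli parameter $p$ from $n$ coin flips under squared error loss. The risk of each estimator is computed exactly via the binomial pmf, and the sample proportion is compared against the classical shrinkage estimator $(K + \sqrt{n}/2)/(n + \sqrt{n})$, which is known to be minimax for this problem (its risk is the constant $1/(4(\sqrt{n}+1)^2)$).

```python
import math
import numpy as np

n = 10
k = np.arange(n + 1)                                    # possible success counts
comb = np.array([math.comb(n, i) for i in k])           # binomial coefficients

mle = k / n                                             # sample proportion
shrunk = (k + math.sqrt(n) / 2) / (n + math.sqrt(n))    # minimax estimator (constant risk)

def risk(estimates, p):
    """Exact risk R(p, est) = E_p[(est(K) - p)^2] for K ~ Binomial(n, p)."""
    pmf = comb * p**k * (1 - p)**(n - k)
    return float(np.sum(pmf * (estimates - p) ** 2))

ps = np.linspace(0.0, 1.0, 201)                         # grid over the parameter space
worst_mle = max(risk(mle, p) for p in ps)               # sup_p p(1-p)/n = 1/(4n)
worst_shrunk = max(risk(shrunk, p) for p in ps)         # constant 1/(4(sqrt(n)+1)^2)
print(worst_mle, worst_shrunk)
```

On the worst-case criterion the shrinkage estimator wins ($\approx 0.0144$ vs $0.025$ for $n = 10$), even though the sample proportion has smaller risk for $p$ near $0$ or $1$; this is exactly the trade-off the minimax criterion resolves in favor of the worst case.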