Overview

In this edition, we study the Bayes rule under a weighted squared loss for the Normal-Normal model. We will:

  • Derive the posterior distribution for a single observation.
  • Show that the Bayes rule has a specific linear form.
  • Analyze its admissibility by comparing its risk with another rule.

Exercise Statement

Given:

  • Data: \(X \sim \mathcal{N}(\theta, 1)\) (one observation)
  • Prior: \(\theta \sim \mathcal{N}(0, 1)\)
  • Loss function:
    $$
    L(\theta, d) = \exp\left\{ \frac{3\theta^2}{4} \right\} (\theta - d)^2
    $$

Tasks:

a) Find the posterior distribution \(\theta \mid X = x\).

b) Show that the Bayes rule has the form \(\delta_{\pi}(x) = 2x\).

c) Is this Bayes rule admissible? Compare its risk with the rule \(\delta_1(x) = x\).


a) Posterior Distribution

We are in the standard Normal-Normal conjugate case:

  • Likelihood: \(X \mid \theta \sim \mathcal{N}(\theta, 1)\)
  • Prior: \(\theta \sim \mathcal{N}(0, 1)\)

Applying the standard conjugate update for the Normal-Normal model (the general formula is recalled below):

$$
\theta \mid X \sim \mathcal{N} \left( \frac{X}{2}, \frac{1}{2} \right)
$$
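
For reference, the general conjugate update behind this step: if \(X \mid \theta \sim \mathcal{N}(\theta, \sigma^2)\) and \(\theta \sim \mathcal{N}(\mu_0, \tau^2)\), then

$$
\theta \mid X = x \sim \mathcal{N}\left( \frac{\tau^2 x + \sigma^2 \mu_0}{\sigma^2 + \tau^2},\ \frac{\sigma^2 \tau^2}{\sigma^2 + \tau^2} \right)
$$

With \(\sigma^2 = \tau^2 = 1\) and \(\mu_0 = 0\), this gives mean \(X/2\) and variance \(1/2\).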


b) Deriving the Bayes Rule

We use a weighted squared loss:

$$
L(\theta, d) = w(\theta)(\theta - d)^2 \quad \text{where} \quad w(\theta) = \exp\left( \frac{3\theta^2}{4} \right)
$$

The Bayes rule under this loss is given by:

$$
\delta_\pi(X) = \frac{ \int_{\mathbb{R}} \theta \cdot w(\theta) \cdot f_{\theta \mid X}(\theta) \, d\theta }{ \int_{\mathbb{R}} w(\theta) \cdot f_{\theta \mid X}(\theta) \, d\theta }
$$
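
This is the weighted posterior mean; it follows by minimizing the posterior expected loss in \(d\). Setting the derivative to zero,

$$
\frac{\partial}{\partial d} \int_{\mathbb{R}} w(\theta)(\theta - d)^2 f_{\theta \mid X}(\theta) \, d\theta = -2 \int_{\mathbb{R}} w(\theta)(\theta - d) f_{\theta \mid X}(\theta) \, d\theta = 0,
$$

and solving for \(d\) yields the ratio above.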

Now, note that:

$$
w(\theta) \cdot f_{\theta \mid X}(\theta) = \exp\left( \frac{3\theta^2}{4} \right) \cdot \mathcal{N}\left(\theta; \frac{X}{2}, \frac{1}{2} \right)
$$

Completing the square in the exponent (details below), this product is proportional to a normal density:

$$
w(\theta) \, f_{\mathcal{N}(X/2,\, 1/2)}(\theta) \propto f_{\mathcal{N}(2X,\, 2)}(\theta)
$$
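
Indeed, comparing exponents (the \(\mathcal{N}(X/2, 1/2)\) density has exponent \(-(\theta - X/2)^2\)):

$$
\frac{3\theta^2}{4} - \left( \theta - \frac{X}{2} \right)^2 = -\frac{\theta^2}{4} + X\theta - \frac{X^2}{4} = -\frac{(\theta - 2X)^2}{4} + \frac{3X^2}{4},
$$

which, up to a term constant in \(\theta\), is the exponent \(-\frac{(\theta - 2X)^2}{2 \cdot 2}\) of a \(\mathcal{N}(2X, 2)\) density.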

Thus:

$$
\delta_\pi(X) = \frac{ \int \theta \cdot f_{\mathcal{N}(2X, 2)}(\theta) \, d\theta }{ \int f_{\mathcal{N}(2X, 2)}(\theta) \, d\theta } = \mathbb{E}_{\mathcal{N}(2X, 2)}[\theta] = 2X
$$
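
As an optional sanity check (not part of the derivation), here is a minimal numerical sketch, assuming NumPy and SciPy are available, that evaluates the weighted posterior mean by quadrature and confirms \(\delta_\pi(x) = 2x\):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def bayes_rule(x):
    """Weighted posterior mean under L(theta, d) = exp(3*theta^2/4) * (theta - d)^2."""
    post = stats.norm(loc=x / 2, scale=np.sqrt(1 / 2))  # posterior: theta | X = x
    # Combine the weight and the posterior density on the log scale to avoid overflow.
    integrand = lambda t: np.exp(3 * t**2 / 4 + post.logpdf(t))
    num, _ = quad(lambda t: t * integrand(t), -30, 30)  # tails beyond +-30 are negligible
    den, _ = quad(integrand, -30, 30)
    return num / den

for x in [-1.0, 0.5, 2.0]:
    print(f"x = {x:5.2f}  ->  delta_pi(x) = {bayes_rule(x):.4f}  (expected {2 * x:.1f})")
```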


c) Admissibility and Risk Comparison

Let us compute the risk for both rules:

Risk of \(\delta_1(x) = x\)

We have:

$$
R(\theta, \delta_1) = \mathbb{E}_{X \mid \theta} \left[ \exp\left( \frac{3\theta^2}{4} \right) (\theta - X)^2 \right]
$$

Since the weight does not depend on \(X\), and \(\mathbb{E}[(\theta - X)^2] = \operatorname{Var}(X) = 1\) for \(X \sim \mathcal{N}(\theta, 1)\), we get:

$$
R(\theta, \delta_1) = \exp\left( \frac{3\theta^2}{4} \right) \cdot \operatorname{Var}(X) = \exp\left( \frac{3\theta^2}{4} \right)
$$

Risk of \(\delta_\pi(x) = 2x\)

We compute:

$$
R(\theta, \delta_\pi) = \exp\left( \frac{3\theta^2}{4} \right) \cdot \mathbb{E}_{X \mid \theta} \left[ (\theta - 2X)^2 \right]
$$

Decompose the expectation into variance plus squared mean (under \(X \sim \mathcal{N}(\theta, 1)\), the variable \(\theta - 2X\) has mean \(-\theta\) and variance \(4\)):

$$
\begin{aligned} \mathbb{E}[(\theta - 2X)^2] &= \operatorname{Var}(\theta - 2X) + \left( \mathbb{E}[\theta - 2X] \right)^2 \\ &= 4 \operatorname{Var}(X) + (-\theta)^2 \\ &= 4 + \theta^2 \end{aligned}
$$

So:

$$
R(\theta, \delta_\pi) = \exp\left( \frac{3\theta^2}{4} \right) (4 + \theta^2)
$$
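
Again as an optional check, a short Monte Carlo sketch (assuming NumPy is available) that estimates both risk functions and matches the closed forms above:

```python
import numpy as np

rng = np.random.default_rng(0)

def risk(delta, theta, n=200_000):
    """Monte Carlo estimate of R(theta, delta) under the weighted squared loss."""
    x = rng.normal(loc=theta, scale=1.0, size=n)  # X | theta ~ N(theta, 1)
    return np.exp(3 * theta**2 / 4) * np.mean((theta - delta(x)) ** 2)

for theta in [0.0, 1.0, 2.0]:
    r1 = risk(lambda x: x, theta)      # delta_1(x) = x
    rp = risk(lambda x: 2 * x, theta)  # delta_pi(x) = 2x
    w = np.exp(3 * theta**2 / 4)
    print(f"theta = {theta}:  R1 ~ {r1:.3f} (exact {w:.3f}),  "
          f"Rpi ~ {rp:.3f} (exact {w * (4 + theta**2):.3f})")
```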

Comparison:

We compare:

$$
R(\theta, \delta_\pi) = \exp\left( \frac{3\theta^2}{4} \right) (4 + \theta^2) \quad \text{vs} \quad R(\theta, \delta_1) = \exp\left( \frac{3\theta^2}{4} \right)
$$

Since \(4 + \theta^2 \geq 4 > 1\) for every \(\theta\), clearly:

$$
R(\theta, \delta_1) < R(\theta, \delta_\pi) \quad \text{for all } \theta \in \mathbb{R}
$$

Therefore, \(\delta_\pi\) is inadmissible: the rule \(\delta_1\) has strictly smaller risk for every \(\theta\), and hence dominates it uniformly.


Summary

  • The posterior distribution under the Normal-Normal model remains normal.

  • The Bayes rule under a non-uniform (weighted) squared loss can result in non-standard estimators like \(\delta_\pi(x) = 2x\).

  • However, this rule may be inadmissible, as shown by comparing its risk to that of the usual estimator \(\delta_1(x) = x\).


Stay tuned for the next part!

We gratefully acknowledge Dr. Dany Djeudeu for preparing this course.