# User:Geomon

## Combining unbiased estimators

Let ${\displaystyle {\hat {\phi }}_{1}}$ and ${\displaystyle {\hat {\phi }}_{2}}$ be independent unbiased estimators of ${\displaystyle \phi \in \mathbb {R} ^{k}}$ with non-singular variance matrices ${\displaystyle V_{1}}$ and ${\displaystyle V_{2}}$, respectively.

Then the minimum variance linear unbiased estimator of ${\displaystyle \phi }$ is obtained by combining ${\displaystyle {{\hat {\phi }}_{1}}}$ and ${\displaystyle {{\hat {\phi }}_{2}}}$ using weights that are proportional to the inverses of their variances. The result can be expressed in a variety of ways:

${\displaystyle {\begin{aligned}{\hat {\phi }}&=\left(V_{1}^{-1}+V_{2}^{-1}\right)^{-1}\left(V_{1}^{-1}{\hat {\phi }}_{1}+V_{2}^{-1}{\hat {\phi }}_{2}\right)\\&=\left(V_{1}^{-1}+V_{2}^{-1}\right)^{-1}\left(V_{1}^{-1}{\hat {\phi }}_{1}+V_{2}^{-1}{\hat {\phi }}_{2}\right)+\left[\left(V_{1}^{-1}+V_{2}^{-1}\right)^{-1}V_{2}^{-1}{\hat {\phi }}_{1}-\left(V_{1}^{-1}+V_{2}^{-1}\right)^{-1}V_{2}^{-1}{\hat {\phi }}_{1}\right]\\&={\hat {\phi }}_{1}+\left(V_{1}^{-1}+V_{2}^{-1}\right)^{-1}V_{2}^{-1}\left({\hat {\phi }}_{2}-{\hat {\phi }}_{1}\right)\\&={\hat {\phi }}_{1}+\left(I+V_{2}V_{1}^{-1}\right)^{-1}\left({\hat {\phi }}_{2}-{\hat {\phi }}_{1}\right)\\&=\left(I+V_{1}V_{2}^{-1}\right)^{-1}\left({\hat {\phi }}_{1}+V_{1}V_{2}^{-1}{\hat {\phi }}_{2}\right)\end{aligned}}}$

The proof is an application of generalized least squares (GLS). The problem can be formulated as a GLS problem by writing

${\displaystyle \left[{\begin{matrix}{\hat {\phi }}_{1}\\{\hat {\phi }}_{2}\end{matrix}}\right]=\left[{\begin{matrix}I\\I\end{matrix}}\right]\phi +\left[{\begin{matrix}\varepsilon _{1}\\\varepsilon _{2}\end{matrix}}\right]\qquad {\mbox{with}}\qquad \operatorname {Var} \left(\left[{\begin{matrix}\varepsilon _{1}\\\varepsilon _{2}\end{matrix}}\right]\right)=\left[{\begin{matrix}V_{1}&0\\0&V_{2}\end{matrix}}\right]}$

Applying the GLS formula yields:

${\displaystyle {\begin{aligned}{\hat {\phi }}&={{\left({{\left[{\begin{matrix}I\\I\\\end{matrix}}\right]}^{\prime }}{{\left[{\begin{matrix}{{V}_{1}}&0\\0&{{V}_{2}}\\\end{matrix}}\right]}^{-1}}\left[{\begin{matrix}I\\I\\\end{matrix}}\right]\right)}^{-1}}{{\left[{\begin{matrix}I\\I\\\end{matrix}}\right]}^{\prime }}{{\left[{\begin{matrix}{{V}_{1}}&0\\0&{{V}_{2}}\\\end{matrix}}\right]}^{-1}}\left[{\begin{matrix}{{\hat {\phi }}_{1}}\\{{\hat {\phi }}_{2}}\\\end{matrix}}\right]\\&={{\left(V_{1}^{-1}+V_{2}^{-1}\right)}^{-1}}\left(V_{1}^{-1}{{\hat {\phi }}_{1}}+V_{2}^{-1}{{\hat {\phi }}_{2}}\right)\end{aligned}}}$
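As a quick numerical sanity check (with hypothetical variance matrices and estimates), all four closed forms of the combined estimator can be verified to agree:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 3

# Hypothetical positive-definite variances V1, V2 and estimates phi1, phi2.
A = rng.standard_normal((k, k)); V1 = A @ A.T + k * np.eye(k)
B = rng.standard_normal((k, k)); V2 = B @ B.T + k * np.eye(k)
phi1 = rng.standard_normal(k)   # plays the role of phi_hat_1
phi2 = rng.standard_normal(k)   # plays the role of phi_hat_2

inv = np.linalg.inv
I = np.eye(k)

# Precision-weighted combination.
f1 = inv(inv(V1) + inv(V2)) @ (inv(V1) @ phi1 + inv(V2) @ phi2)
# "Update" form: phi1 plus a weighted correction toward phi2.
f2 = phi1 + inv(inv(V1) + inv(V2)) @ inv(V2) @ (phi2 - phi1)
# Forms that avoid inverting both variances separately.
f3 = phi1 + inv(I + V2 @ inv(V1)) @ (phi2 - phi1)
f4 = inv(I + V1 @ inv(V2)) @ (phi1 + V1 @ inv(V2) @ phi2)

assert np.allclose(f1, f2) and np.allclose(f1, f3) and np.allclose(f1, f4)
```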


## Expected value of SSH

Consider one-way MANOVA with ${\displaystyle G}$ groups, each with ${\displaystyle n_{g}}$ observations. Let ${\displaystyle N=\sum _{g=1}^{G}n_{g}\!}$ and let

${\displaystyle D={\begin{bmatrix}1_{n_{1}}&\cdots &0\\\vdots &\ddots &\vdots \\0&\cdots &1_{n_{G}}\end{bmatrix}}}$

be the design matrix.

Let ${\displaystyle Q}$ be the ${\displaystyle N\times N}$ residual projection matrix defined by

${\displaystyle Q=I-1_{N}(1_{N}'1_{N})^{-1}1_{N}'=I-{\frac {1}{N}}U}$

where ${\displaystyle U=1_{N}1_{N}'}$ is the ${\displaystyle N\times N}$ matrix of ones.
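A small numerical check (with an arbitrary ${\displaystyle N}$) confirms the properties of ${\displaystyle Q}$ used below: it is a symmetric, idempotent projection that annihilates the constant vector and has trace ${\displaystyle N-1}$.

```python
import numpy as np

N = 7
one = np.ones((N, 1))
U = one @ one.T                     # N x N matrix of ones
Q = np.eye(N) - U / N               # residual (centering) projection

assert np.allclose(Q, Q.T)          # symmetric
assert np.allclose(Q @ Q, Q)        # idempotent
assert np.allclose(Q @ one, 0)      # annihilates the constant vector
assert np.isclose(np.trace(Q), N - 1)
```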

## Analyzing SSH

We can find expressions for SSH in terms of the data and derive its expected value under a fixed-effects or a random-effects model.

The following formula is used repeatedly to find the expected value of a quadratic form. If ${\displaystyle Y}$ is a random vector with ${\displaystyle \operatorname {E} (Y)=\mu }$ and ${\displaystyle \operatorname {Var} (Y)=\Psi \!}$, and ${\displaystyle Q\!}$ is symmetric, then

${\displaystyle \operatorname {E} (Y'QY)=\mu 'Q\mu +\operatorname {tr} (Q\Psi )\!}$
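As a sketch, this identity can be checked by Monte Carlo simulation with an arbitrary mean ${\displaystyle \mu }$ and covariance ${\displaystyle \Psi }$ (hypothetical values below), using the centering projection from the previous section as the symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5

# Hypothetical mean vector and positive-definite covariance for Y.
mu = rng.standard_normal(N)
A = rng.standard_normal((N, N))
Psi = A @ A.T + np.eye(N)

# A symmetric Q: the centering projection used in the text.
one = np.ones(N)
Q = np.eye(N) - np.outer(one, one) / N

# Theoretical value: mu'Q mu + tr(Q Psi).
theory = mu @ Q @ mu + np.trace(Q @ Psi)

# Simulate Y ~ N(mu, Psi) and average the quadratic form Y'QY.
L = np.linalg.cholesky(Psi)
draws = mu + rng.standard_normal((200_000, N)) @ L.T
mc = np.mean(np.einsum('ij,jk,ik->i', draws, Q, draws))

assert abs(mc - theory) / abs(theory) < 0.05
```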

We can model:

${\displaystyle \mathbf {Y} =D{\mathbf {\mu }}+\epsilon \!}$

where

${\displaystyle \mu \sim N(1\psi ,\phi ^{2}I)\!}$

and

${\displaystyle \epsilon \sim N(0,\sigma ^{2}I)\!}$

and ${\displaystyle \mu }$ is independent of ${\displaystyle \epsilon }$.

Thus

${\displaystyle \operatorname {E} (Y)=1\psi }$ and ${\displaystyle \operatorname {Var} (Y)=\phi ^{2}DD'+\sigma ^{2}I\!}$

Consequently

${\displaystyle {\begin{aligned}\operatorname {E} (SSTO)&=\operatorname {E} (Y'QY)=\psi 1'Q1\psi +\operatorname {tr} \left[(\phi ^{2}DD'+\sigma ^{2}I)\left(I-{\frac {1}{N}}U\right)\right]\\&=0+\operatorname {tr} \left[\phi ^{2}DD'-{\frac {\phi ^{2}}{N}}DD'U+\sigma ^{2}Q\right]\\&=\phi ^{2}N-{\frac {\phi ^{2}}{N}}\operatorname {tr} (DD'U)+\sigma ^{2}\operatorname {tr} (Q)\\&=\phi ^{2}N-{\frac {\phi ^{2}}{N}}1'DD'1+\sigma ^{2}(N-1)\\&=\phi ^{2}N-{\frac {\phi ^{2}}{N}}\sum _{g=1}^{G}n_{g}^{2}+\sigma ^{2}(N-1)\\&=\phi ^{2}(N-{\tilde {n}})+\sigma ^{2}(N-1)\end{aligned}}}$

where ${\displaystyle {\tilde {n}}=\sum _{g=1}^{G}{\frac {n_{g}}{N}}n_{g}}$ is the group-size weighted mean of the group sizes. With equal group sizes ${\displaystyle n_{g}=n}$, ${\displaystyle {\tilde {n}}=n=N/G}$ and

${\displaystyle E(SSTO)=\phi ^{2}N{\frac {G-1}{G}}+\sigma ^{2}(N-1)=\phi ^{2}(N-n)+\sigma ^{2}(N-1)}$

Thus

${\displaystyle {\begin{aligned}\operatorname {E} (SSH)&=\operatorname {E} (SSTO)-\operatorname {E} (SSE)\\&=\phi ^{2}(N-{\tilde {n}})+\sigma ^{2}(N-1)-\sigma ^{2}(N-G)\\&=\phi ^{2}(N-{\tilde {n}})+\sigma ^{2}(G-1)\end{aligned}}}$
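These expectations can be verified exactly (no simulation needed) by applying the quadratic-form identity to the projection matrices for SSTO, SSE, and SSH, with hypothetical group sizes and variance components:

```python
import numpy as np

# Hypothetical unequal group sizes and variance components.
n = [2, 3, 5]
phi2, sigma2, psi = 1.7, 0.6, 0.3
G, N = len(n), sum(n)

# Design matrix D: one block column of ones per group.
D = np.zeros((N, G))
start = 0
for g, ng in enumerate(n):
    D[start:start + ng, g] = 1.0
    start += ng

one = np.ones((N, 1))
U = one @ one.T
Q = np.eye(N) - U / N                       # centering projection (SSTO)
P = D @ np.linalg.inv(D.T @ D) @ D.T        # projection onto group means
H = P - U / N                               # SSH projection
E = np.eye(N) - P                           # SSE projection

mu = psi * np.ones(N)                       # E(Y) = 1 psi
V = phi2 * D @ D.T + sigma2 * np.eye(N)     # Var(Y) = phi^2 DD' + sigma^2 I

def eqf(Qm):
    """E(Y'QY) = mu'Q mu + tr(Q Var(Y))."""
    return mu @ Qm @ mu + np.trace(Qm @ V)

ntil = sum(ng * ng for ng in n) / N          # group-size weighted mean size

assert np.isclose(eqf(Q), phi2 * (N - ntil) + sigma2 * (N - 1))  # E(SSTO)
assert np.isclose(eqf(E), sigma2 * (N - G))                      # E(SSE)
assert np.isclose(eqf(H), phi2 * (N - ntil) + sigma2 * (G - 1))  # E(SSH)
```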

## Multivariate response

If we are sampling from a p-variate distribution in which

${\displaystyle \mathbf {Y} _{ig}\sim {\mbox{i.i.d.}}N(\mathbf {\mu } _{g},\Sigma )}$

and

${\displaystyle \mathbf {\mu } _{1},...,\mathbf {\mu } _{G}\sim {\mbox{ i.i.d. }}N(\mathbf {\psi } ,\Phi ),{\mbox{ independently of }}\mathbf {Y} _{ig}}$

then the analogous results are:

${\displaystyle E(SSE)=(N-G)\Sigma }$

and

${\displaystyle E(SSH)=(N-{\tilde {n}})\Phi +(G-1)\Sigma }$

Note that

${\displaystyle Var({\bar {\mathbf {Y} }}_{\cdot g})=\Phi +{\frac {1}{n_{g}}}\Sigma }$

and that the group-size weighted average of these variances is:

${\displaystyle \sum _{g=1}^{G}{\frac {n_{g}}{N}}Var({\bar {\mathbf {Y} }}_{\cdot g})=\sum _{g=1}^{G}{\frac {n_{g}}{N}}\left[\Phi +{\frac {1}{n_{g}}}\Sigma \right]=\Phi +{\frac {G}{N}}\Sigma }$
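This weighted-average identity holds for unequal group sizes as well; an exact arithmetic check with hypothetical sizes and scalar stand-ins for ${\displaystyle \Phi }$ and ${\displaystyle \Sigma }$:

```python
from fractions import Fraction as F

# Hypothetical unequal group sizes; scalar stand-ins for Phi and Sigma.
n = [F(2), F(3), F(7)]
G, N = len(n), sum(n)
Phi, Sigma = F(5), F(11)

# Group-size weighted average of Var(group mean) = Phi + Sigma / n_g.
avg = sum((ng / N) * (Phi + Sigma / ng) for ng in n)
assert avg == Phi + (F(G) / N) * Sigma
```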

The expectations of combinations of ${\displaystyle SSH}$ and ${\displaystyle SSE}$ of the form ${\displaystyle k_{H}SSH+k_{E}SSE}$ are:

| ${\displaystyle k_{H}}$ | ${\displaystyle k_{E}}$ | ${\displaystyle \operatorname {E} (k_{H}SSH+k_{E}SSE)}$ |
|---|---|---|
| 1 | 0 | ${\displaystyle (N-{\tilde {n}})\Phi +(G-1)\Sigma }$ |
| 0 | 1 | ${\displaystyle (N-G)\Sigma }$ |
| ${\displaystyle {\frac {1}{N-{\tilde {n}}}}}$ | 0 | ${\displaystyle \Phi +{\frac {G-1}{N-{\tilde {n}}}}\Sigma }$ |
| ${\displaystyle {\frac {G}{N(G-1)}}}$ | 0 | ${\displaystyle \Phi +{\frac {G}{N}}\Sigma }$, with equal groups |
| ${\displaystyle {\frac {G}{N(G-1)}}}$ | ${\displaystyle -{\frac {G}{N(N-G)}}}$ | ${\displaystyle \Phi }$, with equal groups |
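As a sketch, the last combination — an unbiased estimator of ${\displaystyle \Phi }$ with equal groups — can be checked by simulating the multivariate random-effects model above (hypothetical ${\displaystyle \Phi }$, ${\displaystyle \Sigma }$, and group structure):

```python
import numpy as np

rng = np.random.default_rng(2)
G, n, p = 4, 5, 2        # hypothetical: 4 equal groups of 5, bivariate response
N = G * n

# Hypothetical between-group and within-group covariances.
Phi = np.array([[1.0, 0.3], [0.3, 0.5]])
Sigma = np.array([[0.8, -0.2], [-0.2, 1.2]])
psi = np.zeros(p)

LP = np.linalg.cholesky(Phi)
LS = np.linalg.cholesky(Sigma)

kH = G / (N * (G - 1))
kE = -G / (N * (N - G))

reps = 4000
acc = np.zeros((p, p))
for _ in range(reps):
    mu = psi + rng.standard_normal((G, p)) @ LP.T            # group means ~ N(psi, Phi)
    Y = np.repeat(mu, n, axis=0) + rng.standard_normal((N, p)) @ LS.T
    grand = Y.mean(axis=0)
    gmeans = Y.reshape(G, n, p).mean(axis=1)
    dH = gmeans - grand
    SSH = n * dH.T @ dH                                       # between-group SSCP
    R = Y - np.repeat(gmeans, n, axis=0)
    SSE = R.T @ R                                             # within-group SSCP
    acc += kH * SSH + kE * SSE
est = acc / reps

# The Monte Carlo average should recover Phi (loose tolerance).
assert np.allclose(est, Phi, atol=0.1)
```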