Linear Mixed Model
> All models are wrong, but some are useful. (George E. P. Box)
Estimate $\psi$ via Generalized Estimating Equations (GEE)
By Theorem 4.3 of Jiang (2007): suppose that $V$ is known and that $\dot{\mu}^{\prime} V^{-1} \dot{\mu}$ is nonsingular. Then the optimal estimating function within $H$ is given by

$$G^{\ast} = \dot{\mu}^{\prime} V^{-1} (y - \mu),$$

that is, $G^{\ast} = B^{\ast}(y - \mu)$ with $B^{\ast} = \dot{\mu}^{\prime} V^{-1}$.
Here the optimality is in a similar sense to the univariate case. Define the partial order of nonnegative definite matrices by $A \geq B$ if $A - B$ is nonnegative definite. Then the optimality in Theorem 4.3 is in the sense that the estimating function $G^{\ast}$ maximizes, in this partial order, the generalized information criterion

$$\mathcal{I}(G) = \left(\mathrm{E}\,\dot{G}\right)^{\prime} \left(\mathrm{E}\,GG^{\prime}\right)^{-1} \left(\mathrm{E}\,\dot{G}\right),$$

where $\dot{G} = \partial G / \partial \psi^{\prime}$.
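As a quick check of the optimality claim, we can evaluate the criterion at $G^{\ast}$ itself. Using $\mathrm{Var}(y) = V$, we get $\mathrm{E}(G^{\ast}G^{\ast\prime}) = \dot{\mu}^{\prime}V^{-1}\mathrm{Var}(y)V^{-1}\dot{\mu} = \dot{\mu}^{\prime}V^{-1}\dot{\mu}$ and $\mathrm{E}\,\dot{G}^{\ast} = -\dot{\mu}^{\prime}V^{-1}\dot{\mu}$, so the criterion reduces to the familiar Godambe information:

```latex
% Write A = mu_dot' V^{-1} mu_dot (symmetric). Then
\mathcal{I}(G^{\ast})
  = (-A)^{\prime} A^{-1} (-A)
  = A
  = \dot{\mu}^{\prime} V^{-1} \dot{\mu}.
```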
For the longitudinal GLMM, the optimal estimating function according to Theorem 4.3 can be expressed as

$$G^{\ast} = \sum_{i=1}^{59} \dot{\mu}_{i}^{\prime} V_{i}^{-1} (y_{i} - \mu_{i}), \tag{1}$$

where

* $y_{i} = (y_{ij})_{1 \leq j \leq 4}$,
* $\mu_{i} = E(y_{i}) = (\mu_{ij})_{1 \leq j \leq 4}$,
* $V_{i} = \mathrm{Var}(y_{i})$.
However, $V_{i}$, $1 \leq i \leq 59$, are unknown in practice. Liang and Zeger (1986) proposed replacing the $V_{i}$'s with "working" covariance matrices in order to solve the equation. They showed that, under regularity conditions, the resulting GEE estimator is consistent even if the working covariance matrices misspecify the true $V_{i}$'s.
For simplicity, replace $V_{i}$ with $I_{4}$ and solve (1). That is, solve

$$G^{\ast}_{I} = \sum_{i=1}^{59} \dot{\mu}_{i}^{\prime} (y_{i} - \mu_{i}) = 0.$$
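To make this step concrete, here is a minimal toy sketch of solving a GEE with an identity working covariance. It is not the model above: it assumes a purely marginal log-linear mean $\mu_{ij} = \exp(x_{ij}^{\prime}\beta)$ with simulated data (the sizes 59 and 4 are kept only for flavor), and uses `scipy.optimize.root` for the root-finding.

```python
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(0)
n, T, p = 59, 4, 3                      # subjects, repeated measures, covariates (toy)
X = rng.normal(size=(n, T, p))
beta_true = np.array([0.5, -0.3, 0.2])

# Mean-1 gamma frailty shared within subject: the marginal mean stays
# exp(X beta), but responses from the same subject are positively correlated.
u = rng.gamma(shape=4.0, scale=0.25, size=(n, 1))
y = rng.poisson(u * np.exp(X @ beta_true))

def gee_identity(beta):
    """G*_I(beta) = sum_i mu_dot_i' (y_i - mu_i), working covariance = I_4."""
    mu = np.exp(X @ beta)               # (n, T) marginal means
    # For a log link, mu_dot_i' = X_i' diag(mu_i); sum over subjects and times.
    return np.einsum('itp,it->p', X * mu[..., None], y - mu)

sol = root(gee_identity, x0=np.zeros(p))
beta_hat = sol.x                        # should land near beta_true
```

The within-subject correlation induced by the frailty `u` is deliberately ignored by the identity working covariance, which is exactly the simplification made in the text.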
We can derive $\mu_{i}$ and $\dot{\mu}_{i}$ analytically from the GLMM specification of the conditional mean of $y_{ij}$ given the random effects $\alpha_{i}$ and $\epsilon_{ij}$.
Therefore, using the law of total expectation and the moment generating function of a normal random variable, $\mu_{ij} = \mathrm{E}\{\mathrm{E}(y_{ij} \mid \alpha_{i}, \epsilon_{ij})\}$ has a closed-form expression.
Let $x_{ijk}$ be the $k$-th component of $x_{ij}$, $1 \leq k \leq 7$. Then $\dot{\mu}_{i}$ follows by differentiating $\mu_{ij}$ with respect to each $\beta_{k}$ and the variance components.
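The moment-generating-function step can be written out in the abstract. Assuming, as the MGF argument and the conditioning notation $y_{ij} \mid (\alpha_{i}, \epsilon_{ij})$ suggest, a log-link conditional mean $\exp(x_{ij}^{\prime}\beta + \alpha_{i} + \epsilon_{ij})$ (the exact linear predictor is an assumption here, not restated from the model):

```latex
% Normal MGF: if Z ~ N(0, tau^2), then E[e^{a+Z}] = e^{a + tau^2/2}.
% The law of total expectation then gives
\mu_{ij}
  = \mathrm{E}\left\{\mathrm{E}\left(y_{ij} \mid \alpha_{i}, \epsilon_{ij}\right)\right\}
  = \mathrm{E}\left[\exp\left(x_{ij}^{\prime}\beta + \alpha_{i} + \epsilon_{ij}\right)\right]
  = \exp\left\{x_{ij}^{\prime}\beta + \tfrac{1}{2}\,\mathrm{Var}\left(\alpha_{i} + \epsilon_{ij}\right)\right\}.
```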
Now, solve $G^{\ast}_{I} = 0$ with the constraints $\sigma_{1} > 0$, $\sigma_{2} > 0$, $\rho \in [-1,1]$, and $\beta \in \textbf{R}^{7}$. This is an 11-dimensional nonlinear equation.
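One practical way to handle the positivity and interval constraints is to recast $G^{\ast}_{I}(\psi) = 0$ as a bounded nonlinear least-squares problem: minimize $\lVert G^{\ast}_{I}(\psi)\rVert^{2}$ subject to box bounds. A sketch with a made-up three-equation stand-in for the real system (the function `G` below is illustrative, not the estimating function above):

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative stand-in: 3 unknowns psi = (beta, sigma, rho) with the same
# kinds of constraints as in the text (sigma > 0, rho in [-1, 1]).
def G(psi):
    beta, sigma, rho = psi
    return np.array([
        np.exp(beta + 0.5 * sigma**2) - 2.0,   # a lognormal-mean equation (normal MGF)
        sigma - 0.7,
        rho + 0.3,
    ])

# Solve G(psi) = 0 by minimizing ||G(psi)||^2 under box bounds; the open
# constraint sigma > 0 is approximated by a small positive lower bound.
sol = least_squares(G, x0=[0.0, 1.0, 0.0],
                    bounds=([-np.inf, 1e-8, -1.0], [np.inf, np.inf, 1.0]))
beta_hat, sigma_hat, rho_hat = sol.x
```

With bounds supplied, `least_squares` uses a trust-region method that keeps every iterate feasible, so intermediate evaluations never hit invalid parameter values such as a negative $\sigma$.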
Why is GEE better for this model?

The computational difficulty of ML estimation has made approaches based on generalized estimating equations attractive. GEEs are computationally less demanding than solving the ML equations.

Also, the efficiency of likelihood-based methods may be undermined by model misspecification, which often occurs in the analysis of longitudinal data. In longitudinal studies there often exists serial correlation among the repeated measures from the same subject. Such serial correlation may not be taken into account by a GLMM. Note that, under the GLMM assumption, the $y_{ij} \mid (\alpha_{i},\epsilon_{ij})$ are conditionally independent given the random effects, which means that no additional serial correlation exists once the values of the random effects are specified. However, serial correlation may exist among the repeated responses even given the random effects. In other words, the true correlations among the data may not be adequately captured by the GLMM. GEE is applicable to cases beyond the scope of GLMMs.

We do not have to specify the covariance matrix $V$ correctly to obtain a reasonable estimate of $\psi$. A "working" covariance matrix can still yield a consistent GEE estimator.
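A small simulation sketch of this robustness claim (a toy log-linear marginal mean, not the model above): estimate $\beta$ twice, under two different and both wrong working correlations, and observe that the two estimates land near the truth.

```python
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(1)
n, T, p = 200, 4, 2
X = rng.normal(size=(n, T, p))
beta_true = np.array([0.4, -0.2])
u = rng.gamma(4.0, 0.25, size=(n, 1))        # true V_i is correlated and subject-specific
y = rng.poisson(u * np.exp(X @ beta_true))   # marginal mean exp(X beta) is correct

def gee(beta, R_inv):
    """sum_i mu_dot_i' R_inv (y_i - mu_i) for a fixed working correlation R."""
    mu = np.exp(X @ beta)
    D = X * mu[..., None]                    # mu_dot_i = diag(mu_i) X_i
    return np.einsum('itp,ts,is->p', D, R_inv, y - mu)

R_ind = np.eye(T)                                   # working independence
R_exch = 0.5 * np.eye(T) + 0.5 * np.ones((T, T))    # working exchangeable, rho = 0.5
b_ind = root(lambda b: gee(b, np.linalg.inv(R_ind)), np.zeros(p)).x
b_exch = root(lambda b: gee(b, np.linalg.inv(R_exch)), np.zeros(p)).x
# Both estimates should be close to beta_true despite the misspecified V_i's.
```

Consistency here relies only on the marginal mean being correctly specified, so the estimating function has mean zero at the true $\beta$ whatever working correlation is plugged in; the choice of working correlation affects efficiency, not consistency.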