NAG Toolbox: nag_correg_mixeff_hier_reml (g02jd)

Purpose

nag_correg_mixeff_hier_reml (g02jd) fits a multi-level linear mixed effects regression model using restricted maximum likelihood (REML). Prior to calling nag_correg_mixeff_hier_reml (g02jd) the initialization function nag_correg_mixeff_hier_init (g02jc) must be called.

Syntax

[gamma, effn, rnkx, ncov, lnlike, id, b, se, czz, cxx, cxz, ifail] = g02jd(vpr, nvpr, gamma, lb, rcomm, icomm, iopt, ropt, 'lvpr', lvpr, 'liopt', liopt, 'lropt', lropt)
[gamma, effn, rnkx, ncov, lnlike, id, b, se, czz, cxx, cxz, ifail] = nag_correg_mixeff_hier_reml(vpr, nvpr, gamma, lb, rcomm, icomm, iopt, ropt, 'lvpr', lvpr, 'liopt', liopt, 'lropt', lropt)

Description

nag_correg_mixeff_hier_reml (g02jd) fits a model of the form:
y = Xβ + Zν + ε
where y is a vector of n observations on the dependent variable,
X is a known n by p design matrix for the fixed independent variables,
β is a vector of length p of unknown fixed effects,
Z is a known n by q design matrix for the random independent variables,
ν is a vector of length q of unknown random effects,
and ε is a vector of length n of unknown random errors.
Both ν and ε are assumed to have a Gaussian distribution with expectation zero and variance/covariance matrix defined by

  Var [ ν ]  =  [ G  0 ]
      [ ε ]     [ 0  R ]

where R = σR² I, I is the n × n identity matrix and G is a diagonal matrix. It is assumed that the random variables, Z, can be subdivided into g ≤ q groups with each group being identically distributed with expectation zero and variance σi². The diagonal elements of matrix G therefore take one of the values {σi² : i = 1, 2, …, g}, depending on which group the associated random variable belongs to.
The model therefore contains three sets of unknowns: the fixed effects β, the random effects ν and a vector of g + 1 variance components γ, where γ = {σ1², σ2², …, σg-1², σg², σR²}. Rather than working directly with γ, nag_correg_mixeff_hier_reml (g02jd) uses an iterative process to estimate γ* = {σ1²/σR², σ2²/σR², …, σg-1²/σR², σg²/σR², 1}. Due to the iterative nature of the estimation a set of initial values, γ0, for γ* is required. nag_correg_mixeff_hier_reml (g02jd) allows these initial values either to be supplied by you or calculated from the data using the minimum variance quadratic unbiased estimators (MIVQUE0) suggested by Rao (1972).
nag_correg_mixeff_hier_reml (g02jd) fits the model by maximizing the restricted log-likelihood function:
  −2 lR = log(|V|) + (n − p) log(rᵀV⁻¹r) + log(|XᵀV⁻¹X|) + (n − p)(1 + log(2π/(n − p)))
where
  V = Z G Zᵀ + R,   r = y − Xb   and   b = (XᵀV⁻¹X)⁻¹ XᵀV⁻¹ y.
Once the final estimates for γ* have been obtained, the value of σR² is given by
  σR² = (rᵀV⁻¹r) / (n − p).
Case weights, Wc, can be incorporated into the model by replacing XᵀX and ZᵀZ with XᵀWcX and ZᵀWcZ respectively, for a diagonal weight matrix Wc.
The log-likelihood, lR, is calculated using the sweep algorithm detailed in Wolfinger et al. (1994).
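For reference, the restricted log-likelihood and the estimate b can be evaluated directly from the matrix formulae above. The following NumPy sketch is illustrative only: it is not how g02jd computes lR (which uses the sweep algorithm on the γ* parametrization), and the dense inverses shown here would be far too costly for realistic problem sizes. It assumes G is supplied as a q by q diagonal matrix and R = σR² I.

```python
import numpy as np

def neg2_restricted_loglik(y, X, Z, G, sigma_R2):
    """Evaluate -2*l_R from the matrix formulae in the Description,
    returning it together with the GLS estimate b.
    Dense-matrix illustration only, not the NAG implementation."""
    n, p = X.shape
    V = Z @ G @ Z.T + sigma_R2 * np.eye(n)        # V = Z G Z^T + R,  R = sigma_R^2 I
    Vinv = np.linalg.inv(V)
    XtVinvX = X.T @ Vinv @ X
    b = np.linalg.solve(XtVinvX, X.T @ Vinv @ y)  # b = (X^T V^-1 X)^-1 X^T V^-1 y
    r = y - X @ b                                 # r = y - Xb
    _, logdetV = np.linalg.slogdet(V)
    _, logdetXVX = np.linalg.slogdet(XtVinvX)
    val = (logdetV + (n - p) * np.log(r @ Vinv @ r) + logdetXVX
           + (n - p) * (1.0 + np.log(2.0 * np.pi / (n - p))))
    return val, b
```

Note that when G = 0 we have V = σR² I and b reduces to the ordinary least squares estimate.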

References

Goodnight J H (1979) A tutorial on the SWEEP operator The American Statistician 33(3) 149–158
Harville D A (1977) Maximum likelihood approaches to variance component estimation and to related problems J. Am. Stat. Assoc. 72 320–340
Rao C R (1972) Estimation of variance and covariance components in a linear model J. Am. Stat. Assoc. 67 112–115
Stroup W W (1989) Predictable functions and prediction space in the mixed model procedure Applications of Mixed Models in Agriculture and Related Disciplines Southern Cooperative Series Bulletin No. 343 39–48
Wolfinger R, Tobias R and Sall J (1994) Computing Gaussian likelihoods and their derivatives for general linear mixed models SIAM Sci. Statist. Comput. 15 1294–1310

Parameters

Note: prior to calling nag_correg_mixeff_hier_reml (g02jd) the initialization function nag_correg_mixeff_hier_init (g02jc) must be called, therefore this documentation should be read in conjunction with the document for nag_correg_mixeff_hier_init (g02jc).
In particular some parameter names and conventions described in that document are also relevant here, but their definition has not been repeated. Specifically, rndm, weight, n, nff, nrf, nlsv, levels, fixed, dat, licomm and lrcomm should be interpreted identically in both functions.

Compulsory Input Parameters

1:     vpr(lvpr) – int64/int32/nag_int array
lvpr, the dimension of the array, must satisfy the constraint lvpr ≥ ∑i (RNDM(1,i) + RNDM(2,i)).
A vector of flags indicating the mapping between the random variables specified in rndm and the variance components, σi². See Section [Further Comments] for more details.
Constraint: 1 ≤ vpr(i) ≤ nvpr, for i = 1, 2, …, lvpr.
2:     nvpr – int64/int32/nag_int scalar
g, the number of variance components being estimated (excluding the overall variance, σR²).
Constraint: 1 ≤ nvpr ≤ lvpr.
3:     gamma(nvpr + 1) – double array
Holds the initial values of the variance components, γ0, with gamma(i) the initial value for σi²/σR², for i = 1, 2, …, nvpr.
If gamma(1) = −1.0, the remaining elements of gamma are ignored and the initial values for the variance components are estimated from the data using MIVQUE0.
Constraint: gamma(1) = −1.0 or gamma(i) ≥ 0.0, for i = 1, 2, …, g.
4:     lb – int64/int32/nag_int scalar
The dimension of the arrays b and se and the second dimension of the array id as declared in the (sub)program from which nag_correg_mixeff_hier_reml (g02jd) is called.
Constraint: lb ≥ nff + nrf × nlsv.
5:     rcomm(:) – double array
Note: the dimension of the array rcomm must be at least lrcomm (see nag_correg_mixeff_hier_init (g02jc)).
Communication array initialized by a call to nag_correg_mixeff_hier_init (g02jc).
6:     icomm(:) – int64/int32/nag_int array
Note: the dimension of the array icomm must be at least licomm (see nag_correg_mixeff_hier_init (g02jc)).
Communication array initialized by a call to nag_correg_mixeff_hier_init (g02jc).
7:     iopt(liopt) – int64/int32/nag_int array
Optional parameters passed to the optimization function.
By default nag_correg_mixeff_hier_reml (g02jd) fits the specified model using a modified Newton optimization algorithm as implemented in nag_opt_bounds_mod_deriv2_comp (e04lb). In some cases, where the calculation of the derivatives is computationally expensive, it may be more efficient to use a sequential QP algorithm. The sequential QP algorithm as implemented in nag_opt_nlp1_solve (e04uc) can be chosen by setting iopt(5) = 1. If liopt < 5 or iopt(5) ≠ 1 then nag_opt_bounds_mod_deriv2_comp (e04lb) will be used.
Different optional parameters are available depending on the optimization function used. In all cases, using a value of −1 will cause the default value to be used. In addition only the first liopt values of iopt are used, so, for example, if only the first element of iopt needs changing and default values for all other optional parameters are sufficient, liopt can be set to 1.
If nag_opt_bounds_mod_deriv2_comp (e04lb) is being used:

  i   Description                                        Equivalent e04lb parameter   Default Value
  1   Number of iterations                               maxcal                       1000
  2   Unit number for monitoring information             n/a                          As returned by nag_file_set_unit_advisory (x04ab)
  3   Print optional parameters (1 = print)              n/a                          −1 (no printing performed)
  4   Frequency that monitoring information is printed   iprint                       −1
  5   Optimizer used                                     n/a                          n/a

If requested, monitoring information is displayed in a similar format to that given by nag_opt_bounds_mod_deriv2_comp (e04lb).
If nag_opt_nlp1_solve (e04uc) is being used:

  i   Description                                                   Equivalent e04uc parameter   Default Value
  1   Number of iterations                                          Major Iteration Limit        max(50, 3 × nvpr)
  2   Unit number for monitoring information                        n/a                          As returned by nag_file_set_unit_advisory (x04ab)
  3   Print optional parameters (1 = print, otherwise no print)     List/Nolist                  −1 (no printing performed)
  4   Frequency that monitoring information is printed              Major Print Level            0
  5   Optimizer used                                                n/a                          n/a
  6   Number of minor iterations                                    Minor Iteration Limit        max(50, 3 × nvpr)
  7   Frequency that additional monitoring information is printed   Minor Print Level            0
8:     ropt(lropt) – double array
Optional parameters passed to the optimization function.
Different optional parameters are available depending on the optimization function used. In all cases, using a value of −1.0 will cause the default value to be used. In addition only the first lropt values of ropt are used, so, for example, if only the first element of ropt needs changing and default values for all other optional parameters are sufficient, lropt can be set to 1.
If nag_opt_bounds_mod_deriv2_comp (e04lb) is being used:

  i   Description                              Equivalent e04lb parameter   Default Value
  1   Sweep tolerance                          n/a                          max(sqrt(eps), sqrt(eps) × maxi(zzii))
  2   Accuracy of linear minimizations         eta                          0.9
  3   Accuracy to which solution is required   xtol                         0.0
  4   Initial distance from solution           stepmx                       100000.0
If nag_opt_nlp1_solve (e04uc) is being used:

  i   Description             Equivalent e04uc parameter   Default Value
  1   Sweep tolerance         n/a                          max(sqrt(eps), sqrt(eps) × maxi(zzii))
  2   Lower bound for γ*      n/a                          eps/100
  3   Upper bound for γ*      n/a                          10^20
  4   Line search tolerance   Line Search Tolerance        0.9
  5   Optimality tolerance    Optimality Tolerance         eps^0.72

where eps is the machine precision returned by nag_machine_precision (x02aj) and zzii denotes the ith diagonal element of ZᵀZ.

Optional Input Parameters

1:     lvpr – int64/int32/nag_int scalar
Default: The dimension of the array vpr.
The sum of the number of random parameters and the random intercept flags specified in the call to nag_correg_mixeff_hier_init (g02jc).
Constraint: lvpr ≥ ∑i (RNDM(1,i) + RNDM(2,i)).
2:     liopt – int64/int32/nag_int scalar
Default: The dimension of the array iopt.
Length of the options array iopt. If liopt ≤ 0 then iopt is not referenced and default values are used for all optional parameters.
3:     lropt – int64/int32/nag_int scalar
Default: The dimension of the array ropt.
Length of the options array ropt. If lropt ≤ 0 then ropt is not referenced and default values are used for all optional parameters.

Input Parameters Omitted from the MATLAB Interface

ldid ldczz ldcxx ldcxz

Output Parameters

1:     gamma(nvpr + 1) – double array
gamma(i), for i = 1, 2, …, nvpr, holds the final estimate of σi² and gamma(nvpr + 1) holds the final estimate for σR².
2:     effn – int64/int32/nag_int scalar
Effective number of observations. If there are no weights (i.e., weight = 'U'), or all weights are nonzero, then effn = n.
3:     rnkx – int64/int32/nag_int scalar
The rank of the design matrix, X, for the fixed effects.
4:     ncov – int64/int32/nag_int scalar
Number of variance components not estimated to be zero. If none of the variance components are estimated to be zero, then ncov = nvpr.
5:     lnlike – double scalar
−2 lR(γ̂), where lR is the log of the restricted maximum likelihood calculated at γ̂, the estimated variance components returned in gamma.
6:     id(ldid,lb) – int64/int32/nag_int array
ldid = icomm(5).
An array describing the parameter estimates returned in b. The first nlsv × nrf columns of id describe the parameter estimates for the random effects and the last nff columns the parameter estimates for the fixed effects.
A print function for decoding the parameter estimates given in b using information from id is supplied with the example program for this function.
For fixed effects:
  • for l = nrf × nlsv + 1, …, nrf × nlsv + nff:
    • if b(l) contains the parameter estimate for the intercept then
      id(1,l) = id(2,l) = id(3,l) = 0;
    • if b(l) contains the parameter estimate for the ith level of the jth fixed variable, that is the vector of values held in the kth column of dat when fixed(j + 2) = k, then
      id(1,l) = 0,  id(2,l) = j,  id(3,l) = i;
    • if the jth variable is continuous or binary, that is levels(fixed(j + 2)) = 1, then id(3,l) = 0;
    • any remaining rows of the lth column of id are set to 0.
For random effects:
  • let
    • NRb denote the number of random variables in the bth random statement, that is NRb = RNDM(1,b);
    • Rjb denote the jth random variable from the bth random statement, that is the vector of values held in the kth column of dat when RNDM(2 + j,b) = k;
    • NSb denote the number of subject variables in the bth random statement, that is NSb = RNDM(3 + NRb,b);
    • Sjb denote the jth subject variable from the bth random statement, that is the vector of values held in the kth column of dat when RNDM(3 + NRb + j,b) = k;
    • L(Sjb) denote the number of levels for Sjb, that is L(Sjb) = levels(RNDM(3 + NRb + j,b));
  • then
    • for l = 1, 2, …, nrf × nlsv, if b(l) contains the parameter estimate for the ith level of Rjb when Skb = sk, for k = 1, 2, …, NSb and 1 ≤ sk ≤ L(Sjb), i.e., sk is a valid value for the kth subject variable, then
      id(1,l) = b,  id(2,l) = j,  id(3,l) = i,  id(3 + k,l) = sk, for k = 1, 2, …, NSb;
    • if the parameter being estimated is for the intercept then id(2,l) = id(3,l) = 0;
    • if the jth variable is continuous, or binary, that is L(Sjb) = 1, then id(3,l) = 0;
    • the remaining rows of the lth column of id are set to 0.
In some situations, certain combinations of variables are never observed. In such circumstances all elements of the lth column of id are set to −999.
7:     b(lb) – double array
The parameter estimates, with the first nrf × nlsv elements of b containing the parameter estimates for the random effects, ν, and the remaining nff elements containing the parameter estimates for the fixed effects, β. The order of these estimates is described by the id parameter.
8:     se(lb) – double array
The standard errors of the parameter estimates given in b.
9:     czz(ldczz,:) – double array
The first dimension of the array czz will be nrf.
The second dimension of the array will be nrf × nlsv.
ldczz ≥ nrf.
If nlsv = 1, then czz holds the lower triangular portion of the matrix (1/σ²)(Zᵀ R̂⁻¹ Z + Ĝ⁻¹), where R̂ and Ĝ are the estimates of R and G respectively. If nlsv > 1 then czz holds this matrix in compressed form, with the first nrf columns holding the part of the matrix corresponding to the first level of the overall subject variable, the next nrf columns the part corresponding to the second level of the overall subject variable, etc.
10:   cxx(ldcxx,:) – double array
The first dimension of the array cxx will be nff.
The second dimension of the array will be nff.
ldcxx ≥ nff.
cxx holds the lower triangular portion of the matrix (1/σ²) Xᵀ V̂⁻¹ X, where V̂ is the estimated value of V.
11:   cxz(ldcxz,:) – double array
The first dimension of the array cxz will be nff.
The second dimension of the array will be nlsv × nrf.
ldcxz ≥ nff.
If nlsv = 1, then cxz holds the matrix (1/σ²)(Xᵀ V̂⁻¹ Z) Ĝ, where V̂ and Ĝ are the estimates of V and G respectively. If nlsv > 1 then cxz holds this matrix in compressed form, with the first nrf columns holding the part of the matrix corresponding to the first level of the overall subject variable, the next nrf columns the part corresponding to the second level of the overall subject variable, etc.
12:   ifail – int64/int32/nag_int scalar
ifail = 0 unless the function detects an error (see [Error Indicators and Warnings]).

Error Indicators and Warnings

Errors or warnings detected by the function:

Cases prefixed with W are classified as warnings and do not generate an error of type NAG:error_n. See nag_issue_warnings.

  ifail = 1
lvpr is too small.
  ifail = 2
Constraint: 1 ≤ vpr(i) ≤ nvpr.
  ifail = 3
Constraint: 1 ≤ nvpr ≤ lvpr.
  ifail = 4
Constraint: gamma(i) ≥ 0.0.
  ifail = 9
lb is too small.
  ifail = 11
ldid is too small.
  ifail = 15
ldczz is too small.
  ifail = 17
ldcxx is too small.
  ifail = 19
ldcxz is too small.
  ifail = 21
On entry, icomm has not been initialized correctly.
  ifail = 32
On entry, at least one value of i, for i = 1, 2, …, nvpr, does not appear in vpr.
W ifail = 101
Optimal solution found, but requested accuracy not achieved.
  ifail = 102
Too many major iterations.
W ifail = 103
Current point cannot be improved upon.
W ifail = 104
At least one negative estimate for gamma was obtained. All negative estimates have been set to zero.
  ifail = −999
Dynamic memory allocation failed.

Accuracy

Not applicable.

Further Comments

The parameter vpr gives the mapping between the random variables and the variance components. In most cases vpr(i) = i, for i = 1, 2, …, ∑j (RNDM(1,j) + RNDM(2,j)). However, in some cases it might be necessary to associate more than one random variable with a single variance component, for example, when the columns of dat hold dummy variables.
Consider a dataset with three variables:

  DAT = [ 1  1  3.6
          2  1  4.5
          3  1  1.1
          1  2  8.3
          2  2  7.2
          3  2  6.1 ]

where the first column corresponds to a categorical variable with three levels, the next to a categorical variable with two levels and the last column to a continuous variable. So in a call to nag_correg_mixeff_hier_init (g02jc)

  levels = (3  2  1).

Also assume a model with no fixed effects, no random intercept, no nesting and all three variables being included as random effects, then

  fixed = (0  0);   RNDM = (3  0  1  2  3)ᵀ.

Each of the three columns in dat therefore corresponds to a single variable and hence there are three variance components, one for each random variable included in the model, so

  vpr = (1  2  3).
This is the recommended way of supplying the data to nag_correg_mixeff_hier_reml (g02jd); however, it is possible to reformat the above dataset by replacing each of the categorical variables with a series of dummy variables, one for each level. The dataset then becomes

  DAT = [ 1  0  0  1  0  3.6
          0  1  0  1  0  4.5
          0  0  1  1  0  1.1
          1  0  0  0  1  8.3
          0  1  0  0  1  7.2
          0  0  1  0  1  6.1 ]

where each column only has one level:

  levels = (1  1  1  1  1  1).

Again a model with no fixed effects, no random intercept, no nesting and all variables being included as random effects is required, so

  fixed = (0  0);   RNDM = (6  0  1  2  3  4  5  6)ᵀ.

With the data entered in this manner, the first three columns of dat correspond to a single variable (the first column of the original dataset) as do the next two columns (the second column of the original dataset). Therefore vpr must reflect this:

  vpr = (1  1  1  2  2  3).
In most situations it is more efficient to supply the data to nag_correg_mixeff_hier_init (g02jc) in terms of categorical variables rather than transform them into dummy variables.
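The dummy-variable expansion described above, and the matching vpr, can be sketched in a few lines. The following Python helper is purely illustrative (it is not part of the NAG interface) and assumes each original variable maps to exactly one variance component, as in the worked example:

```python
import numpy as np

def expand_dummies(dat, levels):
    """Replace each categorical column (levels > 1) of dat by one 0/1
    dummy column per level; continuous/binary columns (levels == 1)
    are kept as-is.  Returns the expanded data matrix and the vpr
    mapping that assigns all dummy columns of one original variable
    to a single variance component.  Illustrative sketch only."""
    cols, vpr = [], []
    for j, nlev in enumerate(levels):
        if nlev == 1:                       # continuous/binary: keep column
            cols.append(dat[:, j])
            vpr.append(j + 1)
        else:                               # categorical: one dummy per level
            for lev in range(1, nlev + 1):
                cols.append((dat[:, j] == lev).astype(float))
                vpr.append(j + 1)
    return np.column_stack(cols), vpr
```

For the dataset above with levels = (3, 2, 1) this reproduces the expanded DAT matrix and vpr = (1, 1, 1, 2, 2, 3).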

Example

function nag_correg_mixeff_hier_reml_example
% Problem size
n = 90;
ncol = 12;
nrndm = 3;
nvpr = int64(7);

% The number of levels associated with each of the independent variables
levels = [int64(2); 3; 2; 3; 2; 3; 1; 4; 5; 2; 3; 3];

% The Fixed part of the model
fixed = [int64(2); 1; 1; 2];

% The Random part of the model
rndm = [int64(2), 2,  3;
              0,  0,  0;
              3,  5,  7;
              4,  6,  8;
              3,  2,  9;
             10, 11,  1;
             11, 12, 12;
             12,  0,  0];

% Dependent data
y = [3.1100; 2.8226; 7.4543; 4.4313; 6.1543; -0.1783; 4.6748; 7.0667;
     1.4262; 7.7290; -2.1806; 6.8419; 1.2590; 8.8405; 6.1657; -4.5605;
    -1.2367; -12.2932; -2.3374; 0.0716; 0.1895; 1.5608; -0.8529; -4.1169;
     3.9977; -8.1277; -4.9656; -0.6428; -5.5152; -5.5657; 14.8177; 16.9783;
    13.8966; 14.8166; 19.3640; 9.5299; 12.0102; 6.1551; -1.7048; 2.7640;
     2.8065; 0.0974; -7.8080; -18.0450; -2.8199; 8.9893; 3.7978; -6.3493;
     8.1411; -7.5483; -0.4600; -3.2135; -6.6562; 5.1267; 3.5592; -4.4420;
    -8.5965; -6.3187; -7.8953; -10.1383; -7.8850; 23.2001; 5.5829; -4.3698;
     2.1274; -2.7184; -17.9128; -1.2708; -24.2735; -14.7374; 0.1713; 8.0006;
     1.2100; 3.3307; -22.6713; 7.5562; -7.0694; 3.7159; -4.3135; -14.5577;
   -12.5107; 4.7708; 13.2797; -6.3243; -7.0549; -9.2713; -18.7788; -7.7230;
   -22.7230; -11.6609];

% Independent data
dat = [1, 3, 2, 1, 2, 2,-0.3160, 4, 2, 1, 1, 1;
       1, 1, 1, 3, 1, 2,-1.3377, 1, 4, 1, 1, 1;
       1, 3, 1, 3, 1, 3,-0.7610, 4, 2, 1, 1, 1;
       2, 3, 2, 1, 1, 3,-2.2976, 4, 2, 1, 1, 1;
       2, 2, 1, 3, 2, 3,-0.4263, 2, 1, 1, 1, 1;
       2, 1, 2, 3, 1, 3, 1.4067, 3, 3, 2, 1, 1;
       2, 3, 2, 1, 2, 1,-1.4669, 1, 2, 2, 1, 1;
       1, 1, 1, 3, 2, 3, 0.4717, 2, 4, 2, 1, 1;
       1, 3, 2, 3, 2, 1, 0.4436, 1, 3, 2, 1, 1;
       1, 1, 1, 2, 2, 3,-0.5950, 3, 4, 2, 1, 1;
       1, 3, 1, 3, 1, 1,-1.7981, 4, 2, 1, 2, 1;
       2, 3, 1, 2, 1, 1, 0.2397, 1, 4, 1, 2, 1;
       1, 2, 2, 1, 2, 3, 0.4742, 1, 1, 1, 2, 1;
       2, 2, 2, 2, 2, 3, 0.6888, 3, 1, 1, 2, 1;
       2, 1, 2, 3, 1, 3,-1.0616, 3, 5, 1, 2, 1;
       1, 2, 2, 2, 2, 1,-0.5356, 1, 3, 2, 2, 1;
       1, 3, 2, 2, 1, 1,-1.2963, 2, 5, 2, 2, 1;
       1, 2, 2, 1, 2, 2,-1.5389, 3, 2, 2, 2, 1;
       2, 3, 1, 1, 2, 2,-0.6408, 2, 1, 2, 2, 1;
       1, 2, 2, 2, 1, 1, 0.6574, 1, 1, 2, 2, 1;
       2, 1, 1, 1, 1, 3, 0.9259, 1, 2, 1, 3, 1;
       2, 2, 2, 1, 2, 2, 1.5080, 3, 1, 1, 3, 1;
       2, 3, 1, 1, 1, 3, 2.5821, 2, 3, 1, 3, 1;
       1, 2, 2, 1, 2, 3, 0.4102, 1, 4, 1, 3, 1;
       2, 1, 2, 3, 2, 2, 0.7839, 2, 5, 1, 3, 1;
       1, 2, 2, 3, 2, 1,-1.8812, 4, 2, 2, 3, 1;
       1, 2, 1, 3, 2, 3, 0.7770, 4, 1, 2, 3, 1;
       2, 2, 1, 2, 1, 3, 0.2590, 3, 1, 2, 3, 1;
       2, 3, 2, 2, 2, 3,-0.9250, 3, 3, 2, 3, 1;
       2, 2, 1, 3, 2, 3,-0.4831, 1, 5, 2, 3, 1;
       2, 2, 1, 3, 1, 3, 0.5046, 3, 3, 1, 1, 2;
       2, 1, 1, 2, 2, 1,-0.6903, 2, 1, 1, 1, 2;
       1, 3, 2, 2, 2, 1, 1.6166, 2, 5, 1, 1, 2;
       2, 2, 2, 2, 1, 3, 0.2778, 2, 3, 1, 1, 2;
       2, 3, 2, 2, 1, 2, 1.9586, 4, 2, 1, 1, 2;
       1, 3, 1, 1, 1, 3, 1.0506, 2, 5, 2, 1, 2;
       2, 1, 1, 3, 2, 3, 0.4871, 1, 1, 2, 1, 2;
       2, 1, 2, 3, 2, 1, 2.0891, 4, 4, 2, 1, 2;
       1, 2, 1, 1, 2, 2, 1.4338, 4, 3, 2, 1, 2;
       1, 1, 2, 3, 1, 2,-1.1196, 3, 4, 2, 1, 2;
       1, 3, 1, 1, 2, 1, 0.3367, 3, 2, 1, 2, 2;
       2, 2, 1, 3, 1, 1, 0.1092, 2, 2, 1, 2, 2;
       1, 1, 1, 2, 2, 2, 0.4007, 4, 1, 1, 2, 2;
       2, 3, 1, 1, 1, 2, 0.1460, 3, 5, 1, 2, 2;
       2, 1, 2, 3, 1, 3,-0.3877, 3, 4, 1, 2, 2;
       1, 1, 1, 2, 2, 1, 0.6957, 4, 3, 2, 2, 2;
       2, 1, 1, 1, 2, 1,-0.4664, 3, 3, 2, 2, 2;
       1, 1, 1, 1, 2, 3, 0.2067, 2, 4, 2, 2, 2;
       2, 1, 2, 1, 1, 2, 0.4112, 1, 4, 2, 2, 2;
       2, 2, 1, 1, 1, 2,-1.3734, 3, 3, 2, 2, 2;
       2, 1, 2, 3, 1, 3, 0.7065, 1, 3, 1, 3, 2;
       1, 2, 2, 2, 1, 2, 1.3628, 4, 2, 1, 3, 2;
       2, 1, 2, 2, 2, 3,-0.5052, 4, 5, 1, 3, 2;
       2, 1, 1, 1, 2, 1,-1.3457, 2, 5, 1, 3, 2;
       1, 1, 2, 1, 2, 3,-1.8022, 3, 4, 1, 3, 2;
       2, 3, 1, 2, 1, 1, 0.0116, 2, 4, 2, 3, 2;
       2, 2, 1, 3, 2, 3,-0.9075, 1, 3, 2, 3, 2;
       2, 2, 2, 2, 2, 3,-1.4707, 1, 1, 2, 3, 2;
       2, 2, 1, 1, 2, 1,-1.2938, 2, 3, 2, 3, 2;
       1, 3, 1, 3, 2, 2,-1.1660, 4, 4, 2, 3, 2;
       1, 2, 1, 1, 2, 3, 0.0397, 4, 4, 1, 1, 3;
       1, 3, 1, 2, 1, 3,-0.5987, 3, 2, 1, 1, 3;
       2, 3, 2, 2, 1, 1, 0.6683, 3, 3, 1, 1, 3;
       2, 2, 1, 1, 2, 2,-0.0106, 1, 3, 1, 1, 3;
       1, 2, 1, 3, 2, 2, 0.5885, 1, 3, 1, 1, 3;
       1, 1, 1, 1, 1, 2, 0.4555, 1, 5, 2, 1, 3;
       2, 2, 2, 1, 1, 2, 0.6502, 4, 3, 2, 1, 3;
       1, 1, 1, 3, 1, 1,-0.1601, 1, 3, 2, 1, 3;
       2, 2, 1, 3, 2, 3, 1.6910, 1, 1, 2, 1, 3;
       2, 2, 2, 3, 1, 2, 0.1053, 4, 4, 2, 1, 3;
       2, 1, 2, 3, 2, 2,-0.4037, 3, 4, 1, 2, 3;
       1, 3, 2, 3, 1, 3,-0.5853, 3, 2, 1, 2, 3;
       2, 3, 2, 1, 1, 1,-0.3037, 1, 3, 1, 2, 3;
       1, 3, 1, 1, 2, 2,-0.0774, 1, 4, 1, 2, 3;
       2, 3, 1, 2, 2, 1, 0.4733, 4, 5, 1, 2, 3;
       1, 3, 2, 2, 1, 2,-0.0354, 4, 2, 2, 2, 3;
       1, 3, 2, 2, 1, 1,-0.6640, 2, 1, 2, 2, 3;
       2, 3, 1, 3, 1, 1, 0.0335, 4, 4, 2, 2, 3;
       1, 2, 2, 2, 1, 3, 0.1351, 1, 1, 2, 2, 3;
       1, 1, 2, 1, 2, 3,-0.5951, 3, 4, 2, 2, 3;
       2, 2, 2, 3, 1, 3, 0.2735, 3, 2, 1, 3, 3;
       2, 2, 1, 1, 1, 3, 0.3157, 1, 2, 1, 3, 3;
       2, 2, 2, 1, 1, 1,-1.0843, 2, 3, 1, 3, 3;
       1, 2, 2, 1, 2, 2,-0.0836, 4, 2, 1, 3, 3;
       2, 1, 2, 1, 1, 2,-0.2884, 2, 1, 1, 3, 3;
       2, 3, 2, 3, 2, 3,-0.1006, 1, 2, 2, 3, 3;
       1, 3, 1, 2, 2, 3, 0.5710, 1, 3, 2, 3, 3;
       1, 1, 2, 1, 1, 2, 0.2776, 2, 3, 2, 3, 3;
       2, 3, 2, 2, 1, 3,-0.7561, 4, 4, 2, 3, 3;
       1, 2, 2, 2, 1, 2, 1.5549, 1, 4, 2, 3, 3];

vpr = [int64(1):7];
gamma = zeros(8, 1);
gamma(1) = -1; % Estimate initial values for the variance components

% Call the initialisation routine once to get lrcomm and licomm
lrcomm = int64(0);
licomm = int64(2);
[nff, nlsv, nrf, rcomm, icomm, ifail] = ...
  nag_correg_mixeff_hier_init(dat, levels, y, fixed, rndm, lrcomm, licomm);
licomm = icomm(1);
lrcomm = icomm(2);

% Pre-process the data
[nff, nlsv, nrf, rcomm, icomm, ifail] = ...
  nag_correg_mixeff_hier_init(dat, levels, y, fixed, rndm, lrcomm, licomm);

% Use default options
iopt = zeros(0, 0, 'int64');
ropt = zeros(0, 0);

lb = nff + nrf*nlsv;

% Perform the analysis
[gamma, effn, rnkx, ncov, lnlike, id, b, se, czz, cxx, cxz, ifail] = ...
    nag_correg_mixeff_hier_reml(vpr, nvpr, gamma, lb, rcomm, icomm, iopt, ropt);

% Print results
fprintf('\nNumber of observations (N)                    = %d\n', n);
fprintf('Number of random factors (NRF)                = %d\n', nrf);
fprintf('Number of fixed factors (NFF)                 = %d\n', nff);
fprintf('Number of subject levels (NLSV)               = %d\n', nlsv);
fprintf('Rank of X (RNKX)                              = %d\n', rnkx);
fprintf('Effective N (EFFN)                            = %d\n', effn);
fprintf('Number of non-zero variance components (NCOV) = %d\n', ncov);

fprintf('\nParameter Estimates\n');

tdid = double(nff + nrf*nlsv);

if nrf > 0
  fprintf('\nRandom Effects\n');
end
pb = -999;
pfmt = ' ';
for k = 1:double(nrf*nlsv)
  tb = id(1,k);
  if tb ~= -999
    vid = id(2,k);
    nv = rndm(1,tb);
    ns = rndm(3+nv,tb);
    tfmt = sprintf('%d ', id(3+1:3+double(ns),k));
    if ( (pb ~= tb) || (strcmp(tfmt, pfmt) == 0) )
      if (k ~= 1)
        fprintf('\n');
      end
      fprintf('  Subject: ');
      for l=1:double(ns)
        fprintf(' Variable %2d (Level %2d)',rndm(3+nv+l,tb), id(3+l,k));
      end
      fprintf('\n');
    end
    if (vid==0)
      % Intercept
      fprintf('    Intercept                  %10.4f %10.4f\n', b(k), se(k));
    else
      % vid'th variable specified in rndm
      aid = rndm(2+vid,tb);
      if (id(3,k)==0)
        fprintf('    Variable %2d                %10.4f %10.4f\n', aid, b(k), se(k));
      else
        fprintf('    Variable %2d (Level %2d)     %10.4f %10.4f\n', aid, id(3,k), b(k), se(k));
      end
    end
    pfmt = tfmt;
  end
  pb = tb;
end

if nff>0
  fprintf('\nFixed Effects\n');
end
for k = double(nrf*nlsv + 1):tdid
  if id(1,k)~=-999
    vid = id(2,k);
    if vid==0
      % Intercept
      fprintf('    Intercept                  %10.4f %10.4f\n', b(k), se(k));
    else
      % vid'th variable specified in fixed
      aid = fixed(2+vid);
      if (id(3,k)==0)
        fprintf('    Variable %2d                %10.4f %10.4f\n', aid, b(k), se(k));
      else
        fprintf('    Variable %2d (Level %2d)     %10.4f %10.4f\n', aid, id(3,k), b(k), se(k));
      end
    end
  end
end

fprintf('\nVariance Components\n');
fprintf('  Estimate     Parameter       Subject\n');
for k = 1:double(nvpr)
  fprintf('%10.5f     ', gamma(k));
  p = 0;
  for tb = 1:double(nrndm)
    nv = double(rndm(1,tb));
    ns = double(rndm(3+nv,tb));
    if (rndm(2,tb)==1)
      p = p + 1;
      if (vpr(p)==k)
        fprintf('Intercept       Variables ');
        fprintf('%2d ', rndm(3+nv+1:3+nv+ns, tb));
        fprintf('\n');
      end
    end
    for i = 1:nv
      p = p + 1;
      if (vpr(p)==k)
       fprintf('Variable %2d     Variables ', rndm(2+i,tb));
       fprintf('%2d ', rndm(3+nv+1:3+nv+ns, tb));
      end
    end
  end
  fprintf('\n');
end

fprintf('\nsigma^2          = %15.5f\n', gamma(nvpr+1));
fprintf('-2log likelihood = %15.5f\n', lnlike);
 

Number of observations (N)                    = 90
function g02jd_example
% Problem size
n = 90;
ncol = 12;
nrndm = 3;
nvpr = int64(7);

% The number of levels associated with each of the independent variables
levels = [int64(2); 3; 2; 3; 2; 3; 1; 4; 5; 2; 3; 3];

% The Fixed part of the model
fixed = [int64(2); 1; 1; 2];

% The Random part of the model
rndm = [int64(2), 2,  3;
              0,  0,  0;
              3,  5,  7;
              4,  6,  8;
              3,  2,  9;
             10, 11,  1;
             11, 12, 12;
             12,  0,  0];

% Dependent data
y = [3.1100; 2.8226; 7.4543; 4.4313; 6.1543; -0.1783; 4.6748; 7.0667;
     1.4262; 7.7290; -2.1806; 6.8419; 1.2590; 8.8405; 6.1657; -4.5605;
    -1.2367; -12.2932; -2.3374; 0.0716; 0.1895; 1.5608; -0.8529; -4.1169;
     3.9977; -8.1277; -4.9656; -0.6428; -5.5152; -5.5657; 14.8177; 16.9783;
    13.8966; 14.8166; 19.3640; 9.5299; 12.0102; 6.1551; -1.7048; 2.7640;
     2.8065; 0.0974; -7.8080; -18.0450; -2.8199; 8.9893; 3.7978; -6.3493;
     8.1411; -7.5483; -0.4600; -3.2135; -6.6562; 5.1267; 3.5592; -4.4420;
    -8.5965; -6.3187; -7.8953; -10.1383; -7.8850; 23.2001; 5.5829; -4.3698;
     2.1274; -2.7184; -17.9128; -1.2708; -24.2735; -14.7374; 0.1713; 8.0006;
     1.2100; 3.3307; -22.6713; 7.5562; -7.0694; 3.7159; -4.3135; -14.5577;
   -12.5107; 4.7708; 13.2797; -6.3243; -7.0549; -9.2713; -18.7788; -7.7230;
   -22.7230; -11.6609];

% Independent data
dat = [1, 3, 2, 1, 2, 2,-0.3160, 4, 2, 1, 1, 1;
       1, 1, 1, 3, 1, 2,-1.3377, 1, 4, 1, 1, 1;
       1, 3, 1, 3, 1, 3,-0.7610, 4, 2, 1, 1, 1;
       2, 3, 2, 1, 1, 3,-2.2976, 4, 2, 1, 1, 1;
       2, 2, 1, 3, 2, 3,-0.4263, 2, 1, 1, 1, 1;
       2, 1, 2, 3, 1, 3, 1.4067, 3, 3, 2, 1, 1;
       2, 3, 2, 1, 2, 1,-1.4669, 1, 2, 2, 1, 1;
       1, 1, 1, 3, 2, 3, 0.4717, 2, 4, 2, 1, 1;
       1, 3, 2, 3, 2, 1, 0.4436, 1, 3, 2, 1, 1;
       1, 1, 1, 2, 2, 3,-0.5950, 3, 4, 2, 1, 1;
       1, 3, 1, 3, 1, 1,-1.7981, 4, 2, 1, 2, 1;
       2, 3, 1, 2, 1, 1, 0.2397, 1, 4, 1, 2, 1;
       1, 2, 2, 1, 2, 3, 0.4742, 1, 1, 1, 2, 1;
       2, 2, 2, 2, 2, 3, 0.6888, 3, 1, 1, 2, 1;
       2, 1, 2, 3, 1, 3,-1.0616, 3, 5, 1, 2, 1;
       1, 2, 2, 2, 2, 1,-0.5356, 1, 3, 2, 2, 1;
       1, 3, 2, 2, 1, 1,-1.2963, 2, 5, 2, 2, 1;
       1, 2, 2, 1, 2, 2,-1.5389, 3, 2, 2, 2, 1;
       2, 3, 1, 1, 2, 2,-0.6408, 2, 1, 2, 2, 1;
       1, 2, 2, 2, 1, 1, 0.6574, 1, 1, 2, 2, 1;
       2, 1, 1, 1, 1, 3, 0.9259, 1, 2, 1, 3, 1;
       2, 2, 2, 1, 2, 2, 1.5080, 3, 1, 1, 3, 1;
       2, 3, 1, 1, 1, 3, 2.5821, 2, 3, 1, 3, 1;
       1, 2, 2, 1, 2, 3, 0.4102, 1, 4, 1, 3, 1;
       2, 1, 2, 3, 2, 2, 0.7839, 2, 5, 1, 3, 1;
       1, 2, 2, 3, 2, 1,-1.8812, 4, 2, 2, 3, 1;
       1, 2, 1, 3, 2, 3, 0.7770, 4, 1, 2, 3, 1;
       2, 2, 1, 2, 1, 3, 0.2590, 3, 1, 2, 3, 1;
       2, 3, 2, 2, 2, 3,-0.9250, 3, 3, 2, 3, 1;
       2, 2, 1, 3, 2, 3,-0.4831, 1, 5, 2, 3, 1;
       2, 2, 1, 3, 1, 3, 0.5046, 3, 3, 1, 1, 2;
       2, 1, 1, 2, 2, 1,-0.6903, 2, 1, 1, 1, 2;
       1, 3, 2, 2, 2, 1, 1.6166, 2, 5, 1, 1, 2;
       2, 2, 2, 2, 1, 3, 0.2778, 2, 3, 1, 1, 2;
       2, 3, 2, 2, 1, 2, 1.9586, 4, 2, 1, 1, 2;
       1, 3, 1, 1, 1, 3, 1.0506, 2, 5, 2, 1, 2;
       2, 1, 1, 3, 2, 3, 0.4871, 1, 1, 2, 1, 2;
       2, 1, 2, 3, 2, 1, 2.0891, 4, 4, 2, 1, 2;
       1, 2, 1, 1, 2, 2, 1.4338, 4, 3, 2, 1, 2;
       1, 1, 2, 3, 1, 2,-1.1196, 3, 4, 2, 1, 2;
       1, 3, 1, 1, 2, 1, 0.3367, 3, 2, 1, 2, 2;
       2, 2, 1, 3, 1, 1, 0.1092, 2, 2, 1, 2, 2;
       1, 1, 1, 2, 2, 2, 0.4007, 4, 1, 1, 2, 2;
       2, 3, 1, 1, 1, 2, 0.1460, 3, 5, 1, 2, 2;
       2, 1, 2, 3, 1, 3,-0.3877, 3, 4, 1, 2, 2;
       1, 1, 1, 2, 2, 1, 0.6957, 4, 3, 2, 2, 2;
       2, 1, 1, 1, 2, 1,-0.4664, 3, 3, 2, 2, 2;
       1, 1, 1, 1, 2, 3, 0.2067, 2, 4, 2, 2, 2;
       2, 1, 2, 1, 1, 2, 0.4112, 1, 4, 2, 2, 2;
       2, 2, 1, 1, 1, 2,-1.3734, 3, 3, 2, 2, 2;
       2, 1, 2, 3, 1, 3, 0.7065, 1, 3, 1, 3, 2;
       1, 2, 2, 2, 1, 2, 1.3628, 4, 2, 1, 3, 2;
       2, 1, 2, 2, 2, 3,-0.5052, 4, 5, 1, 3, 2;
       2, 1, 1, 1, 2, 1,-1.3457, 2, 5, 1, 3, 2;
       1, 1, 2, 1, 2, 3,-1.8022, 3, 4, 1, 3, 2;
       2, 3, 1, 2, 1, 1, 0.0116, 2, 4, 2, 3, 2;
       2, 2, 1, 3, 2, 3,-0.9075, 1, 3, 2, 3, 2;
       2, 2, 2, 2, 2, 3,-1.4707, 1, 1, 2, 3, 2;
       2, 2, 1, 1, 2, 1,-1.2938, 2, 3, 2, 3, 2;
       1, 3, 1, 3, 2, 2,-1.1660, 4, 4, 2, 3, 2;
       1, 2, 1, 1, 2, 3, 0.0397, 4, 4, 1, 1, 3;
       1, 3, 1, 2, 1, 3,-0.5987, 3, 2, 1, 1, 3;
       2, 3, 2, 2, 1, 1, 0.6683, 3, 3, 1, 1, 3;
       2, 2, 1, 1, 2, 2,-0.0106, 1, 3, 1, 1, 3;
       1, 2, 1, 3, 2, 2, 0.5885, 1, 3, 1, 1, 3;
       1, 1, 1, 1, 1, 2, 0.4555, 1, 5, 2, 1, 3;
       2, 2, 2, 1, 1, 2, 0.6502, 4, 3, 2, 1, 3;
       1, 1, 1, 3, 1, 1,-0.1601, 1, 3, 2, 1, 3;
       2, 2, 1, 3, 2, 3, 1.6910, 1, 1, 2, 1, 3;
       2, 2, 2, 3, 1, 2, 0.1053, 4, 4, 2, 1, 3;
       2, 1, 2, 3, 2, 2,-0.4037, 3, 4, 1, 2, 3;
       1, 3, 2, 3, 1, 3,-0.5853, 3, 2, 1, 2, 3;
       2, 3, 2, 1, 1, 1,-0.3037, 1, 3, 1, 2, 3;
       1, 3, 1, 1, 2, 2,-0.0774, 1, 4, 1, 2, 3;
       2, 3, 1, 2, 2, 1, 0.4733, 4, 5, 1, 2, 3;
       1, 3, 2, 2, 1, 2,-0.0354, 4, 2, 2, 2, 3;
       1, 3, 2, 2, 1, 1,-0.6640, 2, 1, 2, 2, 3;
       2, 3, 1, 3, 1, 1, 0.0335, 4, 4, 2, 2, 3;
       1, 2, 2, 2, 1, 3, 0.1351, 1, 1, 2, 2, 3;
       1, 1, 2, 1, 2, 3,-0.5951, 3, 4, 2, 2, 3;
       2, 2, 2, 3, 1, 3, 0.2735, 3, 2, 1, 3, 3;
       2, 2, 1, 1, 1, 3, 0.3157, 1, 2, 1, 3, 3;
       2, 2, 2, 1, 1, 1,-1.0843, 2, 3, 1, 3, 3;
       1, 2, 2, 1, 2, 2,-0.0836, 4, 2, 1, 3, 3;
       2, 1, 2, 1, 1, 2,-0.2884, 2, 1, 1, 3, 3;
       2, 3, 2, 3, 2, 3,-0.1006, 1, 2, 2, 3, 3;
       1, 3, 1, 2, 2, 3, 0.5710, 1, 3, 2, 3, 3;
       1, 1, 2, 1, 1, 2, 0.2776, 2, 3, 2, 3, 3;
       2, 3, 2, 2, 1, 3,-0.7561, 4, 4, 2, 3, 3;
       1, 2, 2, 2, 1, 2, 1.5549, 1, 4, 2, 3, 3];

vpr = [int64(1):7];
gamma = zeros(8, 1);
gamma(1) = -1; % Estimate initial values for the variance components

% Call the initialisation routine once to get lrcomm and licomm
lrcomm = int64(0);
licomm = int64(2);
[nff, nlsv, nrf, rcomm, icomm, ifail] = ...
  g02jc(dat, levels, y, fixed, rndm, lrcomm, licomm);
licomm = icomm(1);
lrcomm = icomm(2);

% Pre-process the data
[nff, nlsv, nrf, rcomm, icomm, ifail] = ...
  g02jc(dat, levels, y, fixed, rndm, lrcomm, licomm);

% Use default options
iopt = zeros(0, 0, 'int64');
ropt = zeros(0, 0);

% Total number of coefficient estimates: the fixed effects plus the
% random effects replicated across the subject levels
lb = nff + nrf*nlsv;

% Perform the analysis
[gamma, effn, rnkx, ncov, lnlike, id, b, se, czz, cxx, cxz, ifail] = ...
    g02jd(vpr, nvpr, gamma, lb, rcomm, icomm, iopt, ropt);

% Print results
fprintf('\nNumber of observations (N)                    = %d\n', n);
fprintf('Number of random factors (NRF)                = %d\n', nrf);
fprintf('Number of fixed factors (NFF)                 = %d\n', nff);
fprintf('Number of subject levels (NLSV)               = %d\n', nlsv);
fprintf('Rank of X (RNKX)                              = %d\n', rnkx);
fprintf('Effective N (EFFN)                            = %d\n', effn);
fprintf('Number of non-zero variance components (NCOV) = %d\n', ncov);

fprintf('\nParameter Estimates\n');

tdid = double(nff + nrf*nlsv);

if nrf > 0
  fprintf('\nRandom Effects\n');
end
pb = -999;
pfmt = ' ';
for k = 1:double(nrf*nlsv)
  tb = id(1,k);
  if tb ~= -999
    vid = id(2,k);
    nv = rndm(1,tb);
    ns = rndm(3+nv,tb);
    tfmt = sprintf('%d ', id(3+1:3+double(ns),k));
    if ( (pb ~= tb) || (strcmp(tfmt, pfmt) == 0) )
      if (k ~= 1)
        fprintf('\n');
      end
      fprintf('  Subject: ');
      for l=1:double(ns)
        fprintf(' Variable %2d (Level %2d)',rndm(3+nv+l,tb), id(3+l,k));
      end
      fprintf('\n');
    end
    if (vid==0)
      % Intercept
      fprintf('    Intercept                  %10.4f %10.4f\n', b(k), se(k));
    else
      % vid'th variable specified in rndm
      aid = rndm(2+vid,tb);
      if (id(3,k)==0)
        fprintf('    Variable %2d                %10.4f %10.4f\n', aid, b(k), se(k));
      else
        fprintf('    Variable %2d (Level %2d)     %10.4f %10.4f\n', aid, id(3,k), b(k), se(k));
      end
    end
    pfmt = tfmt;
  end
  pb = tb;
end

if nff>0
  fprintf('\nFixed Effects\n');
end
for k = double(nrf*nlsv + 1):tdid
  vid = id(2,k);
  if vid~=-999
    if vid==0
      % Intercept
      fprintf('    Intercept                  %10.4f %10.4f\n', b(k), se(k));
    else
      % vid'th variable specified in fixed
      aid = fixed(2+vid);
      if (id(3,k)==0)
        fprintf('    Variable %2d                %10.4f %10.4f\n', aid, b(k), se(k));
      else
        fprintf('    Variable %2d (Level %2d)     %10.4f %10.4f\n', aid, id(3,k), b(k), se(k));
      end
    end
  end
end

fprintf('\nVariance Components\n');
fprintf('  Estimate     Parameter       Subject\n');
for k = 1:double(nvpr)
  fprintf('%10.5f     ', gamma(k));
  p = 0;
  for tb = 1:double(nrndm)
    nv = double(rndm(1,tb));
    ns = double(rndm(3+nv,tb));
    if (rndm(2,tb)==1)
      p = p + 1;
      if (vpr(p)==k)
        fprintf('Intercept       Variables ');
        fprintf('%2d ', rndm(3+nv+1:3+nv+ns, tb));
        fprintf('\n');
      end
    end
    for i = 1:nv
      p = p + 1;
      if (vpr(p)==k)
       fprintf('Variable %2d     Variables ', rndm(2+i,tb));
       fprintf('%2d ', rndm(3+nv+1:3+nv+ns, tb));
      end
    end
  end
  fprintf('\n');
end

fprintf('\nsigma^2          = %15.5f\n', gamma(nvpr+1));
fprintf('-2log likelihood = %15.5f\n', lnlike);
 
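The variance/covariance structure that g02jd assumes (see Description) can be sketched directly: G is diagonal, with every random effect in group i sharing the variance sigma_i^2, and R = sigma_R^2 I for the residuals. The following illustrative Python builds Var([nu; eps]) for hypothetical group sizes and variance components (the numbers are invented for illustration, not taken from this example):

```python
import numpy as np

# Hypothetical group sizes q_i and variance components sigma_i^2 for the
# random effects; illustrative values only.
group_sizes = [3, 2, 4]
sigma2 = [36.3, 12.5, 19.6]

# G is diagonal: each random effect in group i has variance sigma_i^2.
g_diag = np.concatenate([np.full(q, s) for q, s in zip(group_sizes, sigma2)])
G = np.diag(g_diag)

# R = sigma_R^2 * I for the n residuals.
n, sigma_r2 = 5, 0.9
R = sigma_r2 * np.eye(n)

# Var([nu; eps]) is block diagonal: [[G, 0], [0, R]].
q = G.shape[0]
V = np.block([[G, np.zeros((q, n))],
              [np.zeros((n, q)), R]])
print(V.shape)  # (14, 14)
```

Because both blocks are diagonal, the full matrix V is itself diagonal; g02jd exploits this structure rather than forming V explicitly.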

Number of observations (N)                    = 90
Number of random factors (NRF)                = 55
Number of fixed factors (NFF)                 = 4
Number of subject levels (NLSV)               = 3
Rank of X (RNKX)                              = 4
Effective N (EFFN)                            = 90
Number of non-zero variance components (NCOV) = 7

Parameter Estimates

Random Effects
  Subject:  Variable 10 (Level  1) Variable 11 (Level  1) Variable 12 (Level  1)
    Variable  3 (Level  1)         2.1561     3.7946
    Variable  3 (Level  2)         1.8951     3.9284
    Variable  4 (Level  1)         0.6496     3.1617

  Subject:  Variable 10 (Level  1) Variable 11 (Level  1) Variable 12 (Level  1)
    Variable  4 (Level  3)         0.7390     3.1424

  Subject:  Variable 10 (Level  2) Variable 11 (Level  1) Variable 12 (Level  1)
    Variable  3 (Level  1)         1.4216     3.3773
    Variable  3 (Level  2)        -2.8921     3.3953
    Variable  4 (Level  1)         3.6789     2.3162
    Variable  4 (Level  2)        -1.9742     2.3887
    Variable  4 (Level  3)        -2.2088     2.0697

  Subject:  Variable 10 (Level  1) Variable 11 (Level  2) Variable 12 (Level  1)
    Variable  3 (Level  1)        -2.9659     3.9127
    Variable  3 (Level  2)         2.7951     4.7183
    Variable  4 (Level  1)        -4.7330     2.3094
    Variable  4 (Level  2)         5.5161     2.2330
    Variable  4 (Level  3)        -0.8417     2.3826

  Subject:  Variable 10 (Level  2) Variable 11 (Level  2) Variable 12 (Level  1)
    Variable  3 (Level  1)         4.2202     3.6675
    Variable  3 (Level  2)        -4.3883     3.4424
    Variable  4 (Level  1)        -1.1391     3.2187
    Variable  4 (Level  2)         1.0814     3.0654

  Subject:  Variable 10 (Level  1) Variable 11 (Level  3) Variable 12 (Level  1)
    Variable  3 (Level  1)         0.3391     4.0647
    Variable  3 (Level  2)         0.1502     3.4787
    Variable  4 (Level  1)        -1.0026     2.4363

  Subject:  Variable 10 (Level  1) Variable 11 (Level  3) Variable 12 (Level  1)
    Variable  4 (Level  3)         1.1703     2.6365

  Subject:  Variable 10 (Level  2) Variable 11 (Level  3) Variable 12 (Level  1)
    Variable  3 (Level  1)         1.2658     3.4819
    Variable  3 (Level  2)        -1.5356     3.9097

  Subject:  Variable 10 (Level  2) Variable 11 (Level  3) Variable 12 (Level  1)
    Variable  4 (Level  2)         0.7992     2.7902
    Variable  4 (Level  3)        -0.8916     2.8763

  Subject:  Variable 11 (Level  1) Variable 12 (Level  1)
    Variable  5 (Level  1)        -0.4885     2.8206
    Variable  5 (Level  2)         1.8829     2.7530
    Variable  6 (Level  1)         0.9249     3.7747
    Variable  6 (Level  2)        -2.3568     3.1624
    Variable  6 (Level  3)         4.3117     3.1474

  Subject:  Variable 11 (Level  2) Variable 12 (Level  1)
    Variable  5 (Level  1)         1.3898     2.9362
    Variable  5 (Level  2)        -1.5729     2.8909
    Variable  6 (Level  1)         0.2111     3.9967
    Variable  6 (Level  2)        -3.7083     4.2866
    Variable  6 (Level  3)         3.1190     4.7983

  Subject:  Variable 11 (Level  3) Variable 12 (Level  1)
    Variable  5 (Level  1)         1.7352     3.1370
    Variable  5 (Level  2)        -1.6165     3.1713
    Variable  6 (Level  1)        -1.1102     3.9374
    Variable  6 (Level  2)         4.4877     3.6980
    Variable  6 (Level  3)        -3.1325     3.1966

  Subject:  Variable 12 (Level  1)
    Variable  7                    0.6827     0.5060
    Variable  8 (Level  1)         1.5964     1.3206
    Variable  8 (Level  2)        -0.7533     1.5663
    Variable  8 (Level  3)         0.4035     1.6840
    Variable  8 (Level  4)        -0.8523     1.7518
    Variable  9 (Level  1)         0.5699     1.6236
    Variable  9 (Level  2)         0.0012     1.9111
    Variable  9 (Level  3)        -0.2850     1.9245
    Variable  9 (Level  4)         0.4468     2.0329
    Variable  9 (Level  5)         0.0030     2.1390

  Subject:  Variable 10 (Level  1) Variable 11 (Level  1) Variable 12 (Level  2)
    Variable  3 (Level  1)         6.2551     3.3595
    Variable  3 (Level  2)         5.6085     3.4127

  Subject:  Variable 10 (Level  1) Variable 11 (Level  1) Variable 12 (Level  2)
    Variable  4 (Level  2)         2.6922     2.7542
    Variable  4 (Level  3)         1.3742     2.8068

  Subject:  Variable 10 (Level  2) Variable 11 (Level  1) Variable 12 (Level  2)
    Variable  3 (Level  1)         1.5647     3.8353
    Variable  3 (Level  2)        -2.7565     3.9041
    Variable  4 (Level  1)        -0.8621     2.8257

  Subject:  Variable 10 (Level  2) Variable 11 (Level  1) Variable 12 (Level  2)
    Variable  4 (Level  3)         0.4536     2.8070

  Subject:  Variable 10 (Level  1) Variable 11 (Level  2) Variable 12 (Level  2)
    Variable  3 (Level  1)       -10.1544     3.3433
    Variable  3 (Level  2)         3.2446     4.1221
    Variable  4 (Level  1)        -2.9419     2.3508
    Variable  4 (Level  2)         0.2510     3.0675
    Variable  4 (Level  3)         0.3224     2.9710

  Subject:  Variable 10 (Level  2) Variable 11 (Level  2) Variable 12 (Level  2)
    Variable  3 (Level  1)        -1.3577     3.1925
    Variable  3 (Level  2)         8.1277     3.9975
    Variable  4 (Level  1)        -0.4290     2.4578
    Variable  4 (Level  2)         2.7495     2.5821

  Subject:  Variable 10 (Level  1) Variable 11 (Level  3) Variable 12 (Level  2)
    Variable  3 (Level  1)         4.8432     4.0069
    Variable  3 (Level  2)         0.0370     3.6006
    Variable  4 (Level  1)         3.0713     2.2706
    Variable  4 (Level  2)        -1.8899     2.4756
    Variable  4 (Level  3)         0.4914     2.2914

  Subject:  Variable 10 (Level  2) Variable 11 (Level  3) Variable 12 (Level  2)
    Variable  3 (Level  1)        -4.4766     3.3355
    Variable  3 (Level  2)        -3.7936     4.0759
    Variable  4 (Level  1)        -0.5459     2.7097
    Variable  4 (Level  2)        -1.5619     2.7412
    Variable  4 (Level  3)        -0.7269     2.9735

  Subject:  Variable 11 (Level  1) Variable 12 (Level  2)
    Variable  5 (Level  1)         4.8653     3.0706
    Variable  5 (Level  2)         0.9011     3.0696
    Variable  6 (Level  1)         6.9277     3.8411
    Variable  6 (Level  2)        -1.3108     3.1667
    Variable  6 (Level  3)         6.2916     3.5327

  Subject:  Variable 11 (Level  2) Variable 12 (Level  2)
    Variable  5 (Level  1)        -0.4047     3.0956
    Variable  5 (Level  2)         0.3291     3.0784
    Variable  6 (Level  1)         6.9096     3.3073
    Variable  6 (Level  2)        -1.0680     3.6213
    Variable  6 (Level  3)        -5.9977     3.7299

  Subject:  Variable 11 (Level  3) Variable 12 (Level  2)
    Variable  5 (Level  1)        -1.0925     3.0994
    Variable  5 (Level  2)        -0.7392     2.9900
    Variable  6 (Level  1)         2.7758     3.8748
    Variable  6 (Level  2)        -6.3526     3.3014
    Variable  6 (Level  3)        -0.2060     3.6481

  Subject:  Variable 12 (Level  2)
    Variable  7                    0.1711     0.5785
    Variable  8 (Level  1)         1.7186     1.9143
    Variable  8 (Level  2)        -0.6768     1.7352
    Variable  8 (Level  3)        -0.0439     1.6395
    Variable  8 (Level  4)         0.1463     1.5358
    Variable  9 (Level  1)         0.9761     2.3930
    Variable  9 (Level  2)         6.5436     1.8193
    Variable  9 (Level  3)        -1.5504     1.8527
    Variable  9 (Level  4)         0.1047     2.0244
    Variable  9 (Level  5)        -3.9386     1.7937

  Subject:  Variable 10 (Level  1) Variable 11 (Level  1) Variable 12 (Level  3)
    Variable  3 (Level  1)        10.6802     3.2596
    Variable  3 (Level  2)        -1.0290     3.7842
    Variable  4 (Level  1)        -2.8612     2.2917
    Variable  4 (Level  2)         3.9265     2.8934
    Variable  4 (Level  3)         2.2427     2.3737

  Subject:  Variable 10 (Level  2) Variable 11 (Level  1) Variable 12 (Level  3)
    Variable  3 (Level  1)        -6.2076     3.3642
    Variable  3 (Level  2)        -8.7670     3.8463
    Variable  4 (Level  1)        -2.9251     2.4657

  Subject:  Variable 10 (Level  2) Variable 11 (Level  1) Variable 12 (Level  3)
    Variable  4 (Level  3)        -2.2077     2.3743

  Subject:  Variable 10 (Level  1) Variable 11 (Level  2) Variable 12 (Level  3)
    Variable  3 (Level  1)        -3.3334     3.4665
    Variable  3 (Level  2)        -0.3111     3.2650
    Variable  4 (Level  1)         1.5131     2.4890
    Variable  4 (Level  2)        -3.0345     3.0562
    Variable  4 (Level  3)         0.2722     2.8300

  Subject:  Variable 10 (Level  2) Variable 11 (Level  2) Variable 12 (Level  3)
    Variable  3 (Level  1)         6.5905     4.0386
    Variable  3 (Level  2)        -5.3168     3.4549
    Variable  4 (Level  1)        -3.5280     2.9663
    Variable  4 (Level  2)         1.7056     2.9293
    Variable  4 (Level  3)         2.2590     3.1780

  Subject:  Variable 10 (Level  1) Variable 11 (Level  3) Variable 12 (Level  3)
    Variable  3 (Level  1)         8.1889     4.1429
    Variable  3 (Level  2)        -1.5388     3.3333
    Variable  4 (Level  1)         3.4338     2.6376

  Subject:  Variable 10 (Level  1) Variable 11 (Level  3) Variable 12 (Level  3)
    Variable  4 (Level  3)        -1.1544     2.9885

  Subject:  Variable 10 (Level  2) Variable 11 (Level  3) Variable 12 (Level  3)
    Variable  3 (Level  1)        -4.4243     4.0049
    Variable  3 (Level  2)        -4.1349     3.1248
    Variable  4 (Level  1)         1.0460     2.6550
    Variable  4 (Level  2)        -4.4844     2.2843
    Variable  4 (Level  3)         0.5046     2.6926

  Subject:  Variable 11 (Level  1) Variable 12 (Level  3)
    Variable  5 (Level  1)         5.3030     3.0278
    Variable  5 (Level  2)        -8.1794     3.1335
    Variable  6 (Level  1)        -0.8188     3.7810
    Variable  6 (Level  2)        -2.5078     3.1514
    Variable  6 (Level  3)        -2.6138     3.4600

  Subject:  Variable 11 (Level  2) Variable 12 (Level  3)
    Variable  5 (Level  1)         4.3331     3.1489
    Variable  5 (Level  2)        -5.6142     3.1649
    Variable  6 (Level  1)        -5.8804     3.1770
    Variable  6 (Level  2)         5.4265     3.3006
    Variable  6 (Level  3)        -2.1917     3.2156

  Subject:  Variable 11 (Level  3) Variable 12 (Level  3)
    Variable  5 (Level  1)         0.4305     2.9144
    Variable  5 (Level  2)        -1.4620     3.0119
    Variable  6 (Level  1)        14.3595     3.9254
    Variable  6 (Level  2)        -5.2399     3.3099
    Variable  6 (Level  3)       -11.2498     3.2212

  Subject:  Variable 12 (Level  3)
    Variable  7                   -0.3839     0.6755
    Variable  8 (Level  1)         2.7549     1.6017
    Variable  8 (Level  2)         0.4377     1.8826
    Variable  8 (Level  3)        -0.2261     1.9909
    Variable  8 (Level  4)        -4.5051     1.5398
    Variable  9 (Level  1)        -4.7091     2.1458
    Variable  9 (Level  2)         3.7940     1.9872
    Variable  9 (Level  3)        -1.7994     1.8614
    Variable  9 (Level  4)         0.4480     1.9016
    Variable  9 (Level  5)        -0.6047     2.4729

Fixed Effects
    Intercept                      1.6433     2.4596
    Variable  1 (Level  2)        -1.6224     0.8549
    Variable  2 (Level  2)        -2.4817     1.1414
    Variable  2 (Level  3)         0.4624     1.2133

Variance Components
  Estimate     Parameter       Subject
  36.32491     Variable  3     Variables 10 11 12 
  12.45090     Variable  4     Variables 10 11 12 
  19.62767     Variable  5     Variables 11 12 
  40.53480     Variable  6     Variables 11 12 
   0.56320     Variable  7     Variables 12 
   5.81968     Variable  8     Variables 12 
  10.86069     Variable  9     Variables 12 

sigma^2          =         0.00239
-2log likelihood =       608.19449

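A common post-processing step (not part of the NAG example) is to form approximate Wald z-statistics for the fixed effects by dividing each estimate by its standard error; in mixed models this normal approximation is only rough. The values below are copied from the Fixed Effects table above:

```python
# Fixed-effect estimates and standard errors from the example results.
fixed = {
    "Intercept":            (1.6433, 2.4596),
    "Variable 1 (Level 2)": (-1.6224, 0.8549),
    "Variable 2 (Level 2)": (-2.4817, 1.1414),
    "Variable 2 (Level 3)": (0.4624, 1.2133),
}

# Approximate Wald z-statistic: estimate / standard error.
z = {name: est / se for name, (est, se) in fixed.items()}
for name, zval in z.items():
    print(f"{name:22s} z = {zval:7.3f}")
```

None of the |z| values exceed 2.2, consistent with the fairly large standard errors reported for this simulated data set.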


© The Numerical Algorithms Group Ltd, Oxford, UK. 2009–2013