
NAG Toolbox

## Purpose

nag_correg_linregm_rssq (g02ea) calculates the residual sums of squares for all possible linear regressions for a given set of independent variables.

## Syntax

[nmod, modl, rss, nterms, mrank, ifail] = g02ea(mean, x, vname, isx, y, 'n', n, 'm', m, 'wt', wt)
[nmod, modl, rss, nterms, mrank, ifail] = nag_correg_linregm_rssq(mean, x, vname, isx, y, 'n', n, 'm', m, 'wt', wt)

## Description

For a set of $k$ possible independent variables there are $2^k$ linear regression models, containing from zero to $k$ independent variables each. For example, if $k = 3$ and the variables are $A$, $B$ and $C$, then the possible models are:
 (i) null model; (ii) $A$; (iii) $B$; (iv) $C$; (v) $A$ and $B$; (vi) $A$ and $C$; (vii) $B$ and $C$; (viii) $A$, $B$ and $C$.
nag_correg_linregm_rssq (g02ea) calculates the residual sums of squares from each of the $2^k$ possible models. The method used involves a $QR$ decomposition of the matrix of possible independent variables. Independent variables are then moved into and out of the model by a series of Givens rotations and the residual sums of squares are computed for each model; see Clark (1981) and Smith and Bremner (1989).
The computed residual sums of squares are then ordered first by increasing number of terms in the model, then by decreasing size of residual sums of squares. So the first model will always have the largest residual sum of squares and the $2^k$th will always have the smallest. This aids you in selecting the best possible model from the given set of independent variables.
nag_correg_linregm_rssq (g02ea) allows you to specify some independent variables that must be in the model, the forced variables. The other independent variables, from which the possible models are to be formed, are the free variables.
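The search described above can be mimicked outside the Toolbox. The following Python sketch is illustrative only: the function name and arguments are my own, and it refits each model from scratch with an ordinary least-squares solve rather than moving between models by Givens rotations as g02ea does. It enumerates the $2^k$ subsets of the free variables and orders the residual sums of squares the same way.

```python
from itertools import combinations
import numpy as np

def all_subsets_rss(X, y, free, forced=(), mean=True):
    """Residual sums of squares for all 2**k models over the free columns.

    Illustrative stand-in for g02ea: each model is solved independently
    with lstsq instead of being reached by Givens rotations.
    """
    n = len(y)
    results = []
    for r in range(len(free) + 1):
        for subset in combinations(free, r):
            cols = list(forced) + list(subset)
            Z = X[:, cols] if cols else np.empty((n, 0))
            if mean:
                # prepend an intercept column when a mean term is fitted
                Z = np.column_stack([np.ones(n), Z])
            if Z.shape[1] == 0:
                rss = float(y @ y)  # null model with no mean term
            else:
                beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
                rss = float(np.sum((y - Z @ beta) ** 2))
            results.append((subset, rss))
    # order by increasing number of terms, then by decreasing RSS
    results.sort(key=lambda t: (len(t[0]), -t[1]))
    return results
```

With this ordering the first entry is always the null (or forced-only) model with the largest residual sum of squares, and the last entry is the full model with the smallest, matching the ordering of rss, nterms and mrank returned by g02ea.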

## References

Clark M R B (1981) A Givens algorithm for moving from one linear model to another without going back to the data. Appl. Statist. 30 198–203
Smith D M and Bremner J M (1989) All possible subset regressions using the $QR$ decomposition. Comput. Statist. Data Anal. 7 217–236
Weisberg S (1985) Applied Linear Regression. Wiley

## Parameters

### Compulsory Input Parameters

1:     mean – string (length ≥ 1)
Indicates if a mean term is to be included.
mean = 'M'
A mean term (intercept) will be included in the model.
mean = 'Z'
The model will pass through the origin (zero point).
Constraint: mean = 'M' or 'Z'.
2:     x(ldx,m) – double array
ldx, the first dimension of the array, must satisfy the constraint ldx ≥ n.
x(i,j) must contain the ith observation for the jth independent variable, for i = 1,2,…,n and j = 1,2,…,m.
3:     vname(m) – cell array of strings
m, the dimension of the array, must satisfy the constraint m ≥ 2.
vname(j) must contain the name of the variable in column j of x, for j = 1,2,…,m.
4:     isx(m) – int64 array
m, the dimension of the array, must satisfy the constraint m ≥ 2.
Indicates which independent variables are to be considered in the model.
isx(j) ≥ 2
The variable contained in the jth column of x is included in all regression models, i.e., is a forced variable.
isx(j) = 1
The variable contained in the jth column of x is included in the set from which the regression models are chosen, i.e., is a free variable.
isx(j) = 0
The variable contained in the jth column of x is not included in the models.
Constraints:
• isx(j) ≥ 0, for j = 1,2,…,m;
• at least one value of isx = 1.
5:     y(n) – double array
n, the dimension of the array, must satisfy the constraints:
• n ≥ 2;
• n ≥ m′, where m′ is the number of independent variables to be considered (forced plus free plus mean if included), as specified by mean and isx.
y(i) must contain the ith observation on the dependent variable, $y_i$, for i = 1,2,…,n.

### Optional Input Parameters

1:     n – int64 scalar
Default: the dimension of the array y and the first dimension of the array x. (An error is raised if these dimensions are not equal.)
n, the number of observations.
Constraints:
• n ≥ 2;
• n ≥ m′, where m′ is the number of independent variables to be considered (forced plus free plus mean if included), as specified by mean and isx.
2:     m – int64 scalar
Default: the dimension of the arrays vname, isx and the second dimension of the array x. (An error is raised if these dimensions are not equal.)
The number of variables contained in x.
Constraint: m ≥ 2.
3:     wt(:) – double array
Note: the dimension of the array wt must be at least n if weight = 'W'.
If weight = 'W', wt must contain the weights to be used in the weighted regression.
If wt(i) = 0.0, the ith observation is not included in the model, in which case the effective number of observations is the number of observations with nonzero weights.
If weight = 'U', wt is not referenced and the effective number of observations is n.
Constraint: if weight = 'W', wt(i) ≥ 0.0, for i = 1,2,…,n.

### Input Parameters Omitted from the MATLAB Interface

weight ldx ldmodl wk

### Output Parameters

1:     nmod – int64 scalar
The total number of models for which residual sums of squares have been calculated.
2:     modl(ldmodl,m) – cell array of strings
ldmodl = max($2^k$, m).
The first nterms(i) elements of the ith row of modl contain the names of the independent variables, as given in vname, that are included in the ith model.
3:     rss(ldmodl) – double array
ldmodl = max($2^k$, m).
rss(i) contains the residual sum of squares for the ith model, for i = 1,2,…,nmod.
4:     nterms(ldmodl) – int64 array
ldmodl = max($2^k$, m).
nterms(i) contains the number of independent variables in the ith model, not including the mean if one is fitted, for i = 1,2,…,nmod.
5:     mrank(ldmodl) – int64 array
ldmodl = max($2^k$, m).
mrank(i) contains the rank of the residual sum of squares for the ith model.
6:     ifail – int64 scalar
ifail = 0 unless the function detects an error (see Error Indicators and Warnings).

## Error Indicators and Warnings

Errors or warnings detected by the function:
ifail = 1
On entry, n < 2, or m < 2, or ldx < n, or ldmodl < m, or mean ≠ 'M' or 'Z', or weight ≠ 'U' or 'W'.
ifail = 2
On entry, weight = 'W' and a value of wt < 0.0.
ifail = 3
On entry, a value of isx < 0, or there are no free variables, i.e., no element of isx = 1.
ifail = 4
On entry, ldmodl < the number of possible models, $2^k$, where $k$ is the number of free independent variables specified by isx.
ifail = 5
On entry, the number of independent variables to be considered (forced plus free plus mean if included) is greater than or equal to the effective number of observations.
ifail = 6
The full model is not of full rank, i.e., some of the independent variables may be linear combinations of other independent variables. Variables must be excluded from the model in order to give full rank.
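This rank deficiency can be anticipated by checking the rank of the full design matrix before the call. A minimal sketch in Python (numpy assumed; the helper name and arguments are hypothetical, not part of the Toolbox):

```python
import numpy as np

def full_model_has_full_rank(X, cols, mean=True):
    """Return True if the design matrix built from the selected columns
    (plus an intercept column when mean is True) has full column rank."""
    Z = X[:, list(cols)]
    if mean:
        Z = np.column_stack([np.ones(Z.shape[0]), Z])
    return np.linalg.matrix_rank(Z) == Z.shape[1]
```

If this returns False, drop one of the linearly dependent columns (for example by setting its isx entry to 0) before calling the function.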

## Accuracy

For a discussion of the improved accuracy obtained by using a method based on the $QR$ decomposition, see Smith and Bremner (1989).

## Further Comments

nag_correg_linregm_rssq_stat (g02ec) may be used to compute $R^2$ and $C_p$ values from the results of nag_correg_linregm_rssq (g02ea).
If a mean has been included in the model and no variables are forced in, then rss(1) contains the total sum of squares, and in many situations a reasonable estimate of the variance of the errors is given by rss(nmod)/(n − 1 − nterms(nmod)).
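Under those assumptions (a mean fitted, no forced variables, models ordered as described), per-model $R^2$ values and the full-model error-variance estimate can be recovered from the returned sums of squares alone. A sketch in Python; the function name is my own, and g02ec remains the supported route to $R^2$ and $C_p$:

```python
def summarize_models(rss, nterms, n):
    """Per-model R**2 and a full-model error-variance estimate.

    rss and nterms mirror the arrays returned by g02ea (plain lists here);
    assumes a mean was fitted and no variables were forced, so rss[0] is
    the total sum of squares about the mean.
    """
    tss = rss[0]
    r2 = [1.0 - r / tss for r in rss]
    # variance estimate rss(nmod) / (n - 1 - nterms(nmod)) for the full model
    sigma2 = rss[-1] / (n - 1 - nterms[-1])
    return r2, sigma2
```

Applied to the example below (n = 20, rss(1) = 5.0634, rss(32) = 0.9652, nterms(32) = 5), this gives a full-model variance estimate of 0.9652/14.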

## Example

```matlab
function nag_correg_linregm_rssq_example
mean_p = 'M';
x = [0, 1125, 232, 7160, 85.9, 8905;
7, 920, 268, 8804, 86.5, 7388;
15, 835, 271, 8108, 85.2, 5348;
22, 1000, 237, 6370, 83.8, 8056;
29, 1150, 192, 6441, 82.1, 6960;
37, 990, 202, 5154, 79.2, 5690;
44, 840, 184, 5896, 81.2, 6932;
58, 650, 200, 5336, 80.6, 5400;
65, 640, 180, 5041, 78.4, 3177;
72, 583, 165, 5012, 79.3, 4461;
80, 570, 151, 4825, 78.7, 3901;
86, 570, 171, 4391, 78, 5002;
93, 510, 243, 4320, 72.3, 4665;
100, 555, 147, 3709, 74.9, 4642;
107, 460, 286, 3969, 74.4, 4840;
122, 275, 198, 3558, 72.5, 4479;
129, 510, 196, 4361, 57.7, 4200;
151, 165, 210, 3301, 71.8, 3410;
171, 244, 327, 2964, 72.5, 3360;
220, 79, 334, 2777, 71.9, 2599];
vname = {'DAY'; 'BOD'; 'TKN'; 'TS '; 'TVS'; 'COD'};
isx = [int64(0);1;1;1;1;1];
y = [1.5563;
0.8976;
0.7482;
0.716;
0.301;
0.3617;
0.1139;
0.1139;
-0.2218;
-0.1549;
0;
0;
-0.0969;
-0.2218;
-0.3979;
-0.1549;
-0.2218;
-0.3979;
-0.5229;
-0.0458];
[nmod, model, rss, nterms, mrank, ifail] = nag_correg_linregm_rssq(mean_p, x, vname, isx, y)
```
```

nmod =

32

model =

''       ''       ''       ''       ''       ''
'TKN'    ''       ''       ''       ''       ''
'TVS'    ''       ''       ''       ''       ''
'BOD'    ''       ''       ''       ''       ''
'COD'    ''       ''       ''       ''       ''
'TS '    ''       ''       ''       ''       ''
'TKN'    'TVS'    ''       ''       ''       ''
'BOD'    'TVS'    ''       ''       ''       ''
'BOD'    'TKN'    ''       ''       ''       ''
'BOD'    'COD'    ''       ''       ''       ''
'TKN'    'TS '    ''       ''       ''       ''
'TS '    'TVS'    ''       ''       ''       ''
'BOD'    'TS '    ''       ''       ''       ''
'TKN'    'COD'    ''       ''       ''       ''
'TVS'    'COD'    ''       ''       ''       ''
'TS '    'COD'    ''       ''       ''       ''
'BOD'    'TKN'    'TVS'    ''       ''       ''
'TKN'    'TS '    'TVS'    ''       ''       ''
'BOD'    'TS '    'TVS'    ''       ''       ''
'BOD'    'TVS'    'COD'    ''       ''       ''
'BOD'    'TKN'    'COD'    ''       ''       ''
'BOD'    'TKN'    'TS '    ''       ''       ''
'TKN'    'TVS'    'COD'    ''       ''       ''
'BOD'    'TS '    'COD'    ''       ''       ''
'TS '    'TVS'    'COD'    ''       ''       ''
'TKN'    'TS '    'COD'    ''       ''       ''
'BOD'    'TKN'    'TS '    'TVS'    ''       ''
'BOD'    'TKN'    'TVS'    'COD'    ''       ''
'BOD'    'TS '    'TVS'    'COD'    ''       ''
'BOD'    'TKN'    'TS '    'COD'    ''       ''
'TKN'    'TS '    'TVS'    'COD'    ''       ''
'BOD'    'TKN'    'TS '    'TVS'    'COD'    ''

rss =

5.0634
5.0219
2.5044
2.0338
1.5563
1.5370
2.4381
1.7462
1.5921
1.4963
1.4707
1.4590
1.4397
1.4388
1.3287
1.0850
1.4257
1.3900
1.3894
1.3204
1.2764
1.2582
1.2179
1.0644
1.0634
0.9871
1.2199
1.1565
1.0388
0.9871
0.9653
0.9652

nterms =

0
1
1
1
1
1
2
2
2
2
2
2
2
2
2
2
3
3
3
3
3
3
3
3
3
3
4
4
4
4
4
5

mrank =

32
31
30
28
25
24
29
27
26
23
22
21
20
19
15
8
18
17
16
14
13
12
10
7
6
4
11
9
5
3
2
1

ifail =

0

```