NAG Library Routine Document
g05pvf (kfold_xyw)
1
Purpose
g05pvf generates training and validation datasets suitable for use in cross-validation or jack-knifing.
2
Specification
Fortran Interface
Subroutine g05pvf (k, fold, n, m, sordx, x, ldx, usey, y, usew, w, nt, state, ifail)
Integer, Intent (In) :: k, fold, n, m, sordx, ldx, usey, usew
Integer, Intent (Inout) :: state(*), ifail
Integer, Intent (Out) :: nt
Real (Kind=nag_wp), Intent (Inout) :: x(ldx,*), y(*), w(*)

C Header Interface
#include <nagmk26.h>
void 
g05pvf_ (const Integer *k, const Integer *fold, const Integer *n, const Integer *m, const Integer *sordx, double x[], const Integer *ldx, const Integer *usey, double y[], const Integer *usew, double w[], Integer *nt, Integer state[], Integer *ifail) 

3
Description
Let ${X}_{o}$ denote a matrix of $n$ observations on $m$ variables and ${y}_{o}$ and ${w}_{o}$ each denote a vector of length $n$. For example, ${X}_{o}$ might represent a matrix of independent variables, ${y}_{o}$ the dependent variable and ${w}_{o}$ the associated weights in a weighted regression.
g05pvf generates a series of training datasets, denoted by the matrix, vector, vector triplet $\left({X}_{t},{y}_{t},{w}_{t}\right)$ of ${n}_{t}$ observations, and validation datasets, denoted $\left({X}_{v},{y}_{v},{w}_{v}\right)$ with ${n}_{v}$ observations. These training and validation datasets are generated as follows.
Each of the original $n$ observations is randomly assigned to one of $K$ equally sized groups or folds. For the $k$th sample the validation dataset consists of those observations in group $k$ and the training dataset consists of all those observations not in group $k$. Therefore at most $K$ samples can be generated.
If $n$ is not divisible by $K$ then the observations are assigned to groups as evenly as possible; therefore any group will be at most one observation larger or smaller than any other group.
When using $K=n$ the resulting datasets are suitable for leave-one-out cross-validation, or the training dataset on its own for jack-knifing. When using $K\ne n$ the resulting datasets are suitable for $K$-fold cross-validation. Datasets suitable for reversed cross-validation can be obtained by switching the training and validation datasets, i.e., use the $k$th group as the training dataset and the rest of the data as the validation dataset.
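The assignment scheme described above can be sketched as follows. This is an illustrative Python sketch of the same idea only, not NAG's internal algorithm; the names kfold_assign and split are hypothetical:

```python
import random

def kfold_assign(n, k, seed=0):
    """Randomly assign n observations to k folds whose sizes differ
    by at most one. Illustrative sketch only -- not NAG's algorithm."""
    # Repeat the fold labels 1..k until n labels exist, then shuffle,
    # so group sizes are as equal as possible.
    labels = [(i % k) + 1 for i in range(n)]
    random.Random(seed).shuffle(labels)
    return labels

def split(labels, fold):
    """Indices of the training set (observations not in `fold`) and the
    validation set (observations in `fold`)."""
    training = [i for i, g in enumerate(labels) if g != fold]
    validation = [i for i, g in enumerate(labels) if g == fold]
    return training, validation
```

Calling split once for each fold $1,\dots ,K$ reproduces the $K$ training/validation pairs described above.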
One of the initialization routines
g05kff (for a repeatable sequence if computed sequentially) or
g05kgf (for a nonrepeatable sequence) must be called prior to the first call to
g05pvf.
4
References
None.
5
Arguments
 1: $\mathbf{k}$ – Integer Input

On entry: $K$, the number of folds.
Constraint:
$2\le {\mathbf{k}}\le {\mathbf{n}}$.
 2: $\mathbf{fold}$ – Integer Input

On entry: the number of the fold to return as the validation dataset.
On the first call to
g05pvf ${\mathbf{fold}}$ should be set to
$1$ and then incremented by one at each subsequent call until all
$K$ sets of training and validation datasets have been produced. See
Section 9 for more details on how a different calling sequence can be used.
Constraint:
$1\le {\mathbf{fold}}\le {\mathbf{k}}$.
 3: $\mathbf{n}$ – Integer Input

On entry: $n$, the number of observations.
Constraint:
${\mathbf{n}}\ge 1$.
 4: $\mathbf{m}$ – Integer Input

On entry: $m$, the number of variables.
Constraint:
${\mathbf{m}}\ge 1$.
 5: $\mathbf{sordx}$ – Integer Input

On entry: determines how variables are stored in
x.
Constraint:
${\mathbf{sordx}}=1$ or $2$.
 6: $\mathbf{x}\left({\mathbf{ldx}},*\right)$ – Real (Kind=nag_wp) array Input/Output

Note: the second dimension of the array
x
must be at least
${\mathbf{m}}$ if
${\mathbf{sordx}}=1$ and at least
${\mathbf{n}}$ if
${\mathbf{sordx}}=2$.
The way the data is stored in
x is defined by
sordx.
If ${\mathbf{sordx}}=1$, ${\mathbf{x}}\left(\mathit{i},\mathit{j}\right)$ contains the $\mathit{i}$th observation for the $\mathit{j}$th variable, for $i=1,2,\dots ,{\mathbf{n}}$ and $j=1,2,\dots ,{\mathbf{m}}$.
If ${\mathbf{sordx}}=2$, ${\mathbf{x}}\left(\mathit{j},\mathit{i}\right)$ contains the $\mathit{i}$th observation for the $\mathit{j}$th variable, for $i=1,2,\dots ,{\mathbf{n}}$ and $j=1,2,\dots ,{\mathbf{m}}$.
On entry: if
${\mathbf{fold}}=1$,
x must hold
${X}_{o}$, the values of
$X$ for the original dataset, otherwise,
x must not be changed since the last call to
g05pvf.
On exit: values of $X$ for the training and validation datasets, with ${X}_{t}$ held in observations $1$ to ${\mathbf{nt}}$ and ${X}_{v}$ in observations ${\mathbf{nt}}+1$ to ${\mathbf{n}}$.
 7: $\mathbf{ldx}$ – Integer Input

On entry: the first dimension of the array
x as declared in the (sub)program from which
g05pvf is called.
Constraints:
 if ${\mathbf{sordx}}=2$, ${\mathbf{ldx}}\ge {\mathbf{m}}$;
 otherwise ${\mathbf{ldx}}\ge {\mathbf{n}}$.
 8: $\mathbf{usey}$ – Integer Input

On entry: if ${\mathbf{usey}}=1$, the original dataset includes ${y}_{o}$ and ${y}_{o}$ will be processed alongside ${X}_{o}$.
Constraint:
${\mathbf{usey}}=0$ or $1$.
 9: $\mathbf{y}\left(*\right)$ – Real (Kind=nag_wp) array Input/Output

Note: the dimension of the array
y
must be at least
${\mathbf{n}}$ if
${\mathbf{usey}}=1$.
If
${\mathbf{usey}}=0$,
y is not referenced on entry and will not be modified on exit.
On entry: if
${\mathbf{fold}}=1$,
y must hold
${y}_{o}$, the values of
$y$ for the original dataset, otherwise
y must not be changed since the last call to
g05pvf.
On exit: values of $y$ for the training and validation datasets, with ${y}_{t}$ held in elements $1$ to ${\mathbf{nt}}$ and ${y}_{v}$ in elements ${\mathbf{nt}}+1$ to ${\mathbf{n}}$.
 10: $\mathbf{usew}$ – Integer Input

On entry: if ${\mathbf{usew}}=1$, the original dataset includes ${w}_{o}$ and ${w}_{o}$ will be processed alongside ${X}_{o}$.
Constraint:
${\mathbf{usew}}=0$ or $1$.
 11: $\mathbf{w}\left(*\right)$ – Real (Kind=nag_wp) array Input/Output

Note: the dimension of the array
w
must be at least
${\mathbf{n}}$ if
${\mathbf{usew}}=1$.
If
${\mathbf{usew}}=0$,
w is not referenced on entry and will not be modified on exit.
On entry: if
${\mathbf{fold}}=1$,
w must hold
${w}_{o}$, the values of
$w$ for the original dataset, otherwise
w must not be changed since the last call to
g05pvf.
On exit: values of $w$ for the training and validation datasets, with ${w}_{t}$ held in elements $1$ to ${\mathbf{nt}}$ and ${w}_{v}$ in elements ${\mathbf{nt}}+1$ to ${\mathbf{n}}$.
 12: $\mathbf{nt}$ – Integer Output

On exit: ${n}_{t}$, the number of observations in the training dataset.
 13: $\mathbf{state}\left(*\right)$ – Integer array Communication Array

Note: the actual argument supplied
must be the array
state supplied to the initialization routines
g05kff or
g05kgf.
On entry: contains information on the selected base generator and its current state.
On exit: contains updated information on the state of the generator.
 14: $\mathbf{ifail}$ – Integer Input/Output

On entry:
ifail must be set to
$0$, $-1$ or $1$. If you are unfamiliar with this argument you should refer to
Section 3.4 in How to Use the NAG Library and its Documentation for details.
For environments where it might be inappropriate to halt program execution when an error is detected, the value
$-1$ or $1$ is recommended. If the output of error messages is undesirable, then the value
$1$ is recommended. Otherwise, because for this routine the values of the output arguments may be useful even if
${\mathbf{ifail}}\ne {\mathbf{0}}$ on exit, the recommended value is
$-1$.
When the value $-1$ or $1$ is used it is essential to test the value of ifail on exit.
On exit:
${\mathbf{ifail}}={\mathbf{0}}$ unless the routine detects an error or a warning has been flagged (see
Section 6).
6
Error Indicators and Warnings
If on entry
${\mathbf{ifail}}=0$ or
$-1$, explanatory error messages are output on the current error message unit (as defined by
x04aaf).
Note: g05pvf may return useful information for one or more of the following detected errors or warnings.
Errors or warnings detected by the routine:
 ${\mathbf{ifail}}=11$

On entry, ${\mathbf{k}}=\u2329\mathit{\text{value}}\u232a$ and ${\mathbf{n}}=\u2329\mathit{\text{value}}\u232a$.
Constraint: $2\le {\mathbf{k}}\le {\mathbf{n}}$.
 ${\mathbf{ifail}}=21$

On entry, ${\mathbf{fold}}=\u2329\mathit{\text{value}}\u232a$ and ${\mathbf{k}}=\u2329\mathit{\text{value}}\u232a$.
Constraint: $1\le {\mathbf{fold}}\le {\mathbf{k}}$.
 ${\mathbf{ifail}}=31$

On entry, ${\mathbf{n}}=\u2329\mathit{\text{value}}\u232a$.
Constraint: ${\mathbf{n}}\ge 1$.
 ${\mathbf{ifail}}=41$

On entry, ${\mathbf{m}}=\u2329\mathit{\text{value}}\u232a$.
Constraint: ${\mathbf{m}}\ge 1$.
 ${\mathbf{ifail}}=51$

On entry, ${\mathbf{sordx}}=\u2329\mathit{\text{value}}\u232a$.
Constraint: ${\mathbf{sordx}}=1$ or $2$.
 ${\mathbf{ifail}}=61$

More than $50\%$ of the data did not move when the data was shuffled. $\u2329\mathit{\text{value}}\u232a$ of the $\u2329\mathit{\text{value}}\u232a$ observations stayed put.
 ${\mathbf{ifail}}=71$

On entry, ${\mathbf{ldx}}=\u2329\mathit{\text{value}}\u232a$ and ${\mathbf{n}}=\u2329\mathit{\text{value}}\u232a$.
Constraint: if ${\mathbf{sordx}}=1$, ${\mathbf{ldx}}\ge {\mathbf{n}}$.
 ${\mathbf{ifail}}=72$

On entry, ${\mathbf{ldx}}=\u2329\mathit{\text{value}}\u232a$ and ${\mathbf{m}}=\u2329\mathit{\text{value}}\u232a$.
Constraint: if ${\mathbf{sordx}}=2$, ${\mathbf{ldx}}\ge {\mathbf{m}}$.
 ${\mathbf{ifail}}=81$

Constraint: ${\mathbf{usey}}=0$ or $1$.
 ${\mathbf{ifail}}=101$

Constraint: ${\mathbf{usew}}=0$ or $1$.
 ${\mathbf{ifail}}=131$

On entry,
state vector has been corrupted or not initialized.
 ${\mathbf{ifail}}=-99$
An unexpected error has been triggered by this routine. Please
contact
NAG.
See
Section 3.9 in How to Use the NAG Library and its Documentation for further information.
 ${\mathbf{ifail}}=-399$
Your licence key may have expired or may not have been installed correctly.
See
Section 3.8 in How to Use the NAG Library and its Documentation for further information.
 ${\mathbf{ifail}}=-999$
Dynamic memory allocation failed.
See
Section 3.7 in How to Use the NAG Library and its Documentation for further information.
7
Accuracy
Not applicable.
8
Parallelism and Performance
g05pvf is threaded by NAG for parallel execution in multithreaded implementations of the NAG Library.
g05pvf makes calls to BLAS and/or LAPACK routines, which may be threaded within the vendor library used by this implementation. Consult the documentation for the vendor library for further information.
Please consult the
X06 Chapter Introduction for information on how to control and interrogate the OpenMP environment used within this routine. Please also consult the
Users' Note for your implementation for any additional implementation-specific information.
9
Further Comments
g05pvf will be computationally more efficient if each observation in
x is contiguous, that is
${\mathbf{sordx}}=2$.
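The efficiency remark follows from Fortran's column-major storage: with ${\mathbf{sordx}}=2$ the element ${\mathbf{x}}\left(j,i\right)$ sits at offset $\left(i-1\right)×{\mathbf{ldx}}+\left(j-1\right)$, so each observation occupies one consecutive run of memory. A small illustrative sketch (the helper offset is hypothetical, positions are 0-based):

```python
def offset(j, i, ldx):
    """0-based position of the Fortran array element x(j, i) in
    column-major storage with leading dimension ldx (1-based j, i)."""
    return (i - 1) * ldx + (j - 1)

# With sordx = 2 and ldx = m, observation i occupies the consecutive
# positions (i-1)*m .. i*m - 1, i.e. one contiguous run per observation.
m, ldx = 3, 3
obs2 = [offset(j, 2, ldx) for j in range(1, m + 1)]  # observation 2
```

With ${\mathbf{sordx}}=1$ the same observation is instead scattered at stride ldx, one element per variable.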
Because of the way
g05pvf stores the data you should usually generate the
$K$ training and validation datasets in order, i.e., set
${\mathbf{fold}}=1$ on the first call and increment it by one at each subsequent call. However, there are times when a different calling sequence would be beneficial, for example, when performing different cross-validation analyses on different threads. This is possible, as long as the following is borne in mind:
 g05pvf must be called with ${\mathbf{fold}}=1$ first.
 Other than the first set, the training and validation datasets can be obtained in any order, but for a given x each can only be obtained once.
For example, if you have three threads, you would call
g05pvf once with
${\mathbf{fold}}=1$. You would then copy the
x returned onto each thread and generate the remaining
${\mathbf{k}}-1$ sets of data by splitting them between the threads. For example, the first thread runs with
${\mathbf{fold}}=2,\dots ,{L}_{1}$, the second with
${\mathbf{fold}}={L}_{1}+1,\dots ,{L}_{2}$ and the third with
${\mathbf{fold}}={L}_{2}+1,\dots ,{\mathbf{k}}$.
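The split of the remaining folds between threads can be sketched as follows. This is an illustrative Python sketch only; partition_folds is a hypothetical helper, not part of the NAG interface:

```python
def partition_folds(k, n_threads):
    """Split the remaining folds 2..k into n_threads contiguous,
    near-equal chunks, one chunk per thread. Illustrative only."""
    folds = list(range(2, k + 1))
    base, extra = divmod(len(folds), n_threads)
    chunks, start = [], 0
    for t in range(n_threads):
        # The first `extra` threads take one extra fold each.
        size = base + (1 if t < extra else 0)
        chunks.append(folds[start:start + size])
        start += size
    return chunks
```

Each thread then works through its own chunk of fold values, using its private copy of the x returned by the ${\mathbf{fold}}=1$ call.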
10
Example
This example uses g05pvf to facilitate $K$-fold cross-validation.
A set of simulated data is split into
$5$ training and validation datasets.
g02gbf is used to fit a logistic regression model to each training dataset and then
g02gpf is used to predict the response for the observations in the validation dataset.
The counts of true and false positives and negatives, along with the sensitivity and specificity, are then reported.
10.1
Program Text
Program Text (g05pvfe.f90)
10.2
Program Data
Program Data (g05pvfe.d)
10.3
Program Results
Program Results (g05pvfe.r)