
NAG Library Routine Document

G05PVF

Note:  before using this routine, please read the Users' Note for your implementation to check the interpretation of bold italicised terms and other implementation-dependent details.

 Contents

    1  Purpose
    2  Specification
    3  Description
    4  References
    5  Parameters
    6  Error Indicators and Warnings
    7  Accuracy
    8  Parallelism and Performance
    9  Further Comments
    10  Example

1  Purpose

G05PVF generates training and validation datasets suitable for use in cross-validation or jack-knifing.

2  Specification

SUBROUTINE G05PVF ( K, FOLD, N, M, SORDX, X, LDX, USEY, Y, USEW, W, NT, STATE, IFAIL)
INTEGER  K, FOLD, N, M, SORDX, LDX, USEY, USEW, NT, STATE(*), IFAIL
REAL (KIND=nag_wp)  X(LDX,*), Y(*), W(*)

3  Description

Let Xo denote a matrix of n observations on m variables, and let yo and wo each denote a vector of length n. For example, Xo might represent a matrix of independent variables, yo the dependent variable and wo the associated weights in a weighted regression.
G05PVF generates a series of training datasets, denoted by the matrix, vector, vector triplet (Xt, yt, wt) of nt observations, and validation datasets, denoted (Xv, yv, wv), with nv observations. These training and validation datasets are generated as follows.
Each of the original n observations is randomly assigned to one of K equally sized groups or folds. For the kth sample the validation dataset consists of those observations in group k and the training dataset consists of all those observations not in group k. Therefore at most K samples can be generated.
If n is not divisible by K then the observations are assigned to groups as evenly as possible; therefore any group will be at most one observation larger or smaller than any other group.
When using K=n the resulting datasets are suitable for leave-one-out cross-validation, or the training dataset on its own for jack-knifing. When using K<n the resulting datasets are suitable for K-fold cross-validation. Datasets suitable for reversed cross-validation can be obtained by switching the training and validation datasets, i.e., use the kth group as the training dataset and the rest of the data as the validation dataset.
One of the initialization routines G05KFF (for a repeatable sequence if computed sequentially) or G05KGF (for a non-repeatable sequence) must be called prior to the first call to G05PVF.
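As an illustration of the grouping described above, the following self-contained Fortran sketch assigns n observations to K folds that differ in size by at most one and then shuffles the assignment. It is only a sketch of the splitting scheme, using the compiler's intrinsic random number generator and a Fisher-Yates shuffle; it is not the algorithm actually used inside G05PVF, which draws its random numbers from the base generator held in STATE.

! Illustrative sketch only: the K-fold grouping described above,
! implemented with the Fortran intrinsic RNG rather than the NAG
! generator held in STATE. Observation i ends up in fold grp(i);
! for sample k, the observations with grp(i) == k form the
! validation dataset and the remainder form the training dataset.
Program fold_sketch
  Implicit None
  Integer, Parameter :: n = 10, k = 3
  Integer            :: grp(n), i, j, tmp
  Real               :: u

  ! Cycle through the fold numbers so the groups differ in size by
  ! at most one observation when n is not divisible by k
  Do i = 1, n
    grp(i) = mod(i-1,k) + 1
  End Do

  ! Randomly shuffle the fold labels (Fisher-Yates)
  Do i = n, 2, -1
    Call random_number(u)
    j = int(u*real(i)) + 1
    tmp = grp(i)
    grp(i) = grp(j)
    grp(j) = tmp
  End Do

  Write (*,*) 'Fold assigned to each observation: ', grp
End Program fold_sketch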

4  References

None.

5  Parameters

1:     K – INTEGER   Input
On entry: K, the number of folds.
Constraint: 2 ≤ K ≤ N.
2:     FOLD – INTEGER   Input
On entry: the number of the fold to return as the validation dataset.
On the first call to G05PVF, FOLD should be set to 1 and then incremented by one at each subsequent call until all K sets of training and validation datasets have been produced. See Section 9 for more details on how a different calling sequence can be used; a sketch of the usual calling sequence is given at the end of this section.
Constraint: 1 ≤ FOLD ≤ K.
3:     N – INTEGER   Input
On entry: n, the number of observations.
Constraint: N ≥ 1.
4:     M – INTEGER   Input
On entry: m, the number of variables.
Constraint: M ≥ 1.
5:     SORDX – INTEGER   Input
On entry: determines how variables are stored in X.
Constraint: SORDX=1 or 2.
6:     X(LDX,*) – REAL (KIND=nag_wp) array   Input/Output
Note: the second dimension of the array X must be at least M if SORDX=1 and at least N if SORDX=2.
The way the data is stored in X is defined by SORDX.
If SORDX=1, X(i,j) contains the ith observation for the jth variable, for i=1,2,…,N and j=1,2,…,M.
If SORDX=2, X(j,i) contains the ith observation for the jth variable, for i=1,2,…,N and j=1,2,…,M.
On entry: if FOLD=1, X must hold Xo, the values of X for the original dataset, otherwise, X must not be changed since the last call to G05PVF.
On exit: values of X for the training and validation datasets, with Xt held in observations 1 to NT and Xv in observations NT+1 to N.
7:     LDX – INTEGER   Input
On entry: the first dimension of the array X as declared in the (sub)program from which G05PVF is called.
Constraints:
  • if SORDX=2, LDX ≥ M;
  • otherwise LDX ≥ N.
8:     USEY – INTEGER   Input
On entry: if USEY=1, the original dataset includes yo and yo will be processed alongside Xo.
Constraint: USEY=0 or 1.
9:     Y(*) – REAL (KIND=nag_wp) array   Input/Output
Note: the dimension of the array Y must be at least N if USEY=1.
If USEY=0, Y is not referenced on entry and will not be modified on exit.
On entry: if FOLD=1, Y must hold yo, the values of y for the original dataset, otherwise Y must not be changed since the last call to G05PVF.
On exit: values of y for the training and validation datasets, with yt held in elements 1 to NT and yv in elements NT+1 to N.
10:   USEW – INTEGER   Input
On entry: if USEW=1, the original dataset includes wo and wo will be processed alongside Xo.
Constraint: USEW=0 or 1.
11:   W(*) – REAL (KIND=nag_wp) array   Input/Output
Note: the dimension of the array W must be at least N if USEW=1.
If USEW=0, W is not referenced on entry and will not be modified on exit.
On entry: if FOLD=1, W must hold wo, the values of w for the original dataset, otherwise W must not be changed since the last call to G05PVF.
On exit: values of w for the training and validation datasets, with wt held in elements 1 to NT and wv in elements NT+1 to N.
12:   NT – INTEGER   Output
On exit: nt, the number of observations in the training dataset.
13:   STATE(*) – INTEGER array   Communication Array
Note: the actual argument supplied must be the array STATE supplied to the initialization routines G05KFF or G05KGF.
On entry: contains information on the selected base generator and its current state.
On exit: contains updated information on the state of the generator.
14:   IFAIL – INTEGER   Input/Output
On entry: IFAIL must be set to 0, -1 or 1. If you are unfamiliar with this parameter you should refer to Section 3.3 in the Essential Introduction for details.
For environments where it might be inappropriate to halt program execution when an error is detected, the value -1 or 1 is recommended. If the output of error messages is undesirable, then the value 1 is recommended. Otherwise, because for this routine the values of the output parameters may be useful even if IFAIL ≠ 0 on exit, the recommended value is -1. When the value -1 or 1 is used it is essential to test the value of IFAIL on exit.
On exit: IFAIL=0 unless the routine detects an error or a warning has been flagged (see Section 6).
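The following skeleton illustrates a typical sequential use of the parameters above: the generator is initialized, the original data is supplied on the call with FOLD=1, and X and Y are then passed back unchanged for the remaining folds, with NT marking the boundary between the training and validation observations. It is a sketch only; the values chosen for GENID, SUBID, the seed and the problem sizes are arbitrary assumptions, and the STATE-length query follows the usual pattern of the G05 chapter example programs, so the initialization details should be checked against the G05KFF routine document and the distributed example program g05pvfe.f90.

! Sketch of a sequential cross-validation loop (not the distributed
! example program g05pvfe.f90); generator choice, seed and problem
! sizes are arbitrary placeholders.
Program g05pvf_sketch
  Use nag_library, Only: nag_wp, g05kff, g05pvf
  Implicit None
  Integer, Parameter   :: lseed = 1, genid = 1, subid = 1
  Integer, Parameter   :: n = 40, m = 2, k = 5, sordx = 1
  Integer, Parameter   :: usey = 1, usew = 0, ldx = n
  Integer              :: seed(lseed), fold, nt, lstate, ifail
  Integer, Allocatable :: state(:)
  Real (Kind=nag_wp)   :: x(ldx,m), y(n), w(1)

  ! Query the required length of STATE, then initialize the base
  ! generator to a repeatable sequence (see the G05KFF document)
  seed(1) = 42321
  lstate = 0
  Allocate (state(lstate))
  ifail = 0
  Call g05kff(genid,subid,seed,lseed,state,lstate,ifail)
  Deallocate (state)
  Allocate (state(lstate))
  ifail = 0
  Call g05kff(genid,subid,seed,lseed,state,lstate,ifail)

  ! Placeholder data; in practice read or simulate the original
  ! dataset into X and Y here
  x = 0.0_nag_wp
  y = 0.0_nag_wp

  Do fold = 1, k
    ! On the first call X and Y hold the original data; on
    ! subsequent calls they must be passed back unchanged
    ifail = 0
    Call g05pvf(k,fold,n,m,sordx,x,ldx,usey,y,usew,w,nt, &
                state,ifail)

    ! Training data: observations 1 to nt of X and Y
    ! Validation data: observations nt+1 to n of X and Y
    ! ... fit and assess the model of interest here ...
  End Do
End Program g05pvf_sketch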

6  Error Indicators and Warnings

If on entry IFAIL=0 or -1, explanatory error messages are output on the current error message unit (as defined by X04AAF).
Note: G05PVF may return useful information for one or more of the following detected errors or warnings.
Errors or warnings detected by the routine:
IFAIL=11
On entry, K=value and N=value.
Constraint: 2 ≤ K ≤ N.
IFAIL=21
On entry, FOLD=value and K=value.
Constraint: 1 ≤ FOLD ≤ K.
IFAIL=31
On entry, N=value.
Constraint: N ≥ 1.
IFAIL=41
On entry, M=value.
Constraint: M ≥ 1.
IFAIL=51
On entry, SORDX=value.
Constraint: SORDX=1 or 2.
IFAIL=61
More than 50% of the data did not move when the data was shuffled. value of the value observations stayed put.
IFAIL=71
On entry, LDX=value and N=value.
Constraint: if SORDX=1, LDX ≥ N.
IFAIL=72
On entry, LDX=value and M=value.
Constraint: if SORDX=2, LDX ≥ M.
IFAIL=81
Constraint: USEY=0 or 1.
IFAIL=101
Constraint: USEW=0 or 1.
IFAIL=131
On entry, STATE vector has been corrupted or not initialized.
IFAIL=-99
An unexpected error has been triggered by this routine. Please contact NAG.
See Section 3.8 in the Essential Introduction for further information.
IFAIL=-399
Your licence key may have expired or may not have been installed correctly.
See Section 3.7 in the Essential Introduction for further information.
IFAIL=-999
Dynamic memory allocation failed.
See Section 3.6 in the Essential Introduction for further information.

7  Accuracy

Not applicable.

8  Parallelism and Performance

G05PVF is threaded by NAG for parallel execution in multithreaded implementations of the NAG Library.
G05PVF makes calls to BLAS and/or LAPACK routines, which may be threaded within the vendor library used by this implementation. Consult the documentation for the vendor library for further information.
Please consult the X06 Chapter Introduction for information on how to control and interrogate the OpenMP environment used within this routine. Please also consult the Users' Note for your implementation for any additional implementation-specific information.

9  Further Comments

G05PVF will be computationally more efficient if each observation in X is contiguous, that is, when SORDX=2.
Because of the way G05PVF stores the data you should usually generate the K training and validation datasets in order, i.e., set FOLD=1 on the first call and increment it by one at each subsequent call. However, there are times when a different calling sequence would be beneficial, for example, when performing different cross-validation analyses on different threads. This is possible provided that the call with FOLD=1, which supplies the original dataset, is made first and that each thread then works on its own copy of the data returned by that call.
For example, if you have three threads, you would call G05PVF once with FOLD=1. You would then copy the X returned onto each thread and generate the remaining K-1 sets of data by splitting them between the threads. For example, the first thread runs with FOLD=2,…,L1, the second with FOLD=L1+1,…,L2 and the third with FOLD=L2+1,…,K.
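As a purely illustrative sketch of the bookkeeping involved, the following self-contained fragment divides the remaining folds 2 to K as evenly as possible between a fixed number of threads; the thread count is an arbitrary assumption, and the fragment only prints the fold ranges, leaving the copying of the data and the calls to G05PVF on each thread to the calling program.

! Sketch: split folds 2..K as evenly as possible between NTHREADS
! threads. Each thread would then call G05PVF, on its own copy of
! the data returned by the initial FOLD=1 call, for its range of
! fold numbers. The values of K and NTHREADS are placeholders.
Program split_folds
  Implicit None
  Integer, Parameter :: k = 10, nthreads = 3
  Integer            :: t, nrem, lo, hi

  nrem = k - 1                  ! folds 2..k remain after fold 1
  lo = 2
  Do t = 1, nthreads
    ! Give the first mod(nrem,nthreads) threads one extra fold
    hi = lo + nrem/nthreads - 1
    If (t <= mod(nrem,nthreads)) hi = hi + 1
    Write (*,99999) t, lo, hi
    lo = hi + 1
  End Do
99999 Format (1X,'Thread ',I0,' handles FOLD = ',I0,' to ',I0)
End Program split_folds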

10  Example

This example uses G05PVF to facilitate K-fold cross-validation.
A set of simulated data is split into 5 training and validation datasets. G02GBF is used to fit a logistic regression model to each training dataset and then G02GPF is used to predict the response for the observations in the validation dataset.
The counts of true and false positives and negatives, along with the sensitivity and specificity, are then reported.

10.1  Program Text

Program Text (g05pvfe.f90)

10.2  Program Data

Program Data (g05pvfe.d)

10.3  Program Results

Program Results (g05pvfe.r)



© The Numerical Algorithms Group Ltd, Oxford, UK. 2015