# NAG Toolbox: nag_nonpar_test_ks_2sample (g08cd)

## Purpose

nag_nonpar_test_ks_2sample (g08cd) performs the two sample Kolmogorov–Smirnov distribution test.

## Syntax

```
[d, z, p, sx, sy, ifail] = g08cd(x, y, ntype, 'n1', n1, 'n2', n2)
[d, z, p, sx, sy, ifail] = nag_nonpar_test_ks_2sample(x, y, ntype, 'n1', n1, 'n2', n2)
```

## Description

The data consists of two independent samples, one of size ${n}_{1}$, denoted by ${x}_{1},{x}_{2},\dots ,{x}_{{n}_{1}}$, and the other of size ${n}_{2}$ denoted by ${y}_{1},{y}_{2},\dots ,{y}_{{n}_{2}}$. Let $F\left(x\right)$ and $G\left(x\right)$ represent their respective, unknown, distribution functions. Also let ${S}_{1}\left(x\right)$ and ${S}_{2}\left(x\right)$ denote the values of the sample cumulative distribution functions at the point $x$ for the two samples respectively.
The Kolmogorov–Smirnov test provides a test of the null hypothesis ${H}_{0}$: $F\left(x\right)=G\left(x\right)$ against one of the following alternative hypotheses:
(i) ${H}_{1}$: $F\left(x\right)\ne G\left(x\right)$.
(ii) ${H}_{2}$: $F\left(x\right)>G\left(x\right)$. This alternative hypothesis is sometimes stated as, ‘The $x$'s tend to be smaller than the $y$'s’, i.e., it would be demonstrated in practical terms if the values of ${S}_{1}\left(x\right)$ tended to exceed the corresponding values of ${S}_{2}\left(x\right)$.
(iii) ${H}_{3}$: $F\left(x\right)<G\left(x\right)$. This alternative hypothesis is sometimes stated as, ‘The $x$'s tend to be larger than the $y$'s’, i.e., it would be demonstrated in practical terms if the values of ${S}_{2}\left(x\right)$ tended to exceed the corresponding values of ${S}_{1}\left(x\right)$.
One of the following test statistics is computed depending on the particular alternative hypothesis specified (see the description of the argument ntype in Parameters).
For the alternative hypothesis ${H}_{1}$.
• ${D}_{{n}_{1},{n}_{2}}$ – the largest absolute deviation between the two sample cumulative distribution functions.
For the alternative hypothesis ${H}_{2}$.
• ${D}_{{n}_{1},{n}_{2}}^{+}$ – the largest positive deviation between the sample cumulative distribution function of the first sample, ${S}_{1}\left(x\right)$, and the sample cumulative distribution function of the second sample, ${S}_{2}\left(x\right)$. Formally ${D}_{{n}_{1},{n}_{2}}^{+}=\mathrm{max}\phantom{\rule{0.125em}{0ex}}\left\{{S}_{1}\left(x\right)-{S}_{2}\left(x\right),0\right\}$.
For the alternative hypothesis ${H}_{3}$.
• ${D}_{{n}_{1},{n}_{2}}^{-}$ – the largest positive deviation between the sample cumulative distribution function of the second sample, ${S}_{2}\left(x\right)$, and the sample cumulative distribution function of the first sample, ${S}_{1}\left(x\right)$. Formally ${D}_{{n}_{1},{n}_{2}}^{-}=\mathrm{max}\phantom{\rule{0.125em}{0ex}}\left\{{S}_{2}\left(x\right)-{S}_{1}\left(x\right),0\right\}$.
nag_nonpar_test_ks_2sample (g08cd) also returns the standardized statistic $Z=\sqrt{\frac{{n}_{1}+{n}_{2}}{{n}_{1}{n}_{2}}}×D$, where $D$ may be ${D}_{{n}_{1},{n}_{2}}$, ${D}_{{n}_{1},{n}_{2}}^{+}$ or ${D}_{{n}_{1},{n}_{2}}^{-}$ depending on the choice of the alternative hypothesis. The distribution of this statistic converges asymptotically to a distribution given by Smirnov as ${n}_{1}$ and ${n}_{2}$ increase; see Feller (1948), Kendall and Stuart (1973), Kim and Jenrich (1973), Smirnov (1933) or Smirnov (1948).
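As an illustration of the statistics above, $D$, ${D}^{+}$, ${D}^{-}$ and $Z$ can be computed directly from the two sample cumulative distribution functions. The following is a minimal MATLAB sketch of those definitions, not the method used internally by g08cd; the function name ks2_sketch is hypothetical.
```
% Minimal sketch (not the internal method of g08cd): compute D, D+, D-
% and the standardized statistic Z from two samples x and y.
function [d, dplus, dminus, z] = ks2_sketch(x, y)
  n1 = numel(x);
  n2 = numel(y);
  t  = sort([x(:); y(:)]);                % the sample CDFs only jump at these points
  S1 = arrayfun(@(u) sum(x <= u), t)/n1;  % sample CDF of the first sample
  S2 = arrayfun(@(u) sum(y <= u), t)/n2;  % sample CDF of the second sample
  dplus  = max([S1 - S2; 0]);             % D+: largest deviation of S1 above S2
  dminus = max([S2 - S1; 0]);             % D-: largest deviation of S2 above S1
  d      = max(dplus, dminus);            % D: largest absolute deviation
  z      = sqrt((n1 + n2)/(n1*n2))*d;     % standardized statistic for the two-sided case
end
```
Evaluating both sample cumulative distribution functions at the pooled, sorted observations is sufficient because the step functions change value only at sample points.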
The probability, under the null hypothesis, of obtaining a value of the test statistic as extreme as that observed is computed. If $\mathrm{max}\phantom{\rule{0.125em}{0ex}}\left({n}_{1},{n}_{2}\right)\le 2500$ and ${n}_{1}{n}_{2}\le 10000$ then the exact method of Kim and Jenrich (1973) is used. Otherwise $p$ is computed using the approximations suggested by Kim and Jenrich (1973). Note that the method used is only exact for continuous theoretical distributions. This method computes the two-sided probability. The one-sided probabilities are estimated by halving the two-sided probability. This is a good estimate for small $p$, that is $p\le 0.10$, but it becomes very poor for larger $p$.
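For large samples the two-sided tail probability can be approximated by Smirnov's limiting distribution. The sketch below truncates that infinite series and is purely illustrative; the exact Kim and Jenrich method used by g08cd for small samples is considerably more involved, and the function name ks2_approx_p is hypothetical.
```
% Illustrative large-sample approximation (Smirnov's limiting form) to the
% two-sided tail probability; not the exact method used by g08cd.
function p = ks2_approx_p(d, n1, n2)
  lambda = sqrt(n1*n2/(n1 + n2))*d;                 % scaled statistic
  j = (1:100)';                                     % truncate the infinite series
  p = 2*sum((-1).^(j - 1).*exp(-2*(j.^2)*lambda^2));
  p = min(max(p, 0), 1);                            % guard against truncation error
  % A one-sided probability may be estimated as p/2 (reasonable for p <= 0.10).
end
```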

## References

Conover W J (1980) Practical Nonparametric Statistics Wiley
Feller W (1948) On the Kolmogorov–Smirnov limit theorems for empirical distributions Ann. Math. Statist. 19 179–181
Kendall M G and Stuart A (1973) The Advanced Theory of Statistics (Volume 2) (3rd Edition) Griffin
Kim P J and Jenrich R I (1973) Tables of exact sampling distribution of the two sample Kolmogorov–Smirnov criterion ${D}_{mn}$ ($m\le n$) Selected Tables in Mathematical Statistics 1 80–129 American Mathematical Society
Siegel S (1956) Non-parametric Statistics for the Behavioral Sciences McGraw–Hill
Smirnov N (1933) Estimate of deviation between empirical distribution functions in two independent samples Bull. Moscow Univ. 2(2) 3–16
Smirnov N (1948) Table for estimating the goodness of fit of empirical distributions Ann. Math. Statist. 19 279–281

## Parameters

### Compulsory Input Parameters

1:     $\mathrm{x}\left({\mathbf{n1}}\right)$ – double array
The observations from the first sample, ${x}_{1},{x}_{2},\dots ,{x}_{{n}_{1}}$.
2:     $\mathrm{y}\left({\mathbf{n2}}\right)$ – double array
The observations from the second sample, ${y}_{1},{y}_{2},\dots ,{y}_{{n}_{2}}$.
3:     $\mathrm{ntype}$ – int64/int32/nag_int scalar
The statistic to be computed, i.e., the choice of alternative hypothesis.
${\mathbf{ntype}}=1$
Computes ${D}_{{n}_{1},{n}_{2}}$, to test against ${H}_{1}$.
${\mathbf{ntype}}=2$
Computes ${D}_{{n}_{1},{n}_{2}}^{+}$, to test against ${H}_{2}$.
${\mathbf{ntype}}=3$
Computes ${D}_{{n}_{1},{n}_{2}}^{-}$, to test against ${H}_{3}$.
Constraint: ${\mathbf{ntype}}=1$, $2$ or $3$.

### Optional Input Parameters

1:     $\mathrm{n1}$ – int64/int32/nag_int scalar
Default: the dimension of the array x.
The number of observations in the first sample, ${n}_{1}$.
Constraint: ${\mathbf{n1}}\ge 1$.
2:     $\mathrm{n2}$ – int64/int32/nag_int scalar
Default: the dimension of the array y.
The number of observations in the second sample, ${n}_{2}$.
Constraint: ${\mathbf{n2}}\ge 1$.
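Since n1 and n2 default to the lengths of x and y, they rarely need to be supplied explicitly. A call passing them anyway (with hypothetical data, testing against ${H}_{2}$) might look like:
```
% Sketch: supplying the optional n1/n2 arguments explicitly; they default
% to the lengths of x and y, so this is rarely necessary.
x = rand(30, 1);                  % hypothetical first sample
y = rand(20, 1) + 0.1;            % hypothetical second sample
[d, z, p, sx, sy, ifail] = g08cd(x, y, int64(2), ...
                                 'n1', int64(numel(x)), 'n2', int64(numel(y)));
```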

### Output Parameters

1:     $\mathrm{d}$ – double scalar
The Kolmogorov–Smirnov test statistic (${D}_{{n}_{1},{n}_{2}}$, ${D}_{{n}_{1},{n}_{2}}^{+}$ or ${D}_{{n}_{1},{n}_{2}}^{-}$ according to the value of ntype).
2:     $\mathrm{z}$ – double scalar
A standardized value, $Z$, of the test statistic, $D$, without any correction for continuity.
3:     $\mathrm{p}$ – double scalar
The tail probability associated with the observed value of $D$, where $D$ may be ${D}_{{n}_{1},{n}_{2}},{D}_{{n}_{1},{n}_{2}}^{+}$ or ${D}_{{n}_{1},{n}_{2}}^{-}$ depending on the value of ntype (see Description).
4:     $\mathrm{sx}\left({\mathbf{n1}}\right)$ – double array
The observations from the first sample sorted in ascending order.
5:     $\mathrm{sy}\left({\mathbf{n2}}\right)$ – double array
The observations from the second sample sorted in ascending order.
6:     $\mathrm{ifail}$ – int64/int32/nag_int scalar
${\mathbf{ifail}}={\mathbf{0}}$ unless the function detects an error (see Error Indicators and Warnings).

## Error Indicators and Warnings

Errors or warnings detected by the function:
${\mathbf{ifail}}=1$
 On entry, ${\mathbf{n1}}<1$, or ${\mathbf{n2}}<1$.
${\mathbf{ifail}}=2$
 On entry, ${\mathbf{ntype}}\ne 1$, $2$ or $3$.
${\mathbf{ifail}}=3$
The iterative procedure used in the approximation of the probability for large ${n}_{1}$ and ${n}_{2}$ did not converge. For the two-sided test, $p=1$ is returned. For the one-sided test, $p=0.5$ is returned.
${\mathbf{ifail}}=-99$
An unexpected error has been triggered by this function. Please contact NAG.
${\mathbf{ifail}}=-399$
Your licence key may have expired or may not have been installed correctly.
${\mathbf{ifail}}=-999$
Dynamic memory allocation failed.
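In practice it is worth testing ifail after the call rather than assuming success; a minimal pattern might be:
```
% Sketch: check ifail before using the returned results.
[d, z, p, sx, sy, ifail] = g08cd(x, y, int64(1));
if ifail ~= 0
  warning('g08cd:ifail', 'g08cd returned ifail = %d', ifail);
end
```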

## Accuracy

The large sample distributions used as approximations to the exact distribution should have a relative error of less than 5% for most cases.

## Further Comments

The time taken by nag_nonpar_test_ks_2sample (g08cd) increases with ${n}_{1}$ and ${n}_{2}$, until ${n}_{1}{n}_{2}>10000$ or $\mathrm{max}\phantom{\rule{0.125em}{0ex}}\left({n}_{1},{n}_{2}\right)>2500$. At this point one of the approximations is used and the time decreases significantly. The time then increases again modestly with ${n}_{1}$ and ${n}_{2}$.

## Example

This example computes the two-sided Kolmogorov–Smirnov test statistic for two independent samples of size $100$ and $50$ respectively. The first sample is from a uniform distribution $U\left(0,2\right)$. The second sample is from a uniform distribution $U\left(0.25,2.25\right)$. The test statistic, ${D}_{{n}_{1},{n}_{2}}$, the standardized test statistic, $Z$, and the tail probability, $p$, are computed and printed.
```
function g08cd_example

fprintf('g08cd example results\n\n');

x = [ 1.160 1.785 0.322 1.437 1.695 1.770 1.209 0.479 1.122 0.974 ...
0.290 1.155 0.218 1.595 1.053 1.058 1.282 1.278 1.066 0.725 ...
0.113 1.516 1.329 1.907 0.101 0.387 1.392 0.613 0.692 1.397 ...
1.627 0.417 1.079 0.607 0.899 0.493 0.381 1.660 0.233 0.718 ...
1.376 1.395 1.557 1.610 1.632 0.851 1.824 0.921 0.139 0.618 ...
0.050 0.956 0.669 1.109 1.882 1.462 1.465 0.201 1.036 1.127 ...
0.907 0.876 1.199 1.667 1.141 0.820 0.488 0.732 0.725 0.753 ...
0.760 1.833 0.074 1.101 0.620 1.858 0.681 0.705 0.876 1.096 ...
1.870 1.597 0.990 0.430 0.410 0.399 1.693 0.492 1.318 0.883 ...
1.291 1.051 1.934 1.314 1.496 0.391 1.079 0.881 0.983 1.306];

y = [ 1.695 1.452 0.997 1.771 1.114 1.624 2.005 0.782 1.870 0.954 ...
1.606 2.059 0.774 0.741 1.040 0.521 2.163 0.818 1.781 1.420 ...
0.558 1.437 2.004 1.325 0.398 0.582 2.047 0.332 1.186 0.890 ...
1.825 1.324 1.334 0.261 0.299 1.733 1.172 1.000 1.663 1.093 ...
1.045 2.022 1.174 0.670 1.143 1.189 0.494 1.275 1.122 1.823];

ntype = int64(1);
[d, z, p, sx, sy, ifail] = g08cd(...
x, y, ntype);

fprintf('Test statistic D = %8.4f\n', d);
fprintf('Z statistic      = %8.4f\n', z);
fprintf('Tail probability = %8.4f\n', p);

```
```
g08cd example results

Test statistic D =   0.1800
Z statistic      =   0.0312
Tail probability =   0.2222
```