Probability Laws Derived from the Gamma Function

Abstract

Several probability densities (probability laws) of continuous random variables derive from the Euler Gamma function. These laws form the basis of sampling theory, namely hypothesis testing and estimation. In particular, the gamma, beta, and Student distributions, together with the chi-square and normal distributions, all result from applications of Euler's functions.


Toure, L. and Conde, S. (2024) Probability Laws Derived from the Gamma Function. Open Journal of Statistics, 14, 106-118. doi: 10.4236/ojs.2024.141005.

1. Introduction

The application of Euler's functions has contributed to and facilitated important results in statistics, especially in the theory of sampling distributions. In this paper, we first study some properties of the Gamma and Beta functions; in the second part, we study some probability laws (probability distributions) derived from the gamma function; and in the last part, we apply these results to the probability laws of certain random variables drawn from normal populations.

2. Gamma Function

2.1. Definition

The Gamma function Γ is defined in [1] by:

$\Gamma \left(n\right)={\int }_{0}^{\infty }{\text{e}}^{-t}{t}^{n-1}\text{d}t$ (1)

This is an improper integral, which converges if $n>0$ and diverges if $n\le 0$ .

Fundamental Relationship

$\Gamma \left(n+1\right)=n\Gamma \left(n\right)$ .

Indeed, integrating by parts: $\begin{array}{c}\Gamma \left(n+1\right)={\int }_{0}^{\infty }{\text{e}}^{-t}{t}^{n}\text{d}t\\ ={\left[-{\text{e}}^{-t}{t}^{n}\right]}_{0}^{\infty }+n{\int }_{0}^{\infty }{\text{e}}^{-t}{t}^{n-1}\text{d}t\end{array}$ where ${\left[-{\text{e}}^{-t}{t}^{n}\right]}_{0}^{\infty }=0$ , since the bracketed term vanishes at $t=0$ and as $t\to \infty$ (for $n>0$ ).

We have

$\Gamma \left(1\right)={\int }_{0}^{\infty }{\text{e}}^{-t}\text{d}t=1$ .

Therefore, for all n integer:

$\Gamma \left(n+1\right)=n\Gamma \left(n\right)=n\left(n-1\right)\Gamma \left(n-1\right)=\cdots =n!\Gamma \left(1\right)=n!$
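The recurrence and the factorial identity (2) can be checked numerically with Python's standard library (an illustrative sketch, not part of the derivation):

```python
import math

# Check Gamma(n+1) = n * Gamma(n) and Gamma(n+1) = n! for small integers
for n in range(1, 8):
    assert math.isclose(math.gamma(n + 1), n * math.gamma(n))
    assert math.isclose(math.gamma(n + 1), math.factorial(n))
```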

$\Gamma \left(n+1\right)=n!$ (2)

2.2. Asymptotic Formula for Γ(n)

If n is large, the difficulties inherent in calculating $\Gamma \left(n\right)$ are obvious. A useful result in such a case is provided by the relation

$\Gamma \left(n+1\right)=\sqrt{2\pi n}{n}^{n}{\text{e}}^{-n}{\text{e}}^{\theta /\left(12\left(n+1\right)\right)}$ (3)

for $0<\theta <1$ .

For practical applications, the last factor, which is very close to 1 for large n, can be omitted:

$\Gamma \left(n+1\right)\approx \sqrt{2\pi n}{n}^{n}{\text{e}}^{-n}$
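A quick numerical check of this approximation (a sketch using only Python's `math` module; the test values $n=10$ and $n=50$ are arbitrary choices):

```python
import math

def stirling(n):
    # Leading-order Stirling approximation to Gamma(n+1) = n!
    return math.sqrt(2 * math.pi * n) * n**n * math.exp(-n)

# The relative error of the truncated formula shrinks roughly like 1/(12n)
rel_err_10 = abs(stirling(10) - math.factorial(10)) / math.factorial(10)
rel_err_50 = abs(stirling(50) - math.factorial(50)) / math.factorial(50)
```

For $n=10$ the relative error is already below 1%, consistent with omitting the factor ${\text{e}}^{\theta /\left(12\left(n+1\right)\right)}$ .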

2.3. Beta Function

2.3.1. Definition

$B\left(p,q\right)=\frac{\Gamma \left(p\right)\Gamma \left(q\right)}{\Gamma \left(p+q\right)}$ (4)

2.3.2. Integral Expression of the Beta Function

$\Gamma \left(p\right)={\int }_{0}^{\infty }{\text{e}}^{-t}{t}^{p-1}\text{d}t=2{\int }_{0}^{\infty }{\text{e}}^{-{u}^{2}}{u}^{2p-1}\text{d}u$ with $t={u}^{2}$ .

Therefore:

$\begin{array}{c}\Gamma \left(p\right)\Gamma \left(q\right)=4{\int }_{0}^{\infty }{\int }_{0}^{\infty }{\text{e}}^{-{u}^{2}}{u}^{2p-1}\text{d}u\text{\hspace{0.17em}}{\text{e}}^{-{v}^{2}}{v}^{2q-1}\text{d}v\\ =4{\int }_{0}^{\infty }{\int }_{0}^{\infty }{\text{e}}^{-\left({u}^{2}+{v}^{2}\right)}{u}^{2p-1}{v}^{2q-1}\text{d}u\text{d}v\end{array}$ .

In polar coordinates, for $u=\rho \mathrm{cos}\left(\theta \right)$ and $v=\rho \mathrm{sin}\left( \theta \right)$

$\begin{array}{c}\Gamma \left(p\right)\Gamma \left(q\right)=4{\int }_{\rho =0}^{\infty }{\int }_{\theta =0}^{\pi /2}{\text{e}}^{-{\rho }^{2}}{\rho }^{2p-1+2q-1}{\left(\mathrm{cos}\left(\theta \right)\right)}^{2p-1}{\left(\mathrm{sin}\left(\theta \right)\right)}^{2q-1}\rho \text{d}\rho \text{d}\theta \\ =4{\int }_{\rho =0}^{\infty }{\text{e}}^{-{\rho }^{2}}{\rho }^{2\left(p+q\right)-1}\text{d}\rho {\int }_{\theta =0}^{\pi /2}{\left(\mathrm{cos}\left(\theta \right)\right)}^{2p-1}{\left(\mathrm{sin}\left(\theta \right)\right)}^{2q-1}\text{d}\theta \\ =2\Gamma \left(p+q\right){\int }_{\theta =0}^{\pi /2}{\left(\mathrm{cos}\left(\theta \right)\right)}^{2p-1}{\left(\mathrm{sin}\left(\theta \right)\right)}^{2q-1}\text{d}\theta \end{array}$

so

$B\left(p,q\right)=2{\int }_{\theta =0}^{\pi /2}{\left(\mathrm{cos}\left(\theta \right)\right)}^{2p-1}{\left(\mathrm{sin}\left(\theta \right)\right)}^{2q-1}\text{d}\theta$ (5)

In particular

$B\left(1/2,1/2\right)=\frac{{\left[\Gamma \left(1/2\right)\right]}^{2}}{\Gamma \left(1\right)}={\left[\Gamma \left(1/2\right)\right]}^{2}=2{\int }_{0}^{\pi /2}\text{d}\theta =\pi$

$\Gamma \left(1/2\right)=\sqrt{\pi }$ (6)

Returning to a single variable by setting ${\mathrm{cos}}^{2}\theta =t$ , we find:

$B\left(p,q\right)={\int }_{0}^{1}{t}^{p-1}{\left(1-t\right)}^{q-1}\text{d}t$ (7)
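Equations (4) and (7) can be compared numerically; the sketch below integrates Equation (7) with a composite Simpson rule (step count chosen for illustration, and $p,q>1$ so the integrand vanishes at the endpoints) and checks it against the gamma-function ratio of Equation (4):

```python
import math

def beta_gamma(p, q):
    # Equation (4): B(p, q) = Gamma(p) Gamma(q) / Gamma(p + q)
    return math.gamma(p) * math.gamma(q) / math.gamma(p + q)

def beta_integral(p, q, steps=10_000):
    # Equation (7): integral of t^(p-1) (1-t)^(q-1) over [0, 1],
    # evaluated by composite Simpson's rule (steps must be even)
    f = lambda t: t ** (p - 1) * (1 - t) ** (q - 1)
    h = 1.0 / steps
    s = f(0.0) + f(1.0)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3
```

For example, $B(3,4)=\Gamma(3)\Gamma(4)/\Gamma(7)=2\cdot 6/720=1/60$ , and the numerical integral agrees.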

3. Random Variables

In this section we define random variables and some of their characteristics.

A random variable is any function that assigns a numerical value to each possible outcome.

Probability Density Function

We describe in [2] and [3] the behavior of a continuous random variable X by specifying its probability density function which satisfies $f\left(x\right)\ge 0$ $\forall \text{ }x$ and ${\int }_{-\infty }^{\infty }f\left(x\right)\text{d}x=1$ .

Remember that it is only meaningful to talk about the probability that a continuous random variable X lies in an interval. It is always the case that $P\left(X=x\right)=0$ for every possible value x.

The probability that the value of X lies in an interval is the area under the density over that interval: $P\left(X\le b\right)={\int }_{-\infty }^{b}f\left(x\right)\text{d}x$ = area under the density function to the left of $x=b$ ; $P\left(a\le X\le b\right)={\int }_{a}^{b}f\left(x\right)\text{d}x$ = area under the density function between $x=a$ and $x=b$ .

Summarize a probability density of the continuous random variable X by its:

mean: $\mu ={\int }_{-\infty }^{+\infty }xf\left(x\right)\text{d}x$ .

variance: ${\sigma }^{2}={\int }_{-\infty }^{+\infty }{\left(x-\mu \right)}^{2}f\left(x\right)\text{d}x$ .

4. Some Probability Laws Derived from the Gamma Function

Distributions derived from the Gamma law are distributions that arise from the gamma law through transformations or combinations with other distributions. These distributions are used in various fields such as failure time modeling in engineering, survival analysis, econometrics and other applications where positive continuous random variables are involved.

4.1. Gamma Distribution

This distribution plays an important role in statistics.

4.1.1. Definition

In [3] and [4] , a random variable X is said to follow the gamma distribution with parameters $\theta >0$ and $\alpha >0$ if its density is

$f\left(x\right)=\frac{{\theta }^{\alpha }}{\Gamma \left(\alpha \right)}{\text{e}}^{-\theta x}{x}^{\alpha -1}\text{ }\text{\hspace{0.17em}}\text{ }\text{for}\text{ }\text{\hspace{0.17em}}x>0$ (8)

This function represents a density, because by definition of $\Gamma \left(\alpha \right)$ (see Equation (1)), ${\int }_{0}^{\infty }f\left(x\right)\text{d}x=1$ .

By definition, $E\left(X\right)={\int }_{0}^{\infty }xf\left(x\right)\text{d}x$ .

We have,

with the substitution $y=\theta x$ ,

$E\left(X\right)=\frac{{\theta }^{\alpha }}{\Gamma \left(\alpha \right)}{\int }_{0}^{\infty }{\text{e}}^{-\theta x}{x}^{\alpha }\text{d}x=\frac{1}{\theta \Gamma \left(\alpha \right)}{\int }_{0}^{\infty }{\text{e}}^{-y}{y}^{\alpha }\text{d}y=\frac{\Gamma \left(\alpha +1\right)}{\theta \text{ }\Gamma \left(\alpha \right)}=\alpha /\theta$

$V\left(X\right)=E\left({X}^{2}\right)-{E}^{2}\left(X\right)$ .

We have, with the same substitution $y=\theta x$ , $V\left(X\right)=\frac{{\theta }^{\alpha }}{\Gamma \left(\alpha \right)}{\int }_{0}^{\infty }{\text{e}}^{-\theta x}{x}^{\alpha +1}\text{d}x-{\left(\alpha /\theta \right)}^{2}=\frac{1}{{\theta }^{2}\Gamma \left(\alpha \right)}{\int }_{0}^{\infty }{\text{e}}^{-y}{y}^{\alpha +1}\text{d}y-{\left(\alpha /\theta \right)}^{2}$ hence, $\begin{array}{c}V\left(X\right)=\frac{\Gamma \left(\alpha +2\right)}{{\theta }^{2}\Gamma \left(\alpha \right)}-{\left(\alpha /\theta \right)}^{2}=\left(\alpha +1\right)\frac{\Gamma \left(\alpha +1\right)}{{\theta }^{2}\Gamma \left(\alpha \right)}-{\left(\alpha /\theta \right)}^{2}\\ =\frac{1}{{\theta }^{2}}\left(\alpha \left(\alpha +1\right)-{\alpha }^{2}\right)=\alpha /{\theta }^{2}\end{array}$ .
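These moments can be checked by simulation with Python's standard library; note that `random.gammavariate` is parameterized by shape and scale, so the rate $\theta$ enters as scale $1/\theta$ (the seed, sample size, and parameter values below are arbitrary illustrative choices):

```python
import random
import statistics

random.seed(42)
alpha, theta = 3.0, 2.0            # shape alpha, rate theta
# random.gammavariate(shape, scale) with scale = 1/theta
sample = [random.gammavariate(alpha, 1 / theta) for _ in range(200_000)]

emp_mean = statistics.fmean(sample)     # theory: alpha/theta = 1.5
emp_var = statistics.pvariance(sample)  # theory: alpha/theta**2 = 0.75
```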

4.1.2. Application

Let us study the law of the variable $Y=\theta X$ . Let $G\left(y\right)$ be the distribution function of the variable Y.

By definition: $G\left(y\right)=P\left(Y<y\right)=P\left(\theta X<y\right)=P\left(X<y/\theta \right)=F\left(y/\theta \right)$ , where F is the distribution function of the variable X; the probability density of Y is obtained by differentiation. Let $g\left(y\right)$ be the probability density of the variable Y. $g\left(y\right)=\frac{1}{\theta }f\left(\frac{y}{\theta }\right)=\frac{{\theta }^{\alpha -1}}{\Gamma \left(\alpha \right)}{\text{e}}^{-y}{\left(\frac{y}{\theta }\right)}^{\alpha -1}=\frac{1}{\Gamma \left(\alpha \right)}{\text{e}}^{-y}{y}^{\alpha -1},y>0$

4.2. Remarks

If the parameter $\theta$ is equal to 1, we write $\Gamma \left(\alpha ,1\right)$ or simply $\Gamma \left(\alpha \right)$ ; by the computation above, for any $\theta >0$ the random variable $Y=\theta X$ follows $\Gamma \left(\alpha \right)$ .

The density of a $\Gamma \left(\alpha \right)$ law is:

$f\left(x\right)=\frac{1}{\Gamma \left(\alpha \right)}{\text{e}}^{-x}{x}^{\alpha -1}$ (9)

If $\alpha =1$ , the gamma law $\Gamma \left(1,\theta \right)$ is called an exponential law with parameter $\theta$ .

4.3. The Beta Distribution

4.3.1. Type I Beta Distribution

The type I beta distribution is the distribution of a random variable X with $0<x<1$ , depending on two parameters n and p, whose density is:

$f\left(x\right)=\frac{1}{B\left(n,p\right)}{x}^{n-1}{\left(1-x\right)}^{p-1}$ (10)

$n,p>0$ ;

where $B\left(n,p\right)=\frac{\Gamma \left(n\right)\Gamma \left(p\right)}{\Gamma \left(n+p\right)}$ then

$f\left(x\right)=\frac{\Gamma \left(n+p\right)}{\Gamma \left(n\right)\Gamma \left(p\right)}{x}^{n-1}{\left(1-x\right)}^{p-1}$ (11)

As for the previous distribution we find: $E\left(X\right)=\frac{n}{n+p}$ and $V\left(X\right)=\frac{np}{\left(n+p+1\right){\left(n+p\right)}^{2}}$
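These two moments can be checked by simulation (`random.betavariate` samples a type I beta; the parameter values and seed below are illustrative):

```python
import random
import statistics

random.seed(0)
n, p = 2.0, 5.0
sample = [random.betavariate(n, p) for _ in range(200_000)]

emp_mean = statistics.fmean(sample)     # theory: n/(n+p) = 2/7
emp_var = statistics.pvariance(sample)  # theory: n*p/((n+p+1)*(n+p)**2) = 10/392
```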

4.3.2. The Type II Beta Distribution

Let X be a random variable following a type I beta distribution $\beta I\left(n,p\right)$ ; then by definition $Y=X/\left(1-X\right)$ follows a type II beta distribution, whose density is easily obtained by a change of variable.

$f\left(y\right)=\frac{1}{B\left(n,p\right)}\frac{{y}^{n-1}}{{\left(1+y\right)}^{n+p}}$ (12)

From the properties of the function Γ we easily deduce the moments of Y

$E\left(Y\right)=\frac{1}{B\left(n,p\right)}{\int }_{0}^{+\infty }\frac{{y}^{n}}{{\left(1+y\right)}^{n+p}}\text{d}y=\frac{B\left(n+1,p-1\right)}{B\left(n,p\right)}=\frac{n}{p-1},\text{ }\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{ }\text{\hspace{0.17em}}p>1$

$E\left({Y}^{2}\right)=\frac{1}{B\left(n,p\right)}{\int }_{0}^{+\infty }\frac{{y}^{n+1}}{{\left(1+y\right)}^{n+p}}\text{d}y=\frac{B\left(n+2,p-2\right)}{B\left(n,p\right)}=\frac{n\left(n+1\right)}{\left(p-1\right)\left(p-2\right)},\text{ }\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}p>2$ .

4.4. The Normal Distribution

One of the most important continuous probability distributions is the normal distribution (normal curve, or Gaussian distribution), defined by the equation:

$f\left(x\right)=\frac{1}{\sigma \sqrt{2\pi }}{\text{e}}^{-\frac{1}{2}{\left(\frac{x-\mu }{\sigma }\right)}^{2}},x\in ℝ.$ (13)

where $\mu$ is the mean, $\sigma$ the standard deviation, $\pi =3.14159\cdots$ and $\text{e}=2.71828\cdots$ . We have $f\left(x\right)>0,\forall x\in ℝ$ and ${\int }_{-\infty }^{+\infty }f\left(x\right)\text{d}x=1$ .

Indeed

$\begin{array}{c}{\int }_{-\infty }^{+\infty }f\left(x\right)\text{d}x={\int }_{-\infty }^{+\infty }\frac{1}{\sigma \sqrt{2\pi }}{\text{e}}^{-\frac{1}{2}{\left(\frac{x-\mu }{\sigma }\right)}^{2}}\text{d}x\\ ={\int }_{-\infty }^{+\infty }\frac{1}{\sqrt{2\pi }}{\text{e}}^{-\frac{1}{2}{z}^{2}}\text{d}z\end{array}$ .

With the change of variable:

$Z=\frac{X-\mu }{\sigma }$ .

We have:

${\int }_{-\infty }^{+\infty }\frac{1}{\sqrt{2\pi }}{\text{e}}^{-\frac{1}{2}{z}^{2}}\text{d}z=1$ .

THEOREM 1. $X\sim N\left(\mu ,\sigma \right)$ if and only if $Z=\frac{X-\mu }{\sigma }\sim N\left(0,1\right)$ .

$f\left(z\right)=\frac{1}{\sqrt{2\pi }}{\text{e}}^{-\frac{1}{2}{z}^{2}}$ is the probability density of the variable Z. We now show that the variance of Z equals 1:

$Var\left(Z\right)={\int }_{-\infty }^{+\infty }{z}^{2}\frac{1}{\sqrt{2\pi }}{\text{e}}^{-\frac{1}{2}{z}^{2}}\text{d}z=\frac{2}{\sqrt{2\pi }}{\int }_{0}^{+\infty }{z}^{2}{\text{e}}^{-\frac{1}{2}{z}^{2}}\text{d}z$

Setting $t={z}^{2}/2$ , so that $z\text{d}z=\text{d}t$ and $z=\sqrt{2t}$ :

$Var\left(Z\right)=\frac{2}{\sqrt{2\pi }}{\int }_{0}^{+\infty }\sqrt{2t}\,{\text{e}}^{-t}\text{d}t=\frac{2}{\sqrt{\pi }}{\int }_{0}^{+\infty }{t}^{1/2}{\text{e}}^{-t}\text{d}t=\frac{2}{\sqrt{\pi }}\Gamma \left(\frac{3}{2}\right)=\frac{2}{\sqrt{\pi }}\cdot \frac{1}{2}\Gamma \left(\frac{1}{2}\right)$

$\Gamma \left(1/2\right)=\sqrt{\pi }$ (see Equation (6)) then $V\left(Z\right)=1$ .

4.5. Chi-Square Distribution

Taking $\theta =1/2$ and $\alpha =\nu /2$ for a positive integer $\nu$ in the gamma distribution, we obtain the chi-square distribution, used in statistical tests and confidence interval estimates. We can also define it as follows: the chi-square distribution is the distribution of a sum of squared standard normal deviates, and the degrees of freedom of the distribution equal the number of standard normal deviates being summed.

We say that X follows a chi-square distribution with $\nu$ degrees of freedom, denoted ${\chi }_{\nu }^{2}$ , if the probability density function of X is:

$f\left(x\right)=\frac{1}{{2}^{\nu /2}\Gamma \left(\nu /2\right)}{\text{e}}^{-x/2}{x}^{\nu /2-1}\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}\text{ }x>0,\nu \in ℕ$ (14)

The chi-square distribution is thus a gamma distribution with $\theta =1/2$ and $\alpha =\nu /2$ (in Equation (8)). Therefore: $E\left(X\right)={\int }_{0}^{\infty }xf\left(x\right)\text{d}x=\alpha /\theta =\nu$ and $V\left(X\right)={\int }_{0}^{\infty }{x}^{2}f\left(x\right)\text{d}x-{E}^{2}\left(X\right)=\alpha /{\theta }^{2}=2\nu$ .
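The identification with the gamma density can be verified pointwise (a sketch comparing Equations (8) and (14) at a few arbitrarily chosen points):

```python
import math

def gamma_pdf(x, alpha, theta):
    # Equation (8), rate parameterization
    return theta**alpha / math.gamma(alpha) * math.exp(-theta * x) * x ** (alpha - 1)

def chi2_pdf(x, nu):
    # Equation (14)
    return math.exp(-x / 2) * x ** (nu / 2 - 1) / (2 ** (nu / 2) * math.gamma(nu / 2))

nu = 5
max_diff = max(abs(chi2_pdf(x, nu) - gamma_pdf(x, nu / 2, 0.5))
               for x in (0.1, 0.5, 1.0, 2.0, 5.0, 10.0))
```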

4.6. The Fisher F Distribution

This law is related to the ratio of two independent quadratic forms. Suppose that ${\chi }_{1}^{2}$ and ${\chi }_{2}^{2}$ are independently distributed as chi-square with ${\nu }_{1}$ and ${\nu }_{2}$ degrees of freedom, respectively.

We define

$F=\frac{{\chi }_{1}^{2}/{\nu }_{1}}{{\chi }_{2}^{2}/{\nu }_{2}}$ (15)

its density function is:

$g\left(f\right)=\frac{1}{B\left(\frac{{\nu }_{1}}{2},\frac{{\nu }_{2}}{2}\right)}{\left(\frac{{\nu }_{1}}{{\nu }_{2}}\right)}^{\frac{{\nu }_{1}}{2}}\frac{{f}^{\frac{{\nu }_{1}-2}{2}}}{{\left(1+\frac{{\nu }_{1}}{{\nu }_{2}}f\right)}^{\frac{{\nu }_{1}+{\nu }_{2}}{2}}}\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{ }\text{\hspace{0.17em}}f\ge 0$ (16)

$E\left(F\right)=\frac{{\nu }_{2}}{{\nu }_{2}-2}\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}\text{ }{\nu }_{2}>2$ (17)

and

$Var\left(F\right)=\frac{2{\nu }_{2}^{2}\left({\nu }_{1}+{\nu }_{2}-2\right)}{{\nu }_{1}{\left({\nu }_{2}-2\right)}^{2}\left({\nu }_{2}-4\right)}\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}\text{ }{\nu }_{2}>4$ (18)
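Definition (15) and the mean (17) can be checked by simulation, building each chi-square variate as a gamma draw with shape $\nu /2$ and scale 2 (degrees of freedom and seed below are arbitrary):

```python
import random
import statistics

random.seed(1)
nu1, nu2 = 5, 10

def chi2(nu):
    # chi-square(nu) is Gamma with shape nu/2 and scale 2
    return random.gammavariate(nu / 2, 2.0)

# Equation (15): ratio of two independent scaled chi-squares
f_sample = [(chi2(nu1) / nu1) / (chi2(nu2) / nu2) for _ in range(200_000)]
emp_mean = statistics.fmean(f_sample)   # theory: nu2/(nu2 - 2) = 1.25
```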

4.7. Student's t Distribution

Another distribution of considerable practical importance is that of the ratio of a normally distributed variate to the square root of a variate independently distributed as chi-square.

More precisely, if X is normally distributed with mean $\mu$ and variance ${\sigma }^{2}$ , if U has the chi-square distribution with $\nu$ degrees of freedom, and if X and U are independent, we seek the distribution of the quantity t given in Equation (22) below. Student's t-distribution has the probability density function given by

$f\left(t\right)=\frac{1}{\sqrt{\nu \pi }}\frac{\Gamma \left(\frac{\nu +1}{2}\right)}{\Gamma \left(\frac{\nu }{2}\right)}\frac{1}{{\left(1+\frac{{t}^{2}}{\nu }\right)}^{\frac{\nu +1}{2}}},\text{ }\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{ }\text{\hspace{0.17em}}\text{ }-\infty <t<+\infty$ (19)

$E\left(t\right)=0,\text{ }\text{\hspace{0.17em}}\forall \text{ }\nu >1$ (20)

$Var\left(t\right)=\frac{\nu }{\nu -2}\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}\text{ }\nu >2$ (21)

$t=\frac{\left(x-\mu \right)/\sigma }{\sqrt{u/\nu }}$ (22)

5. The Sampling Distribution

5.1. Distribution of Sample Variance

The variance ${s}^{2}$ of a sample is also a random variable; see [5] and [6] . The variance of a sample is given by the formula:

${s}^{2}=\underset{i=1}{\overset{n}{\sum }}\frac{{\left({x}_{i}-\stackrel{¯}{x}\right)}^{2}}{n-1}$ (23)

We also know that for a sample of size n derived from a normal population of mean $\mu$ and variance ${\sigma }^{2}$ , the quantity $\underset{i=1}{\overset{n}{\sum }}\text{ }\text{ }{z}_{i}^{2}=\underset{i=1}{\overset{n}{\sum }}{\left(\frac{{x}_{i}-\mu }{\sigma }\right)}^{2}$

follows the chi-square law with n degrees of freedom. Any sum of squares of random variables is associated with a number of degrees of freedom. Thus, the sum $\underset{i=1}{\overset{n}{\sum }}{\left({x}_{i}-\mu \right)}^{2}$ has n degrees of freedom, but $\underset{i=1}{\overset{n}{\sum }}{\left({x}_{i}-\stackrel{¯}{x}\right)}^{2}$ has only $\left(n-1\right)$ degrees of freedom. The ratio $\frac{\left(n-1\right){s}^{2}}{{\sigma }^{2}}=\underset{i=1}{\overset{n}{\sum }}\frac{{\left({x}_{i}-\stackrel{¯}{x}\right)}^{2}}{{\sigma }^{2}}$ therefore follows a ${\chi }^{2}$ law with $\left(n-1\right)$ degrees of freedom: ${\chi }^{2}=\frac{\left(n-1\right){s}^{2}}{{\sigma }^{2}}$ .
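This fact can be illustrated by simulation: repeatedly drawing normal samples and forming $\left(n-1\right){s}^{2}/{\sigma }^{2}$ should reproduce the mean $n-1$ and variance $2\left(n-1\right)$ of a ${\chi }_{n-1}^{2}$ variable (the parameters below are arbitrary illustrative choices):

```python
import random
import statistics

random.seed(7)
mu, sigma, n = 10.0, 2.0, 8

def scaled_var():
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    s2 = statistics.variance(xs)        # sample variance, divisor n - 1
    return (n - 1) * s2 / sigma**2

draws = [scaled_var() for _ in range(100_000)]
emp_mean = statistics.fmean(draws)      # chi-square(n-1) mean: n - 1 = 7
emp_var = statistics.pvariance(draws)   # chi-square(n-1) variance: 2(n-1) = 14
```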

5.2. The Distribution of the Quotient of Two Variances

Consider two normal populations with variances ${\sigma }_{1}^{2}$ and ${\sigma }_{2}^{2}$ , respectively, from which we take two independent samples of sizes ${n}_{1}$ and ${n}_{2}$ . We know that $\left({n}_{i}-1\right){s}_{i}^{2}/{\sigma }_{i}^{2}$ follows the chi-square law with $\left({n}_{i}-1\right)$ degrees of freedom. Therefore $F=\frac{\frac{\left({n}_{1}-1\right){s}_{1}^{2}/\left({n}_{1}-1\right)}{{\sigma }_{1}^{2}}}{\frac{\left({n}_{2}-1\right){s}_{2}^{2}/\left({n}_{2}-1\right)}{{\sigma }_{2}^{2}}}=\frac{{s}_{1}^{2}/{\sigma }_{1}^{2}}{{s}_{2}^{2}/{\sigma }_{2}^{2}}$ is distributed as F with $\left({n}_{1}-1\right)$ and $\left({n}_{2}-1\right)$ degrees of freedom.

5.3. Distribution of the Quantity $t=\frac{\stackrel{¯}{X}-\mu }{s/\sqrt{n}}$

Let Z be a normal variate with mean 0 and variance 1.

Let $U=\frac{\left(n-1\right){s}^{2}}{{\sigma }^{2}}$ be a chi-square variable with $\left(n-1\right)$ degrees of freedom and let U and Z be independent. Then the random variable $t=\frac{z\sqrt{n-1}}{\sqrt{u}}$ is distributed as Student's t with $\left(n-1\right)$ degrees of freedom. With $Z=\frac{\stackrel{¯}{X}-\mu }{\sigma /\sqrt{n}}$ , this is precisely the quantity $t=\frac{\stackrel{¯}{X}-\mu }{s/\sqrt{n}}$ .

6. Case Studies

As we said in the introduction, distributions derived from the gamma function are very important tools in the theory of statistical tests.

Our first example relates to the life expectancy of men and women in Guinea; through a Student's t test, we find that women generally live longer than men in Guinea.

The second concerns the improvement of safety conditions in a company before and after certain measures; the analysis shows that safety was improved.

The third concerns the health conditions of children under 5 years old after a change in political regime.

The last example concerns the distribution of single-member deputies by gender and administrative region in 2013 in Guinea. We want to know if there is a dependency relationship between regions and election by gender (sex).

Example 1

The following example concerns the life expectancy of 25 people including 11 males and 14 females (2016 Guinea Statistical Yearbook).

male: 60.70 59.50 62.15 63.14 60.29 60.48 60.64 61.77 60.93 61.05 59.40.

female: 59.78 64.55 63.45 62.78 59.64 61.59 61.89 62.51 62.67 62.82 63.88 63.07 63.17 62.40.

We want to test the hypothesis that men and women have the same life expectancy. To do this, we compare the mean age (life expectancy) of men (n = 11) with that of women (n = 14). As shown in Figure 1, the distribution of age (life expectancy) follows a normal distribution in both samples, and the normality test in the table confirms this. Once the conditions of Student's t test have been met, we can use the SPSS means-comparison procedure to compare the means with the t-test for independent samples.

The software SPSS gives us the following results. For each sample, the procedure reports the main descriptive parameters: the number of subjects, the mean, the variance and the standard deviation (Figure 2).

Figure 1. Test of normality.

Figure 2. Table Test of independent.

The difference between the mean life expectancies of the two groups equals (−1.52922), which shows that female subjects have a higher life expectancy than male subjects.

Then, for a significance level $\alpha =0.05$ , the test of equality of variances gives a P-value of 0.608, above the threshold, so we do not reject the assumption of equal variances. At the same threshold, the test of equality of means gives a P-value of 0.006, below the significance level; from this we conclude that the relationship between life expectancy and gender is significant.

We say that in Guinea women live longer than men.

Example 2

The following are the average weekly losses of worker-hours due to accidents in 10 industrial plants, before and after a certain safety program was put into operation:

Before: 45 73 46 124 33 57 83 34 26 17.

After: 36 60 44 119 35 51 77 29 24 11.

Use the 0.05 level of significance to test whether the safety program is effective.

We cannot apply the independent-samples test because the before and after weekly losses of worker-hours in the same industrial plant are correlated.

Here there is the obvious pairing of these two observations.

 Null hypothesis: ${\mu }_{D}=0$ ,

Alternative hypothesis ${\mu }_{D}>0$ .

 Level of significance $\alpha =0.05$ .

 Criterion: Reject the null hypothesis if ${t}_{ob}>1.833$ , the value of ${t}_{0.05}$ for 10 – 1 = 9 degrees of freedom, where ${t}_{ob}=\frac{\stackrel{¯}{D}-0}{{S}_{D}/\sqrt{n}}$ and $\stackrel{¯}{D}$ and ${S}_{D}$ are the mean and the standard deviation of the differences.

 Calculations of the differences are:

9 13 2 5 −2 6 6 5 2 6

their mean is $\stackrel{¯}{d}=5.2$ , their standard deviation is ${s}_{D}=4.08$ , so that ${t}_{ob}=\frac{5.2-0}{4.08/\sqrt{10}}=4.03$ .

 Decision: Since ${t}_{ob}=4.03$ exceeds 1.833, the null hypothesis must be rejected at level $\alpha =0.05$ . We conclude that the industrial safety program is effective.
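The paired-test calculation above can be reproduced with a few lines of Python (an independent check of the hand computation):

```python
import math
import statistics

before = [45, 73, 46, 124, 33, 57, 83, 34, 26, 17]
after = [36, 60, 44, 119, 35, 51, 77, 29, 24, 11]

diffs = [b - a for b, a in zip(before, after)]
d_bar = statistics.fmean(diffs)                      # 5.2
s_d = statistics.stdev(diffs)                        # about 4.08
t_obs = (d_bar - 0) / (s_d / math.sqrt(len(diffs)))  # about 4.03
reject = t_obs > 1.833                               # one-sided critical value, 9 df
```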

Example 3

The following data represent the number (in hundreds) of children under five considered chronically malnourished, according to the natural areas of Guinea, before and after the change of political regime.

We want to know if the sanitary conditions have changed after the change of political regime.

Before: 26.7 21.0 31.1 43.1 34.5 34.6 31.7 40.0

After: 28.1 14.6 30.7 31.9 30.5 36.9 40.8 37.9

(2016 Guinea Statistical Yearbook)

 Null hypothesis: ${\mu }_{D}=0$ ,

Alternative hypothesis ${\mu }_{D}>0$ .

 Level of significance $\alpha =0.05$ .

 Criterion: Reject the null hypothesis if ${t}_{ob}>1.86$ , the value of ${t}_{0.05}$ for 8 – 1 = 7 degrees of freedom, where ${t}_{ob}=\frac{\stackrel{¯}{D}-0}{{S}_{D}/\sqrt{n}}$ and $\stackrel{¯}{D}$ and ${S}_{D}$ are the mean and the standard deviation of the differences.

 Calculations of the differences are:

−1.4 6.4 0.4 11.2 4 −2.3 −9.1 2.1

their mean is $\stackrel{¯}{d}=1.41$ , their standard deviation is ${s}_{D}=6.11$ , so that ${t}_{ob}=\frac{1.41-0}{6.11/\sqrt{8}}=0.654$ .

 Decision: Since ${t}_{ob}=0.654$ does not exceed 1.86, the null hypothesis cannot be rejected at level $\alpha =0.05$ . We conclude that health conditions have not improved despite the change of political regime.
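As in Example 2, this paired test can be reproduced in Python as a check:

```python
import math
import statistics

before = [26.7, 21.0, 31.1, 43.1, 34.5, 34.6, 31.7, 40.0]
after = [28.1, 14.6, 30.7, 31.9, 30.5, 36.9, 40.8, 37.9]

diffs = [b - a for b, a in zip(before, after)]
d_bar = statistics.fmean(diffs)                 # about 1.41
s_d = statistics.stdev(diffs)                   # about 6.11
t_obs = d_bar / (s_d / math.sqrt(len(diffs)))   # about 0.654
reject = t_obs > 1.86                           # one-sided critical value, 7 df
```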

Example 4

The following table shows the distribution of single-member deputies by sex and administrative region in 2013 in Guinea (2016 Guinea Statistical Yearbook).

In this example, the null hypothesis is the absence of association between sex and region.

 Null hypothesis: sex and region are independent (no association). Alternative hypothesis: sex and region are dependent.

 Level of significance $\alpha =0.05$ .

 Criterion: Reject the null hypothesis if ${\chi }^{2}>14.067$ , the value of ${\chi }_{0.05,7}^{2}$ for $\left(2-1\right)\left(8-1\right)=7$ degrees of freedom, where ${\chi }^{2}={\sum }_{i,j}{\left({o}_{ij}-{e}_{ij}\right)}^{2}/{e}_{ij}$ , with ${o}_{ij}$ the observed and ${e}_{ij}$ the expected cell frequencies.

 Calculations: computing the expected cell frequencies, we get: ${e}_{11}=\frac{23×86}{165}=11.9879$ , ${e}_{12}=\frac{19×86}{165}=9.9030$ ; by analogy the other frequencies are: ${e}_{13}=11.4667$ , ${e}_{14}=11.4667$ , ${e}_{15}=10.9455$ , ${e}_{16}=9.3818$ , ${e}_{17}=8.8606$ , ${e}_{18}=\frac{23×86}{165}=11.9879$ , ${e}_{21}=11.0121$ , ${e}_{22}=9.0970$ , ${e}_{23}=10.5333$ , ${e}_{24}=10.5333$ , ${e}_{25}=10.0545$ , ${e}_{26}=8.6182$ , ${e}_{27}=8.1394$ and ${e}_{28}=11.0121$ .
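The expected frequencies follow from the margins of the contingency table via ${e}_{ij}=\left(\text{row total } i\right)\left(\text{column total } j\right)/165$ . The marginal totals below are reconstructed from the expected counts quoted in the text and should be treated as illustrative, not as the official yearbook table:

```python
# Expected cell frequencies under independence: e_ij = r_i * c_j / grand_total.
# Margins reconstructed from the expected counts quoted in the text
# (row totals 86 and 79, eight regional column totals).
row_totals = [86, 79]
col_totals = [23, 19, 22, 22, 21, 18, 17, 23]
grand_total = sum(col_totals)   # 165

expected = [[r * c / grand_total for c in col_totals] for r in row_totals]
```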

After the calculations we find ${\chi }^{2}$ using the last equation.

 Decision: Since ${\chi }^{2}=12.959<14.067$ , the null hypothesis must not be rejected at level $\alpha =0.05$ .

We conclude that there is no association between sex and region.

The SPSS software gives the same result.

With SPSS, we obtain P = 0.073 for a two-sided test, or 0.146 for a one-sided test. When $\alpha \le$ P-value we accept the null hypothesis, and we reject it otherwise. As P-value $>\alpha$ ( $P=0.146>0.05$ ), we accept the null hypothesis.

Conflicts of Interest

The authors declare no conflicts of interest.

[1] Piskonov, N. (1980) Differential and Integral Calculus. Mir, Moscow.
[2] Graybill, F.A. and Mood, A.M. (1963) An Introduction to the Theory of Statistics. 2nd Edition, McGraw-Hill Book Company, New York.
[3] Freud, M. (2005) Probability and Statistics for Engineers. 7th Edition, House of Electronics Industry, Beijing.
[4] Saporta, G. (2006) Probabilités, analyse des données et statistique. 2nd Edition, Technip, Paris.
[5] Bertrand, F. and Bertrand, M.M. (2011) Statistique en 80 fiches pour les scientifiques. DUNOD, Paris.
[6] Avenel, M. and Riffault, F.J. (2005) Mathématiques appliquées à la gestion. Sup'FOUCHER, Vanves.