Introduction VII – Law of Large Numbers

Our intuition can be mathematical

Sometimes maths can be quite an intuitive thing. The Law of Large Numbers is such a case.

Law of Large Numbers (LLN)

The LLN tells us that as the number of trials increases, the sample mean approaches the population mean. Or in more mathematical terms: $\overline{X}_n \xrightarrow{p} \mu$ as $n \rightarrow \infty$, or equivalently, for every $a > 0$: $\lim_{n \rightarrow \infty} P(|\overline{X}_n - \mu| < a) = 1$

$\overline{X}_n$ is the sample mean and is calculated as follows: $\overline{X}_n = \frac{X_1 + X_2 + \dots + X_n}{n} = \frac{1}{n}\sum_{i=1}^{n} X_i$
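As a quick illustration, here is a minimal simulation sketch (assuming fair coin flips, i.e. Bernoulli trials with p = 0.5, which is not part of the plotting code further below): tracking the running sample mean shows it settling near the true mean of 0.5 as n grows.

import numpy as np

rng = np.random.default_rng(0)            # fixed seed so the sketch is reproducible
flips = rng.binomial(1, 0.5, 10_000)      # 10,000 fair coin flips, the X_i

# Running sample mean X_bar_n for n = 1, ..., 10,000
running_mean = np.cumsum(flips) / np.arange(1, len(flips) + 1)

# The running mean drifts towards the true mean mu = 0.5 as n grows.
print(running_mean[[9, 99, 999, 9999]])   # sample means after 10, 100, 1,000 and 10,000 flips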

How big does n have to be?

How big n needs to be depends on the tolerance a, on p and on the probability level you want to reach. Nonetheless, in a lot of cases n can be smaller than one might expect. We can demonstrate this with a few lines of Python code:

from scipy.stats import binom
import matplotlib.pyplot as plt
import numpy as np

# Show that, as n increases, the probability that abs(X_bar_n - mu) < a converges to 1.
# The X_i are Bernoulli(p), so their sum S_n is binomial(n, p) and
# P(|X_bar_n - p| < a) = P((p - a) * n < S_n < (p + a) * n).
p = 0.5
for a in np.arange(0.1, 0.01, -0.01):
    list_n = []
    list_p = []
    n = 10

    while (binom.cdf(int((p+a)*n), n, p) - binom.cdf(int((p-a)*n), n, p)) < 0.99999:
        list_p.append(binom.cdf(int((p+a)*n), n, p) - binom.cdf(int((p-a)*n), n, p))
        list_n.append(n)
        n += 10

    # Plot each curve once, after the while loop for this value of a has finished.
    plt.plot(list_n, list_p, label='a={}, p={}'.format(round(a, 2), p))

plt.xlabel('n')
plt.ylabel('P(|X_bar_n - p| < a)')
plt.legend()
plt.show()


The above code produces the following output:

[Figure: P(|X̄_n − p| < a) plotted against n for a = 0.1 down to 0.02, with p = 0.5]

We can see that for p = 0.5 and a = 0.1, n only needs to be around 500 before the probability passes the 0.99999 threshold used in the loop.
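As a quick sanity check of that value, the same binomial calculation as in the loop above can be evaluated with n fixed at 500:

from scipy.stats import binom

n, p, a = 500, 0.5, 0.1
# P(|X_bar_n - p| < a) for n = 500, computed exactly as in the loop above;
# the result is already above the 0.99999 threshold.
print(binom.cdf(int((p+a)*n), n, p) - binom.cdf(int((p-a)*n), n, p))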

The LLN can be applied in many areas, for example in economics, finance and insurance. We will shortly come to a closely related result, the Central Limit Theorem.
