# Monte-Carlo Integration (probabilistic numeric algorithms)

In this section we present some basic information about Monte Carlo integration. This method should not be confused with the Monte Carlo algorithms that you will study in the remaining part of this course.

1. Monte Carlo Methods (probabilistic numeric algorithms): methods whose idea is to use random numbers and distributions to solve deterministic problems, such as the integration of an analytical function;
2. Monte Carlo Algorithms: randomized algorithms whose output may be incorrect with a certain probability.

Regarding Monte Carlo Methods (numeric algorithms), the interested reader can find a nice introduction on the Wikipedia page. For a more complete reference see for example this ArXiv document.

## Direct Sampling Monte Carlo

Direct sampling Monte Carlo is a method that relies on repeated random sampling to (for example) compute a numerical approximation of an integral. It is often used when neither an analytical solution nor a numerical approximation (obtained with non-stochastic methods) is available.

Let's see how to use direct-sampling Monte Carlo integration to estimate the value of $\pi$. We can follow these simple steps:

1. Draw a square, then inscribe a circle within it
2. Uniformly scatter a given number of random points over the square
3. Count the number of points inside the circle
4. The ratio of the inside-count and the total-sample-count is an estimate of the ratio of the two areas, $\pi/4$. Multiply the result by $4$ to estimate $\pi$.

Given that this problem has two axes of symmetry, we can equivalently study just the upper-right quadrant, where the circle's boundary can be represented as a function. In general, this procedure allows us to estimate a function integral of the type $$I = \int f(\mathbf{x}) \, \mathrm{d}\mathbf{x}$$

Monte Carlo can also give a probabilistic estimate of the error, so that in the end it gives an answer of the type

$$I = \int f(\mathbf{x}) \, \mathrm{d}\mathbf{x} = 0.789 \pm 0.001 \quad \text{with probability } 0.95$$

### Exercise: Pi Estimation with Monte-Carlo

By using the pseudo-random number generators of TP 1 (C++ version), calculate $\pi$ with the Monte Carlo approach. Report the precision of the approximation for each generator (up to the 6th decimal) for a fixed number of generated points $N$, with $N = 5000, 10000, 30000$. Do you observe substantial differences between the different engines?

## Buffon’s needle experiment

Georges Louis Leclerc, Count of Buffon (1707–1788) is considered to be the first to use a Monte Carlo Method to compute the value of $\pi$. His method was based on the random throw of needles of length $\lambda$ onto a wooden floor with cracks at distance $\omega$ apart.

## Markov Chain Monte Carlo

A Markov process is a process that satisfies the Markov property:
$$P(x_{n+1}|x_1,\ldots,x_n) = P(x_{n+1}|x_n)$$

which means that the next state depends only on the current state: the process has no memory.

A Markov Chain is nothing but a Markov process with a discrete state space, where the next element of the sequence is drawn stochastically from a distribution that depends only on the current state.

Question 1: Is the Logistic Map a Markov process or a Markov Chain?

Question 2: Can you spot the possible differences between the random walk and the Markov Chain?

Let's define the Markov Chain in a more formal way. Given the stochastic function $Y:\ S\to \{y_1,\ldots,y_n\}$ that associates a realization to each sample point (where $S$ is the sample space and $\{y_1,\ldots,y_n\}$ the set of realizations of $Y$), we denote by $P(y_n(s))$ the probability that at time $s\in\mathbb{N}$, $Y$ has the realization $y_n$. We also write $P(y_m(s+1)\,|\,y_n(s))$ for the conditional probability that $Y$ has realization $y_m$ at time $s+1$, given that it had realization $y_n$ at time $s$.