[Python] Law of Large Numbers: Dice Roll
So Einstein was wrong when he said, “God does not play dice.” Consideration of black holes suggests, not only that God does play dice, but that he sometimes confuses us by throwing them where they can’t be seen.
I recommend reading this post about the Law of Large Numbers before continuing with this one:
I want to run a quick and easy experiment in line with the Law of Large Numbers: roll a die many times and see whether the running average converges to the expected value. With more trials, the average of all dice rolls should converge to the expected value of a fair six-sided die, which is 3.5. Let's see if that is actually the case.
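As a quick sanity check, the expected value of 3.5 follows directly from the definition: each of the six faces is equally likely, so the expectation is just the average of the faces.

```python
# Expected value of a fair six-sided die:
# E[X] = (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
expected_value = sum(range(1, 7)) / 6
print(expected_value)  # 3.5
```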
Here is how it goes: a die is rolled several times, and the cumulative average is calculated after every toss.
The code is very simple. First, I created a function that simulates rolling a die: it randomly chooses a number from the possible outcomes of a die roll, [1, 2, 3, 4, 5, 6]. The function appends each result to a pre-defined list, which I later use to calculate the averages. The cumulative sum of the results is then calculated with NumPy's cumsum function so that the running averages can be computed, and a graph shows the final results.
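The steps described above can be sketched as follows; the original code is not shown, so the function and variable names here are my own, but the structure (a roll function using random choice, a list of results, NumPy's cumsum for the running averages, and a plot) follows the description.

```python
import random

import matplotlib.pyplot as plt
import numpy as np

def roll_dice(n_rolls, results):
    """Simulate n_rolls rolls of a fair die, appending each outcome to results."""
    for _ in range(n_rolls):
        results.append(random.choice([1, 2, 3, 4, 5, 6]))

results = []  # pre-defined list that collects every outcome
roll_dice(1_000_000, results)

# Cumulative average after each toss: cumulative sum divided by the
# number of rolls made so far.
cumulative_avg = np.cumsum(results) / np.arange(1, len(results) + 1)

# Plot the running average against the expected value of 3.5.
plt.plot(cumulative_avg)
plt.axhline(3.5, color="red", linestyle="--", label="expected value 3.5")
plt.xlabel("Number of rolls")
plt.ylabel("Cumulative average")
plt.legend()
plt.show()
```

Re-running the script with different numbers of rolls (10, 100, one million) reproduces the snapshots discussed below; the exact values will differ from run to run since the rolls are random.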
After just 10 trials the average already stands at 3.8; the first roll was a 5, which pulled the early average up.
With 100 rolls, the average is already close to the expected value: it dips to about 3.25 and then converges back toward 3.5.
With one million dice rolls, the average is extremely close to the expected value: 3.4998015132783884.