3 dice rolling problem
Terminology
The term 'probability' means chance, odds, expectation, or likelihood. It originates from the medieval Latin word 'probabilis', meaning plausible. Probability is used to study the behavior of stochastic (random) processes such as tossing a coin or rolling a die. Historically, probability was closely associated with the word 'chance' and used synonymously with it until a proper mathematical perspective emerged in the 18th century. The early form of the theory was called the 'Doctrine of Chances'.
Origin
Probability finds its place in the ancient and medieval laws of evidence, which had to deal with proofs, credibility, and the uncertainties of evidence in court. Games of chance are believed to have existed as early as the Egyptian civilization: excavations of the Pharaohs' tombs uncovered a game called 'Hounds and Jackals', which closely resembles the modern game of 'Snakes and Ladders'. This may have been an early stage in the creation of dice. Throwing a set of dice and betting on the outcome is an ancient human habit, passed on from one civilization to another. The first dice game found in the literature of the Christian era was 'Hazard', played with two or three dice; the game is thought to have been brought to Europe by knights returning from the Crusades. Greek pottery shows the existence of games involving various degrees of uncertainty. Present-day casinos all over the world have carried this legacy into modern times.
During the Renaissance, Europe was a center of gambling and other games of chance in which enormous amounts of wealth were put at stake. Maritime insurance premiums had to be estimated from the risks involved. But at that time (the 17th century) there was no mathematical framework that could provide a logical basis for calculating or predicting those risks. People wanted an estimate of the returns on the wealth they staked, and such estimates could only come from a sound mathematical theory that accounted for the randomness (unpredictability) of the systems and supported a thorough risk analysis. Under these circumstances, the stage was set for the mathematicians of the era to step forward and develop a framework that would meet the needs of the prevailing society.
The early form of the theory
The early form of the mathematical theory of probability can be attributed to four mathematicians of that era: the Italian polymath Gerolamo Cardano, the French mathematicians Pierre de Fermat and Blaise Pascal, and the Dutch mathematician Christiaan Huygens. Cardano began his investigations as early as 1560, but his work remained unknown for about a hundred years. He investigated the sum of the numbers obtained from throws of three dice; the randomness involved fascinated him, and he tried to find a pattern in it. The topic was so popular in those days that even Galileo could not stay away from it. In the early 17th century he considered the problem of throwing three dice and observed that some sums occur more often than others because there are more ways to compose them. From the middle of the 17th century, a correspondence between Fermat and Pascal aimed at solving problems in games of chance triggered the first serious attempt at a mathematical basis for probability. In 1657 Huygens gave the concept a comprehensive treatment.
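Galileo's observation about the three-dice sums is easy to verify by brute force. The short sketch below (illustrative code, not taken from any historical source) enumerates all 6³ = 216 equally likely outcomes and counts how many ways each sum can arise, showing why 10 and 11 turn up more often than 9 and 12 even though each can be written as six partitions.

```python
from itertools import product

# Enumerate all 6^3 = 216 equally likely outcomes of three fair dice
counts = {}
for roll in product(range(1, 7), repeat=3):
    s = sum(roll)
    counts[s] = counts.get(s, 0) + 1

# Galileo's result: 10 and 11 can each be formed in 27 ordered ways,
# while 9 and 12 can each be formed in only 25 ordered ways.
print(counts[9], counts[10], counts[11], counts[12])  # 25 27 27 25
```

The key point is that ordered outcomes, not unordered partitions, are equally likely: (3,3,3) counts once, while (1,3,5) counts six times.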
Subsequent developments
In the 18th century, the subject was taken up by the Swiss mathematician Jacob Bernoulli and the French mathematician Abraham de Moivre. In his Ars Conjectandi (1713), Bernoulli derived the first version of the Law of Large Numbers (LLN), which states that the average of the results obtained from a large number of trials of a random experiment should be close to the expected value, with the gap gradually decreasing as the number of trials increases. De Moivre, in his Doctrine of Chances (1718), showed how to calculate a wide range of complex probabilities.
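The Law of Large Numbers can be illustrated with a short simulation (a sketch for this article, not part of Bernoulli's treatment): the average of repeated die rolls drifts toward the expected value (1+2+3+4+5+6)/6 = 3.5 as the number of trials grows.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def average_roll(n):
    """Average of n rolls of a fair six-sided die."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

# The gap between the sample average and 3.5 shrinks as n increases
for n in (100, 10_000, 1_000_000):
    print(n, average_roll(n))
```

For 100 rolls the average can easily be off by a tenth or more; by a million rolls it typically agrees with 3.5 to two decimal places.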
By the 19th century it was evident that the mathematical theory of probability was a powerful tool with a wide range of real-life applications: the randomness and uncertainty in human activities and natural phenomena could be addressed by a well-formulated theory. The German mathematician and physicist Gauss confirmed this by applying the theory to astronomy with remarkable success. From a few observations he determined the orbit of Ceres (a dwarf planet in the asteroid belt between Mars and Jupiter). He used the method of least squares, together with a normal distribution of errors (a probabilistic tool), to correct the errors in the observations, and such error analysis became routine in astronomy thereafter. In 1812 the French scholar and polymath Laplace further developed the theory by introducing fundamental concepts of mathematical expectation, including the moment generating function, the method of least squares, and hypothesis testing. From this point the mathematical theory of probability slowly began to bond with the mathematical theory of statistics, and it was understood that the two are related and neither can do without the other.
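The method of least squares mentioned above can be shown in miniature. The sketch below fits a straight line to a handful of made-up data points (the numbers are purely illustrative) by choosing the slope and intercept that minimize the sum of squared errors, using the closed-form solution of the normal equations.

```python
# Illustrative data: noisy observations of a roughly linear trend
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares solution for a line y = a*x + b:
# a = cov(x, y) / var(x), b chosen so the line passes through the means
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x
print(a, b)  # roughly slope 2, intercept 1
```

Under the assumption of normally distributed errors, this least-squares fit is also the maximum-likelihood estimate, which is the probabilistic justification Gauss supplied.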

Dwarf planet Ceres
Probability in Physics
By the end of the 19th century, physics was gaining ground and was the leading science of the era. The classical mechanics developed by Newton, Lagrange, Hamilton, and others was no longer valid for the sub-atomic world, and science needed new physical theories to describe the observations. Who knew that the mathematical theory of probability would form the cornerstone of the theories to come? It was found that properties of gases such as temperature could only be expressed in terms of the motions of a large number of particles. This could not be done without statistics, as the number of particles involved is enormous. Ludwig Boltzmann and J. Willard Gibbs developed the field of Statistical Mechanics to address the problem, building on the concepts of probability and statistics.
Gas particles
The laws followed by sub-atomic (micro) particles were strange and entirely different from those of the classical world. To address this, Quantum Mechanics was developed in the 20th century by physicists such as Max Planck, Albert Einstein, Niels Bohr, Werner Heisenberg, Erwin Schrödinger, Paul Dirac, Wolfgang Pauli, and Richard Feynman. Heisenberg's Uncertainty Principle lies at the heart of the modern quantum theory, which is built on the concepts of probability.
Probability in today's World
The twentieth century saw the mathematical theory of probability develop by leaps and bounds. One of the basic problems was finding a formal, unambiguous mathematical definition of probability. The theory is so realistic, intuitive, and application-driven that a theoretical definition proved genuinely hard to pin down. The classical definition came first but was far from sufficient, and it gave way to the frequency definition. Finally, the frequency definition was replaced by the axiomatic definition given by the Russian mathematician Andrey Kolmogorov in 1933. The axiomatic definition rests on three axioms that are logical and accepted worldwide, and it settled the long-standing disputes among mathematicians over the definition of probability.
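In modern notation, Kolmogorov's three axioms for a probability measure P on a sample space Ω can be stated as follows (a standard textbook formulation, not a quotation from the 1933 original):

```latex
\textbf{1. Non-negativity:}\quad P(A) \ge 0 \ \text{for every event } A \subseteq \Omega
\textbf{2. Normalization:}\quad P(\Omega) = 1
\textbf{3. Countable additivity:}\quad
P\!\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)
\ \text{for pairwise disjoint events } A_1, A_2, \ldots
```

All the familiar rules, such as P(∅) = 0 and P(A) ≤ 1, follow as theorems from these three axioms.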
Probability and statistics were linked through the concept of hypothesis testing, developed by the Polish mathematician Jerzy Neyman and the British polymath R. A. Fisher. In modern times hypothesis testing is applied in fields such as biological and psychological experiments, clinical drug trials, and economics. The idea of probability now underlies concepts such as the Markov process, Brownian motion, and other settings where we deal with aggregates of entities. Random fluctuations of stock markets are studied using probabilistic mathematical models to provide predictions for investors, and mathematical finance has thus emerged as a new area of mathematics.



The modern era is one of computer simulations, artificial intelligence, quantum computing, data science, and more, and in almost all these areas the mathematical theory of probability plays a significant role. There is clearly still room for development: mathematicians all over the world work on stochastic models with the aim of improving the theory and widening its applicability. We hope that a theory which grew out of human society's utmost necessity will continue to develop and help humanity reach new heights in science and technology.
By
Prabir Rudra