The Three-Dice Rolling Problem
Terminology
The term 'probability' literally means chance, odds, expectation, or likelihood. It originates from the medieval Latin word 'probabilis', meaning plausible. Probability is used to study the behavior of stochastic (random) processes such as tossing a coin or rolling a die. Historically, probability was closely associated with the word 'chance' and used synonymously with it until a proper mathematical perspective emerged in the 18th century. The early form of the theory was called the 'Doctrine of Chances'.
Origin
The early form of the theory
The early form of the mathematical theory of probability can be attributed to four mathematicians of that era: the Italian polymath Gerolamo Cardano, the French mathematicians Pierre de Fermat and Blaise Pascal, and the Dutch mathematician Christiaan Huygens. Cardano began his investigations as early as 1560, but his work remained unknown to society for about a century. He investigated the sum of the numbers obtained from throws of three dice; the randomness involved intrigued him, and he tried to find a pattern in it. The topic was so popular in those days that even Galileo could not stay away from it. In the early 17th century he considered the problem of throwing three dice and observed that some totals occur more often than others because there are more ways to form them. From the middle of the 17th century, a correspondence between Fermat and Pascal aimed to solve problems arising in games of chance, which triggered a serious attempt to develop a mathematical basis for probability. In 1657 Huygens gave the concept its first comprehensive treatment.
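Galileo's counting argument can be checked directly by brute force. The following Python sketch enumerates all 6^3 = 216 equally likely outcomes of three dice and counts how often each total appears:

```python
from itertools import product
from collections import Counter

# Enumerate all 6^3 = 216 equally likely outcomes of three fair dice
# and count how many ordered rolls produce each total.
counts = Counter(sum(roll) for roll in product(range(1, 7), repeat=3))

# Galileo's observation: a total of 10 can be formed in 27 ways,
# while a total of 9 can be formed in only 25 ways, so 10 is
# slightly more likely even though both have six unordered partitions.
print(counts[9], counts[10])  # 25 27
```

This is exactly the kind of exhaustive counting Galileo carried out by hand: what matters is the number of ordered rolls, not the number of unordered partitions of the total.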
Subsequent developments
In the 18th century, the subject was taken up by the Swiss mathematician Jacob Bernoulli and the French mathematician Abraham de Moivre. In his Ars Conjectandi (1713), Bernoulli derived the first version of the Law of Large Numbers (LLN), which states that the average of the results obtained from a large number of trials of a random experiment should be close to the expected value, and that the gap tends to decrease as the number of trials increases. De Moivre, in his Doctrine of Chances (1718), showed how to calculate a wide range of complex probabilities.
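The Law of Large Numbers is easy to see in a simulation. The sketch below (a minimal illustration, with a fixed random seed purely for reproducibility) tosses a fair coin many times and checks that the running average settles near the expected value of 0.5:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate n tosses of a fair coin (1 = heads, 0 = tails).
n = 100_000
tosses = [random.randint(0, 1) for _ in range(n)]
average = sum(tosses) / n

# By the Law of Large Numbers, the sample average should lie close to
# the expected value 0.5, and the gap shrinks as n grows.
print(average)
```

The typical deviation of the average from 0.5 shrinks like 1/sqrt(n), which is Bernoulli's point: more trials give a more reliable average, but the improvement comes slowly.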
By the 19th century, it was evident that the mathematical theory of probability is a powerful tool with a wide range of real-life applications: the randomness and uncertainty in human activities and natural phenomena can be addressed by a well-formulated theory of probability. The German mathematician and physicist Gauss reaffirmed this idea by applying the theory to astronomy with remarkable success. From a few observational data points, he determined the orbit of Ceres (a dwarf planet in the asteroid belt between Mars and Jupiter). He used the method of least squares to correct the errors in the observations, an analysis that became routine in astronomy thereafter, and in these calculations he assumed a normal distribution of errors, a probabilistic tool. In 1812 the French scholar and polymath Laplace further developed the theory by introducing fundamental concepts of mathematical expectation, including the moment generating function, the method of least squares, and the testing of hypotheses. From here the mathematical theory of probability gradually developed a bond with the mathematical theory of statistics: it was understood that the two are related and that one cannot do without the other.
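The method of least squares mentioned above can be sketched in its simplest form: fitting a straight line y = a + b·x to noisy observations using the closed-form normal equations. The data points here are invented purely for illustration, not drawn from any astronomical record:

```python
# Least-squares fit of a line y = a + b*x via the closed-form
# normal equations. The data points are purely illustrative.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# The slope b minimizes the sum of squared residuals;
# the intercept a then follows from the two sample means.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

print(round(a, 3), round(b, 3))  # 1.04 1.99
```

Gauss's insight was that, under a normal distribution of observational errors, this squared-error criterion yields the most probable estimates, which is why least squares became the routine error analysis in astronomy.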
Probability in Physics
By the end of the 19th century, physics was gaining ground and was the leading science of the era, but the classical mechanics developed by Newton, Lagrange, Hamilton, and others was no longer valid for the sub-atomic world, and science needed new physical theories to describe the observations. Who knew that the mathematical theory of probability would form the cornerstone of the new theories to come? It was found that properties of gases, such as temperature, could only be expressed in terms of the motions of a large number of particles, and this could not be done without the help of statistics, since the number of particles involved is enormous. Ludwig Boltzmann and J. Willard Gibbs developed the field of Statistical Mechanics to address the problem, building on the concepts of probability and statistics.
The laws followed by sub-atomic (micro) particles were strange and totally different from those of the classical world. To address this, Quantum Mechanics was developed in the 20th century by physicists such as Max Planck, Albert Einstein, Niels Bohr, Werner Heisenberg, Erwin Schrödinger, Paul Dirac, Wolfgang Pauli, and Richard Feynman. A cornerstone of the modern quantum theory is the Uncertainty Principle proposed by Heisenberg, and the theory is built on the concepts of probability.
Probability in today's World
The twentieth century saw the mathematical theory of probability develop by leaps and bounds. One of the basic problems of probability was finding a formal, unambiguous definition. The theory is so intuitive and application-driven that it was genuinely difficult to pin down a theoretical definition of probability. The classical definition came first but proved insufficient and gave way to the frequency definition. Finally, the frequency definition was replaced by the axiomatic definition given by the Russian mathematician Andrey Kolmogorov in 1933. The axiomatic definition is based on three axioms that are logically sound and accepted worldwide, and it settled the long-standing disputes among mathematicians over the definition of probability.
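Kolmogorov's three axioms can be stated briefly. For a sample space $\Omega$, a collection of events $\mathcal{F}$, and a probability measure $P$, they are:

```latex
\begin{enumerate}
  \item Non-negativity: $P(A) \ge 0$ for every event $A \in \mathcal{F}$.
  \item Normalization: $P(\Omega) = 1$.
  \item Countable additivity: for pairwise disjoint events $A_1, A_2, \dots$,
        \[
          P\!\left(\bigcup_{i=1}^{\infty} A_i\right)
            = \sum_{i=1}^{\infty} P(A_i).
        \]
\end{enumerate}
```

Everything else in the modern theory, from conditional probability to the limit theorems, is derived from these three statements.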
Probability and statistics were linked through the concept of hypothesis testing, developed by the Polish mathematician Jerzy Neyman and the British polymath R.A. Fisher. In modern times hypothesis testing is applied in fields such as biological and psychological experiments, clinical trials of drugs, and economics. Today the idea of probability underlies concepts such as Markov processes, Brownian motion, and other settings where we deal with an aggregate of entities. Random fluctuations of stock markets are studied using probabilistic mathematical models to provide predictions for investors, and mathematical finance has emerged as a new area of mathematics.
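As a small illustration of the Markov processes mentioned above, the sketch below iterates a two-state transition matrix (with entries invented purely for the example) until the distribution over states stabilizes at its long-run, or stationary, distribution:

```python
# Long-run behaviour of a two-state Markov chain. The transition
# matrix is invented for illustration: P[i][j] is the probability
# of moving from state i to state j in one step.
P = [[0.9, 0.1],
     [0.5, 0.5]]

pi = [1.0, 0.0]  # start entirely in state 0
for _ in range(1000):
    # One step of the chain: pi <- pi * P
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

# The chain forgets its starting state: pi converges to (5/6, 1/6).
print(round(pi[0], 6), round(pi[1], 6))
```

The defining Markov property is visible in the update loop: the next distribution depends only on the current one, not on the full history, which is what makes such models tractable for phenomena like stock-price fluctuations.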
The modern era is one of computer simulation, artificial intelligence, quantum computing, and data science, and in almost all of these areas the mathematical theory of probability plays a significant role. There is still plenty of room for development: mathematicians all over the world work on stochastic models with the aim of improving the theory and widening its applicability. We can hope that a theory that grew out of everyday human necessity will continue to develop and help humanity reach new heights in science and technology.
By
Prabir Rudra