Deciphering Uncertainty: Jacob Bernoulli’s Legacy in Probability Theory

Mario J. Pinheiro
Mar 7, 2024

In this series, our primary aim is to demystify and popularize intricate mathematical ideas, making them accessible to the everyday reader, from aspiring learners to practising engineers. Though mathematics is often seen as mysterious and abstract, it is full of deep symmetry, profound truths, and beauty. Mathematical literacy sharpens our ability to solve problems and opens opportunities across many industries. A mathematically literate society is better equipped to put equations and numbers to work, improving living standards and deepening our understanding of the universe.

The 17th-century Swiss mathematician Jacob Bernoulli is often called the “Father of Probability” for his foundational contributions to probability theory and to mathematics as a whole. The moniker “Father of Uncertainty Quantification” is rarely used, but it accurately captures Bernoulli’s role in creating mathematical tools for managing and understanding uncertainty.

Contributions to Uncertainty Quantification:

  1. Law of Large Numbers: In his book “Ars Conjectandi”, published posthumously in 1713, Jacob Bernoulli made his most important contribution to probability and uncertainty quantification: he formulated the Law of Large Numbers. This theorem guarantees that, as the number of trials grows, the average of the outcomes converges to the expected value, lowering the uncertainty in predictions. It is fundamental to understanding the behaviour of averages of random variables.
  2. Binary Systems and Bernoulli Trials: He investigated binary systems, now known as Bernoulli trials, in which there are only two possible outcomes (such as tossing a coin). This study established the foundation for understanding random processes and estimating the probabilities of different outcomes, which is essential for quantifying uncertainty in systems whose behaviour is probabilistic rather than deterministic.
  3. Combination Theory: Bernoulli made major advances in combinatorics, which is essential for computing probabilities and understanding the distribution of outcomes in complex systems. This work is crucial for industries such as banking, insurance, and engineering that deal with risk and uncertainty.
  4. Risk Management: Bernoulli’s contributions to probability theory gave rise to the tools that actuarial science and risk management use today. His research makes it possible to evaluate and quantify risks in uncertain situations, improving decision-making under uncertainty.
  5. Psychology of Decision Making: Bernoulli also touched on the psychological side of decision-making under uncertainty, paving the way for utility theory, later formalised by his nephew Daniel Bernoulli. Utility theory introduces individual risk tolerance into the measurement of uncertainty by recommending that decision-makers compare the expected utilities of risky and uncertain prospects when making their choice.

Impact on Uncertainty Quantification:

Numerous fields that deal with uncertainty have been profoundly and permanently impacted by the work of Jacob Bernoulli on probability theory. His mathematical formulas establish the foundation for contemporary statistics, risk assessment, and decision sciences by offering a means of modelling, analysing, and making predictions about systems with uncertain outcomes. A key figure in the history of uncertainty quantification, Bernoulli’s concepts have enabled scientists, engineers, economists, and decision-makers to use a methodical and scientific approach to managing the unknown.

The Law of Large Numbers is Jacob Bernoulli’s most famous contribution to the understanding and measurement of uncertainty, even though he did not formulate it as an equation as we would today. Rather, his work established the basic ideas that later mathematicians would formalise. The Law of Large Numbers deals with the outcome of repeating an experiment many times: it states that the average of the results tends to approach the expected value as more trials are conducted.

Law of Large Numbers (Conceptual Explanation)

While Bernoulli did not provide a modern equation for this law, the essence of his discovery can be expressed in simplified modern notation. For a random variable X with expected value $latex E(X)=\mu$, the average $latex \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ of a large number n of independent trials satisfies, by the Law of Large Numbers:

$latex \bar{X}_n \to \mu \quad \text{as} \quad n \to \infty.$

Examples Illustrating Bernoulli’s Concept of Uncertainty

Example 1: Coin Tossing

Consider the act of tossing a fair coin. Each toss has two possible outcomes: heads or tails. If we define success as landing on heads, the expected probability of success p in each trial is 0.5.

If you toss the coin a large number of times, say n=1000, the Law of Large Numbers suggests that the proportion of heads will be very close to 0.5. The more you toss the coin, the closer the proportion will get to the expected value of 0.5.
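As a quick illustration (our own sketch, not from the original article; the function name and the fixed seed are choices made here for reproducibility), a short simulation shows the proportion of heads settling toward 0.5 as n grows:

```python
import random

def proportion_of_heads(n, seed=0):
    """Toss a fair coin n times and return the fraction of heads."""
    rng = random.Random(seed)  # fixed seed so the illustration is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(n))
    return heads / n

# The proportion of heads drifts toward the expected value 0.5 as n grows.
for n in (10, 1_000, 100_000):
    print(n, proportion_of_heads(n))
```

Running this with increasing n shows the fluctuations around 0.5 shrinking, exactly the behaviour the Law of Large Numbers describes.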

Example 2: Rolling a Die

Imagine rolling a fair six-sided die. The expected value E(X) of the outcome is 3.5, since each side has an equal probability of landing face up, and the average (mean) of all possible outcomes (1 through 6) is 3.5.

If you roll the die a large number of times, the average of the results should approach 3.5, demonstrating the Law of Large Numbers. For instance, with n=10000 rolls, the average result should be very close to 3.5.
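The same experiment is easy to simulate (again an illustrative sketch; the helper name and seed are our own assumptions):

```python
import random

def average_roll(n, seed=42):
    """Roll a fair six-sided die n times and return the average outcome."""
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    return sum(rng.randint(1, 6) for _ in range(n)) / n

# The running average approaches the expected value E(X) = 3.5 as n grows.
for n in (10, 1_000, 10_000):
    print(n, average_roll(n))
```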

Bernoulli’s Trials

Another significant concept introduced by Bernoulli is the Bernoulli trial: a random experiment with only two possible outcomes (e.g., success or failure). The probability of success in each trial is p and the probability of failure is 1−p. Though Bernoulli trials themselves do not directly describe uncertainty, they form the basis of many statistical methods for dealing with probabilistic events.

Expected Value and Variance of a Single Bernoulli Trial

For a Bernoulli trial with success probability p, the expected value E(X) and variance Var(X) are given by:

  • E(X) = p
  • Var(X) = p(1−p)
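These two formulas are easy to check against a simulation (a sketch of our own; the function names and seed are assumptions, not part of the original article):

```python
import random

def bernoulli_stats(p):
    """Theoretical mean and variance of a single Bernoulli(p) trial."""
    return p, p * (1 - p)

def empirical_stats(p, n, seed=1):
    """Estimate the mean and variance of Bernoulli(p) from n simulated trials."""
    rng = random.Random(seed)
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return mean, var

print(bernoulli_stats(0.3))           # theoretical E(X) and Var(X)
print(empirical_stats(0.3, 100_000))  # simulated estimates, close to the theory
```

With p = 0.3, the theory gives E(X) = 0.3 and Var(X) = 0.21, and the simulated values land nearby, another instance of the Law of Large Numbers at work.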

These fundamental concepts and the mathematical framework Bernoulli developed form the basis of probability theory and the quantification of uncertainty, influencing countless applications in science, engineering, economics, and decision theory.

Application in the Stock Market:

Expected Return and Variance in Portfolio Theory:

Modern portfolio theory, which evaluates how investors may optimise their portfolios through diversification to maximise anticipated return for a given level of risk, is based on Bernoulli’s insights into probability and risk.

Expected Return of a Portfolio: A portfolio’s expected return is the weighted average of the expected returns of its individual assets, with the weights indicating the fraction of the portfolio invested in each asset:

$latex E(R_p) = \sum_{i=1}^{n} w_i \, E(R_i),$

Portfolio Variance: In addition to accounting for the variances of individual assets, the variance of the portfolio’s return, which serves as a gauge of risk, also considers the covariance between pairs of assets:

$latex \sigma_p^2 = \sum_{i=1}^{n} w_i^2 \sigma_i^2 + \sum_{i=1}^{n} \sum_{j \neq i} w_i w_j \, \mathrm{Cov}(R_i, R_j),$

where $latex \mathrm{Cov}(R_i, R_j)$ is the covariance between the returns of assets i and j; $latex E(R_p)$ is the expected return of the portfolio; $latex w_i$ is the weight of the i-th asset in the portfolio; $latex E(R_i)$ is the expected return of the i-th asset; $latex \sigma_i^2$ is the variance of asset i’s return; and n is the number of assets in the portfolio.

By utilising Bernoulli’s Law of Large Numbers and his seminal work on probability, investors may better comprehend and control the inherent unpredictability of financial markets.
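As a minimal sketch of these two formulas (the weights, returns, and covariances below are hypothetical toy numbers, not market data; note that the full double sum over i and j reproduces both the variance and covariance terms, since Cov(Rᵢ, Rᵢ) = σᵢ²):

```python
def portfolio_return(weights, expected_returns):
    """E(R_p): weighted average of the assets' expected returns."""
    return sum(w * r for w, r in zip(weights, expected_returns))

def portfolio_variance(weights, cov):
    """sigma_p^2: double sum of w_i * w_j * Cov(R_i, R_j) over all pairs."""
    n = len(weights)
    return sum(weights[i] * weights[j] * cov[i][j]
               for i in range(n) for j in range(n))

# Hypothetical two-asset portfolio: a 60/40 split with toy return
# expectations and a toy covariance matrix.
w = [0.6, 0.4]
mu = [0.08, 0.12]
cov = [[0.04, 0.01],
       [0.01, 0.09]]
print(portfolio_return(w, mu))    # expected portfolio return
print(portfolio_variance(w, cov)) # portfolio variance (risk)
```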

Application in Engineering:

Reliability Engineering and Risk Assessment:

In engineering, failure rates and component dependability are predicted using statistical techniques and the Law of Large Numbers.

  • Failure Rate (λ) Calculation: In reliability engineering, the failure rate λ of a component is often estimated from historical failure data as the number of observed failures divided by the total operating time: $latex \lambda = \frac{\text{number of failures}}{\text{total operating time}}.$
  • System Reliability: For a series system of n independent components, the system reliability $latex R_s$ can be calculated as the product of the reliabilities of the individual components: $latex R_s = \prod_{i=1}^{n} R_i,$ where $latex R_i$ is the reliability of component i.
  • Application in Quality Control (Six Sigma): The Six Sigma methodology in quality control uses statistical methods to reduce defects and variability in manufacturing processes. It aims to have processes operate within a certain range of standard deviations (σ) from the mean to minimize the defect rate. The process capability index $latex C_p$ quantifies how well a process fits within its specification limits: $latex C_p = \frac{USL - LSL}{6\sigma},$ where USL and LSL are the upper and lower specification limits, and σ is the standard deviation of the process output.
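A compact sketch of these two calculations (the component reliabilities and specification limits below are hypothetical numbers chosen for illustration):

```python
import math

def series_reliability(reliabilities):
    """R_s for a series system: the product of the component reliabilities."""
    return math.prod(reliabilities)

def process_capability(usl, lsl, sigma):
    """Process capability index: Cp = (USL - LSL) / (6 * sigma)."""
    return (usl - lsl) / (6 * sigma)

# Hypothetical numbers: three components in series, and a process whose
# specification limits span 6 units with a standard deviation of 0.5.
print(series_reliability([0.99, 0.95, 0.98]))
print(process_capability(10.0, 4.0, 0.5))
```

Note how quickly reliability erodes in a series system: even three fairly reliable components together fall below any single one of them, which is why the product form matters in practice.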

These instances highlight how Bernoulli’s theories of probability and statistical analysis are applied directly to the engineering and stock market domains, managing uncertainty and maximising results.

Originally published at http://science2be.wordpress.com on March 7, 2024.
