Bayes' Theorem
Bayes' theorem provides a mathematical framework for updating the probability of a hypothesis as new evidence becomes available, making it central to both quant interviews and trading decision-making.
Conditional probability, written P(A|B), is the probability of event A occurring given that event B has already occurred. It is calculated as P(A|B) = P(A and B) / P(B). Conditional probability is fundamental to Bayes' theorem, Markov chains, and virtually all probabilistic reasoning in quantitative finance. It is one of the most frequently tested topics in quant trading interviews.
Conditional probability answers the question: "Given that I know something has happened, how does that change the probability of something else?"
Formally, the conditional probability of event A given event B is:
P(A|B) = P(A ∩ B) / P(B)
The vertical bar "|" is read as "given." P(A|B) means "the probability of A, given that B has occurred."
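The definition translates directly into code. Below is a minimal sketch in Python; the probabilities plugged in at the end are made-up numbers for illustration, not figures from this article.

def conditional_prob(p_a_and_b, p_b):
    """P(A|B) = P(A and B) / P(B); undefined when P(B) = 0."""
    if p_b == 0:
        raise ValueError("cannot condition on an event with zero probability")
    return p_a_and_b / p_b

# Illustrative numbers: P(A and B) = 0.10, P(B) = 0.40  ->  P(A|B) = 0.25
print(conditional_prob(0.10, 0.40))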
Conditional probability is the mathematical way to update beliefs based on new information, a concept that arises constantly in trading. When a stock's price drops 5%, what is the probability it drops another 5%? When an economic report surprises to the upside, what is the probability the Fed raises rates? These are all conditional probability questions.
Conditional probability is also the foundation for Bayes' theorem, Markov chains, and virtually all statistical inference, making it one of the most fundamental concepts in quantitative finance.
Several important concepts build on conditional probability:
Independence: Events A and B are independent if knowing B doesn't change the probability of A: P(A|B) = P(A). Equivalently, P(A ∩ B) = P(A) × P(B). For example, consecutive coin flips are independent: knowing the first flip was heads doesn't affect the probability of the second flip.
Multiplication rule: P(A ∩ B) = P(A|B) × P(B). This lets you compute joint probabilities from conditional probabilities.
Chain rule: For multiple events: P(A ∩ B ∩ C) = P(A|B ∩ C) × P(B|C) × P(C). This extends to any number of events.
Law of total probability: If B1, B2, ..., Bn form a partition of the sample space, then:
P(A) = ∑ P(A|Bi) × P(Bi)
This is essential for computing probabilities by conditioning on all possible scenarios. For example: "What is the probability of a profitable trade?" = P(profit|bull market) × P(bull market) + P(profit|bear market) × P(bear market).
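As a concrete sketch of the law of total probability, the short Python snippet below reproduces that profit calculation. The scenario probabilities and conditional profit probabilities are hypothetical numbers chosen for illustration, not figures from the article.

# Hypothetical inputs (assumptions, not data):
# P(bull) = 0.55, P(bear) = 0.45
# P(profit | bull) = 0.60, P(profit | bear) = 0.35
scenarios = {
    "bull": {"p_scenario": 0.55, "p_profit_given": 0.60},
    "bear": {"p_scenario": 0.45, "p_profit_given": 0.35},
}

# Law of total probability: P(profit) = sum of P(profit | scenario) * P(scenario)
p_profit = sum(s["p_profit_given"] * s["p_scenario"] for s in scenarios.values())
print(f"P(profit) = {p_profit:.4f}")  # 0.60*0.55 + 0.35*0.45 = 0.4875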
Example 1 – Card drawing:
You draw two cards from a standard deck without replacement. What is the probability both are aces?
P(1st ace) = 4/52 = 1/13
P(2nd ace | 1st ace) = 3/51 = 1/17
P(both aces) = (4/52) × (3/51) = 12/2652 = 1/221 ≈ 0.45%
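A quick Monte Carlo check of this answer (a sketch; the simulation only approximates the exact value 1/221 ≈ 0.45%):

import random

deck = ["A"] * 4 + ["x"] * 48      # 4 aces, 48 other cards
trials = 1_000_000
both_aces = 0
for _ in range(trials):
    first, second = random.sample(deck, 2)   # draw two cards without replacement
    if first == "A" and second == "A":
        both_aces += 1
print(both_aces / trials, 1 / 221)  # the two numbers should be close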
Example 2 – Interview question (classic):
I roll a fair die and tell you the result is even. What is the probability it's a 6?
P(6 | even) = P(6 ∩ even) / P(even) = (1/6) / (3/6) = 1/3
Without the information, P(6) = 1/6. With the information that the roll is even, the sample space shrinks from {1,2,3,4,5,6} to {2,4,6}, and 6 is one of three equally likely outcomes.
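The same shrinking of the sample space shows up in a simulation: keep only the rolls that came up even and count how many of those are sixes. A short sketch:

import random

trials = 1_000_000
even_rolls = sixes = 0
for _ in range(trials):
    roll = random.randint(1, 6)
    if roll % 2 == 0:        # condition on the evidence "the roll is even"
        even_rolls += 1
        if roll == 6:
            sixes += 1
print(sixes / even_rolls)    # should be close to 1/3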
Example 3 – Trading context:
Historical data shows: P(stock up on day 2 | stock up on day 1) = 0.53 and P(stock up on day 2 | stock down on day 1) = 0.49. If P(stock up on day 1) = 0.52, what is P(stock up on day 2)?
By the law of total probability:
P(up on day 2) = P(up | up on day 1) × P(up on day 1) + P(up | down on day 1) × P(down on day 1)
= 0.53 × 0.52 + 0.49 × 0.48 = 0.2756 + 0.2352 = 0.5108
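The same arithmetic in code, using the numbers from the example:

p_up_day1 = 0.52
p_up2_given_up1 = 0.53
p_up2_given_down1 = 0.49

# Law of total probability, conditioning on what happened on day 1
p_up_day2 = p_up2_given_up1 * p_up_day1 + p_up2_given_down1 * (1 - p_up_day1)
print(f"P(up on day 2) = {p_up_day2:.4f}")  # 0.5108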
Conditional probability is woven into every aspect of quantitative trading. The two key relationships to remember:
Conditional probability: the probability of A given B equals the probability of both A and B divided by the probability of B.
Law of total probability: the unconditional probability of A is the weighted sum of conditional probabilities across all possible scenarios B_i.
Conditional probability is the most tested topic in quant interviews. Every interview at Jane Street, SIG, Optiver, and Citadel includes conditional probability problems. Mastering the definition, independence, the law of total probability, and Bayes' theorem is non-negotiable for quant trading roles.
Practice with our Jane Street interview questions. Book a free consultation for personalized probability prep.
Unconditional (marginal) probability P(A) is the probability of A without any additional information. Conditional probability P(A|B) is the probability of A given that you know B has occurred. Conditional probability is always relative to some known information. For example, P(rain) = 30% (unconditional), but P(rain | dark clouds) = 70% (conditional on observing dark clouds).
Events A and B are independent if P(A|B) = P(A), or equivalently, P(A ∩ B) = P(A) × P(B). In practice, independence is often assumed based on physical reasoning (e.g., separate coin flips) or tested statistically (e.g., checking whether stock returns on consecutive days are correlated). True independence is rare in financial markets; most assets are at least weakly correlated.
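One way to probe this statistically is the lag-1 correlation of daily returns: if consecutive days were independent, it would be close to zero. The sketch below uses simulated returns with a small built-in autocorrelation as a stand-in for real data (the AR(1) coefficient of 0.05 is an arbitrary assumption, not a market fact).

import numpy as np

rng = np.random.default_rng(0)
phi, n = 0.05, 100_000                 # assumed autocorrelation and sample size
noise = rng.standard_normal(n)
returns = np.empty(n)
returns[0] = noise[0]
for t in range(1, n):
    returns[t] = phi * returns[t - 1] + noise[t]   # simulated AR(1) returns

# Independence of consecutive returns would imply lag-1 correlation near 0
lag1_corr = np.corrcoef(returns[:-1], returns[1:])[0, 1]
print(f"lag-1 autocorrelation: {lag1_corr:.3f}")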
The most common mistake is confusing P(A|B) with P(B|A). These are generally different quantities. P(it rained | ground is wet) is not the same as P(ground is wet | it rained). This confusion leads to the 'prosecutor's fallacy' and errors in medical testing interpretation. Bayes' theorem provides the correct way to convert between the two.
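For reference, Bayes' theorem states P(A|B) = P(B|A) × P(A) / P(B), with P(B) expanded by the law of total probability. The sketch below works through a hypothetical medical-test example (the prevalence, sensitivity, and false positive rate are made-up numbers) to show how different P(disease | positive) can be from P(positive | disease).

p_disease = 0.01             # P(A): assumed prevalence
p_pos_given_disease = 0.95   # P(B|A): assumed sensitivity
p_pos_given_healthy = 0.05   # assumed false positive rate

# P(B) via the law of total probability
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(positive | disease) = {p_pos_given_disease:.2f}")
print(f"P(disease | positive) = {p_disease_given_pos:.3f}")   # about 0.16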
Conditional probability is the foundation of many ML algorithms. Naive Bayes classifiers use P(class | features) via Bayes' theorem. Hidden Markov Models use conditional probabilities for sequence modeling. Logistic regression models P(Y=1 | X). In quant finance ML, conditional probability models predict returns conditional on features (trading signals).