Notes@HKU by Jax

Probabilities

Combinations

The formula for combinations, also known as the binomial coefficient, is given by:

\binom{n}{r} = \frac{n!}{r!(n-r)!}

This represents the number of ways to choose r items from a set of n distinct items, without regard to their order.

Pascal's rule

Pascal's rule is given by \binom{n+1}{r} = \binom{n}{r} + \binom{n}{r-1}
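Below is a minimal Python sketch (values chosen arbitrarily) that evaluates the factorial formula using only the standard library and checks Pascal's rule numerically:

```python
from math import comb, factorial

# Binomial coefficient from the factorial formula: n! / (r! * (n-r)!)
def binomial(n: int, r: int) -> int:
    return factorial(n) // (factorial(r) * factorial(n - r))

assert binomial(5, 2) == comb(5, 2) == 10  # math.comb is the built-in equivalent

# Pascal's rule: C(n+1, r) == C(n, r) + C(n, r-1), checked for one pair
n, r = 7, 3
assert comb(n + 1, r) == comb(n, r) + comb(n, r - 1)
```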

Probability measures the likelihood of an event happening. It assigns a numerical value between 0 and 1 to an event, where 0 represents an impossible event and 1 indicates a certain event.

For example, the probability of flipping a fair coin and getting heads is 0.5, as there are two equally likely outcomes (heads or tails).
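As a rough illustration (simulation size chosen arbitrarily), a Monte Carlo estimate of this probability converges towards 0.5 as the number of flips grows:

```python
import random

# Estimate P(heads) for a fair coin by simulating many flips.
flips = 100_000
heads = sum(random.random() < 0.5 for _ in range(flips))
print(heads / flips)  # close to 0.5
```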

The key to understanding probabilities (or anything else) is practice.

Infinite sets

| Set | Description | Example |
| --- | --- | --- |
| \mathbb{Z} | Integers | -1, 0, 1 |
| \mathbb{N} | Natural numbers | 0, 1, 2 |
| \mathbb{R} | Real numbers | any number, e.g. \frac{4}{7}, 0.1, 1, \pi |
| \mathbb{Q} | Rational numbers | any number expressible as a fraction, e.g. \frac{22}{7} |
| \mathbb{C} | Complex numbers | a+bi |

Venn diagrams

Venn diagrams illustrate concepts like intersections and unions, making them a good way to visualize probabilities. Overlapping regions indicate elements that belong to multiple sets, while non-overlapping regions represent elements unique to a single set.

In a typical diagram, two circles are drawn to represent sets A and B, and the overlapping region is labelled as the intersection of A and B, denoted by A \cap B.

Probability notation and event types

| A and B | A or B | Not A |
| --- | --- | --- |
| Intersection (A \cap B) | Union (A \cup B) | Complement (A^c) |
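These operations map directly onto Python's built-in set type; a small sketch with made-up sets (the universe U is an assumption needed for the complement):

```python
# Made-up example sets; U is the assumed universe for the complement.
U = set(range(1, 11))   # universe: 1..10
A = {1, 2, 3, 4, 5}
B = {4, 5, 6, 7}

print(A & B)  # intersection "A and B": {4, 5}
print(A | B)  # union "A or B": {1, 2, 3, 4, 5, 6, 7}
print(U - A)  # complement "not A": {6, 7, 8, 9, 10}
```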

Conditional probabilities

The probability of A given that B occurs: P(A \mid B) = \frac{P(A \cap B)}{P(B)}
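A worked sketch with a fair six-sided die (the events are my own illustration), computing P(A \mid B) exactly with fractions:

```python
from fractions import Fraction

# One roll of a fair six-sided die.
A = {5, 6}      # event A: roll at least 5
B = {2, 4, 6}   # event B: roll is even

def P(event):
    return Fraction(len(event), 6)  # six equally likely outcomes

# P(A | B) = P(A and B) / P(B) = (1/6) / (1/2) = 1/3
print(P(A & B) / P(B))
```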

Independent events

A and B are said to be independent if P(A) \times P(B) = P(A \cap B).

Independence means that the occurrence of one event has no influence on the probability of the other.
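A sketch checking the definition on two fair-die rolls (events chosen for illustration); the first roll carries no information about the second, so the equality holds:

```python
from fractions import Fraction
from itertools import product

# Two rolls of a fair die: 36 equally likely ordered pairs.
sample = list(product(range(1, 7), repeat=2))
A = {s for s in sample if s[0] == 6}       # first roll is a 6
B = {s for s in sample if s[1] % 2 == 0}   # second roll is even

def P(event):
    return Fraction(len(event), len(sample))

print(P(A) * P(B) == P(A & B))  # True: A and B are independent
```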

Random variables

Notation

Random variables are denoted with capital letters, e.g. X.

The possible outcomes are denoted with lowercase letters, e.g. x.

The probability that the outcome of X is x is denoted by P(X=x).

Expected value and Variance

E(X^n) = \sum(x^n \cdot P(X=x)); for n = 1 this gives the expected value E(X), which represents the mean value (outcome) of the random variable.

Var(X) = E(X^2) - E(X)^2 gives the variance, which is a measure of the variability of the random variable's outcomes.
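A minimal sketch computing both quantities exactly from an assumed PMF (a fair six-sided die):

```python
from fractions import Fraction

# Assumed PMF for illustration: a fair six-sided die.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

E  = sum(x * p for x, p in pmf.items())      # E(X)   = 7/2
E2 = sum(x**2 * p for x, p in pmf.items())   # E(X^2) = 91/6
var = E2 - E**2                              # Var(X) = 35/12

print(E, var)
```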

Operations of E(X) and Var(X)

E(X+Y) = E(X) + E(Y); by linearity, any addition or subtraction inside E(\cdot) can be expanded term by term.

If XX and YY are independent:

Var(X+Y) = Var(X) + Var(Y), \quad E(XY) = E(X) \cdot E(Y)

For Y = aX + b: \quad E(Y) = aE(X) + b, \quad Var(Y) = a^2 Var(X)
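A quick numerical check of the linear-transformation rules, reusing the fair-die PMF (a and b are arbitrary constants):

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair six-sided die
a, b = 3, 2                                     # arbitrary constants

def E(f):
    return sum(f(x) * p for x, p in pmf.items())

EX   = E(lambda x: x)
VarX = E(lambda x: x**2) - EX**2
EY   = E(lambda x: a*x + b)
VarY = E(lambda x: (a*x + b)**2) - EY**2

assert EY == a*EX + b        # E(aX + b) = aE(X) + b
assert VarY == a**2 * VarX   # Var(aX + b) = a^2 Var(X)
```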

Joint random variables

Joint random variables are in the form P(X=x, Y=y). We can visualize the joint distribution in the following way:

| Y \ X | x_1 | x_2 | Sum |
| --- | --- | --- | --- |
| y_1 | P(X=x_1, Y=y_1) | P(X=x_2, Y=y_1) | P(Y=y_1) |
| y_2 | P(X=x_1, Y=y_2) | P(X=x_2, Y=y_2) | P(Y=y_2) |
| Sum | P(X=x_1) | P(X=x_2) | 1 |

Note that summing a row or column gives the corresponding variable's marginal probability.

Expected value

E(X+Y) = \sum((x+y) \cdot P(X=x, Y=y))

E((XY)^n) = \sum((xy)^n \cdot P(X=x, Y=y))
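A sketch over a small assumed joint PMF, computing the marginals by row/column sums and E(X+Y) by summing over every cell:

```python
from fractions import Fraction

# Assumed joint PMF over X in {0, 1} and Y in {0, 1} (uniform cells).
joint = {
    (0, 0): Fraction(1, 4), (1, 0): Fraction(1, 4),
    (0, 1): Fraction(1, 4), (1, 1): Fraction(1, 4),
}

# Row/column sums give the marginal probabilities P(X=x) and P(Y=y).
P_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
P_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

# E(X+Y): weight each cell's (x+y) by its joint probability.
E_sum = sum((x + y) * p for (x, y), p in joint.items())
print(P_X, P_Y, E_sum)  # marginals are 1/2 each; E(X+Y) = 1
```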

Random variables of random variables (outcomes)

For a random variable \bar{X} modelled as the mean of n independent, identically distributed copies X_1, \dots, X_n of X:

\bar{X} = \frac{X_1 + X_2 + \dots + X_n}{n}

We can deduce that:

E(\bar{X}) = \frac{E(X_1) + \dots + E(X_n)}{n} = \frac{nE(X)}{n}

Var(\bar{X}) = \frac{Var(X_1) + \dots + Var(X_n)}{n^2} = \frac{nVar(X)}{n^2}

Expected value and Variance

E(\bar{X}) = E(X)

Var(\bar{X}) = \frac{Var(X)}{n}
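A simulation sketch (sample size and trial count chosen arbitrarily) showing the variance of the sample mean shrinking to Var(X)/n for fair-die rolls, where Var(X) = 35/12:

```python
import random
from statistics import mean, pvariance

# Average n fair-die rolls, repeated over many trials.
n, trials = 10, 20_000
means = [mean(random.randint(1, 6) for _ in range(n)) for _ in range(trials)]

print(mean(means))       # close to E(X) = 3.5
print(pvariance(means))  # close to Var(X)/n = (35/12)/10 ≈ 0.292
```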
