Basic Information

Instructor: qz

Lecture 10: Probability Theory

The substantive content can be skipped in lecture and reviewed on your own afterwards.
With working hands and a brain, scoring 80–90 is not a problem.
If you come to the lectures you will certainly understand; if you don't come, you will understand anyway.
Homework = something meaningless.

Probability theory is the applied mathematics that studies how likely random events are to occur.

Set Theory

  • Union: $\cup$
  • Intersection: $\cap$
  • Complement: $A^C$
  • Mutually Exclusive
  • Collectively Exhaustive
  • Partition

 

Applying Set Theory to Probability

  • Random Experiment
  • Sample Space
  • Events

 

Probability Axioms

  • A1: $P[A]\ge0$
  • A2: $P[S]=1$
  • A3: for mutually exclusive events, $P[A_1\cup A_2\cup\cdots]=P[A_1]+P[A_2]+\cdots$
  • $P[A\cup B]=P[A]+P[B]-P[A\cap B]$ (verified numerically below)
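
The inclusion–exclusion identity can be sanity-checked by enumeration on a small equally likely sample space. A minimal Python sketch, assuming a fair six-sided die and two arbitrary illustrative events:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}  # fair die: equally likely outcomes (assumption)

def P(event):
    """Probability of an event (a subset of S) under equally likely outcomes."""
    return Fraction(len(event & S), len(S))

A = {2, 4, 6}  # "roll is even" (illustrative)
B = {4, 5, 6}  # "roll is at least 4" (illustrative)

# Inclusion-exclusion: P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))  # 2/3
```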

 

Discrete Sample Space

$S=\lbrace a_1,a_2,\dots,a_n\rbrace$

For equally likely outcomes, $P[\lbrace a_i\rbrace]=1/n$

 

Conditional Probability

Definition

$P[A|B]=\displaystyle\frac{P[A\cap B]}{P[B]}$
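
On an equally likely sample space, conditioning on $B$ just restricts the counting to outcomes in $B$. A minimal sketch, again assuming a fair die and illustrative events:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}  # fair die (assumption)
A = {2, 4, 6}           # "even" (illustrative)
B = {4, 5, 6}           # "at least 4" (illustrative)

def P(event):
    return Fraction(len(event & S), len(S))

# P[A|B] = P[A ∩ B] / P[B]: the proportion of B-outcomes that also lie in A
print(P(A & B) / P(B))  # 2/3: of the outcomes {4, 5, 6}, two are even
```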

 

Theorem

  • $P[A|B]\ge0$
  • $P[B|B]=1$
  • If $\lbrace A_i\rbrace$ is a partition of $A$, then $P[A|B]=P[A_1|B]+P[A_2|B]+\cdots$

 

Partitions & the Law of Total Probability

If the partition is $B=\lbrace B_1,B_2,\dots,B_n\rbrace$ and $C_i=A\cap B_i$, then $A=C_1\cup C_2\cup\cdots\cup C_n$, so $P[A]=\displaystyle\sum_{i=1}^{n}P[A|B_i]P[B_i]$ (the law of total probability).

Bayes' Law

$P[B|A]=\displaystyle\frac{P[A|B]P[B]}{P[A]}$
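
The two results combine naturally: the law of total probability supplies the denominator $P[A]$ in Bayes' law. A sketch with hypothetical diagnostic-test numbers (the prevalence and error rates below are illustrative assumptions, not from the lecture):

```python
# B: has the condition; A: test is positive (hypothetical numbers)
p_B = 0.01             # P[B]: prior / prevalence
p_A_given_B = 0.99     # P[A|B]: true positive rate
p_A_given_notB = 0.05  # P[A|B^c]: false positive rate

# Law of total probability over the partition {B, B^c}:
p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)

# Bayes' law: P[B|A] = P[A|B] P[B] / P[A]
print(p_A_given_B * p_B / p_A)  # ≈ 0.167
```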

 

Independence

A and B are independent if and only if $P(A\cap B)=P(A)P(B)$, which is equivalent to $P(A|B)=P(A)$ and $P(B|A)=P(B)$.

Independence vs. Mutual Exclusivity

Independence and mutual exclusivity are not synonyms.

They coincide only when $P(A)P(B)=0$: mutually exclusive events have $P(A\cap B)=0$, which can equal $P(A)P(B)$ only if at least one event has probability zero.
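
The distinction is easy to see numerically. A sketch on the fair-die model, with events chosen for illustration:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}  # fair die (assumption)

def P(event):
    return Fraction(len(event & S), len(S))

A = {2, 4, 6}   # "even"
B = {1, 2}      # "at most 2": independent of A
C = {1, 3, 5}   # "odd": mutually exclusive with A

print(P(A & B) == P(A) * P(B))  # True: 1/6 == (1/2)(1/3)
print(P(A & C), P(A) * P(C))    # 0 vs 1/4: exclusive but not independent
```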

 

Random Variables

$X: S\to S_X$

$S_X$: the range of the random variable $X$

A random variable maps each sample outcome $s$ to the corresponding value $X(s)$.

 

Discrete Random Variables

Probability Mass Function

Definition: $P_X(x)=P[X=x]$

 

Classical Distributions

| Name | Meaning | PMF | Expected Value | Variance |
| --- | --- | --- | --- | --- |
| Bernoulli(p) | one trial, result is 0 or 1 | $\begin{cases}1-p&,x=0\\p&,x=1\end{cases}$ | $p$ | $p(1-p)$ |
| Geometric(p) | number of trials until the result occurs once | $p(1-p)^{x-1},\ x=1,2,\dots$ | $\displaystyle\frac{1}{p}$ | $\displaystyle\frac{1-p}{p^2}$ |
| Binomial(n, p) | number of times the result occurs in $n$ trials | $\dbinom{n}{k}p^k(1-p)^{n-k},\ k=0,1,\dots,n$ | $np$ | $np(1-p)$ |
| Pascal(k, p) | number of trials until the result occurs $k$ times | $\displaystyle\binom{x-1}{k-1}p^k(1-p)^{x-k},\ x=k,k+1,\dots$ | $\displaystyle\frac{k}{p}$ | $\displaystyle\frac{k(1-p)}{p^2}$ |
| Discrete Uniform(k, l) | every integer in $[k,l]$ equally likely | $\displaystyle\frac{1}{l-k+1}$ | $\displaystyle\frac{k+l}{2}$ | $\displaystyle\frac{(l-k)(l-k+2)}{12}$ |
| Poisson(a) | number of events occurring in a fixed interval, independently at a known average rate $a$ | $\displaystyle\frac{a^xe^{-a}}{x!},\ x=0,1,\dots$ | $a$ | $a$ |
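
The table's moment formulas can be spot-checked by simulation. A Monte Carlo sketch for the Binomial row (the parameters n = 10, p = 0.3 are arbitrary illustrative choices):

```python
import random

random.seed(0)
n, p, trials = 10, 0.3, 100_000

# Each sample counts successes in n Bernoulli(p) trials -> Binomial(n, p)
samples = [sum(random.random() < p for _ in range(n)) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(mean, n * p)           # both ≈ 3.0
print(var, n * p * (1 - p))  # both ≈ 2.1
```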

Expected Value: $E[X]=\mu_X=\displaystyle\sum_{x\in S_X} xP_X(x)$
Variance: $Var[X]=E[(X-\mu_X)^2]=E[X^2]-\mu_X^2$
Standard Deviation: $\sigma_X=\sqrt{Var[X]}$
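
These definitions translate directly into code. A minimal sketch that computes $E[X]$ and $Var[X]$ from a PMF, assuming a fair die for concreteness:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair-die PMF (assumption)

mu = sum(x * p for x, p in pmf.items())               # E[X]
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # Var[X]
print(mu, var)  # 7/2 35/12
```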

Cumulative Distribution Function (CDF)

Definition: $F_X(x)=P[X\le x]=\displaystyle\sum_{x_i\le x} P[X=x_i]$

Derived Random Variable

$Y=g(X),\quad E[Y]=\displaystyle\sum_{x\in S_X} g(x)P_X(x)$

  • $E[aX+b]=aE[X]+b$
  • $Var[aX+b]=a^2Var[X]$ (both identities are checked numerically below)
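
Both identities can be confirmed exactly on a small PMF. A sketch reusing the fair-die PMF, with arbitrary illustrative constants $a=2$, $b=3$:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair-die PMF (assumption)
a, b = 2, 3                                     # illustrative constants

def E(g):
    """Expected value of g(X) under the PMF."""
    return sum(g(x) * p for x, p in pmf.items())

mu = E(lambda x: x)
var = E(lambda x: (x - mu) ** 2)

mu_Y = E(lambda x: a * x + b)
var_Y = E(lambda x: (a * x + b - mu_Y) ** 2)
assert mu_Y == a * mu + b     # E[aX+b] = aE[X] + b
assert var_Y == a ** 2 * var  # Var[aX+b] = a^2 Var[X]
print(mu_Y, var_Y)            # 10 35/3
```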

Continuous Random Variables

CDF: $F_X(x)=P[X\le x]$

  • $P[x_1\le X\le x_2]=\displaystyle\int_{x_1}^{x_2}f_X(x)\mathrm{d}x=F_X(x_2)-F_X(x_1)$

PDF: $f_X(x)=\displaystyle\frac{\mathrm{d}F_X(x)}{\mathrm{d}x}$

  • $\displaystyle\int_{-\infty}^{+\infty}f_X(x)\mathrm{d}x=1$

Uniform Random Variables

X is uniform on $(a,b)$; PDF: $f_X(x)=\displaystyle\frac{1}{b-a},\ x\in(a,b)$ (and 0 otherwise)

CDF: $F_X(x)=\displaystyle\frac{x-a}{b-a},\ x\in(a,b)$ (0 for $x\le a$, 1 for $x\ge b$)

$E[X]=(a+b)/2$

$Var[X]=(b-a)^2/12$
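
The uniform formulas can be checked by numeric integration of the constant PDF. A sketch with illustrative endpoints a = 2, b = 5:

```python
a, b = 2.0, 5.0  # illustrative endpoints
N = 100_000
dx = (b - a) / N
xs = [a + (i + 0.5) * dx for i in range(N)]  # midpoint rule
f = 1 / (b - a)                              # uniform PDF on (a, b)

total = sum(f * dx for _ in xs)
mean = sum(x * f * dx for x in xs)
var = sum((x - mean) ** 2 * f * dx for x in xs)
print(total)  # ≈ 1
print(mean)   # ≈ (a+b)/2 = 3.5
print(var)    # ≈ (b-a)^2/12 = 0.75
```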

Gaussian / Normal Random Variables

X is Gaussian($\mu$, $\sigma$); PDF: $f_X(x)=\displaystyle\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(x-\mu)^2}{2\sigma^2}}$

CDF: $F_X(x)=\Phi\left(\displaystyle\frac{x-\mu}{\sigma}\right)$

Define: $\Phi(x)=\displaystyle\frac{1}{\sqrt{2\pi}}\int_{-\infty}^xe^{-\frac{t^2}{2}}\mathrm{d}t$

$E[X]=\mu$

$Var[X]=\sigma^2$

Standard Normal Random Variables

A Gaussian random variable with $\mu=0,\ \sigma=1$.

X is standard normal; PDF: $f_X(x)=\displaystyle\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}$

CDF: $F_X(x)=\Phi(x)=\displaystyle\frac{1}{\sqrt{2\pi}}\int_{-\infty}^xe^{-\frac{t^2}{2}}\mathrm{d}t$

$E[X]=0$

$Var[X]=1$

To evaluate a Gaussian($\mu$, $\sigma$) probability at $x=x_0$, standardize to $x^\prime=(x_0-\mu)/\sigma$ and use the standard normal CDF (see the sketch below).

  • $\Phi(z)+\Phi(-z)=1$
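
$\Phi$ has no closed form, but it can be computed from the error function via $\Phi(x)=\frac{1}{2}\left(1+\operatorname{erf}(x/\sqrt{2})\right)$. A sketch of the standardization recipe (the values $\mu=100$, $\sigma=15$, $x_0=130$ are illustrative):

```python
import math

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

mu, sigma, x0 = 100.0, 15.0, 130.0  # illustrative numbers
z = (x0 - mu) / sigma               # standardize: z = (x0 - mu) / sigma

print(Phi(z))            # P[X <= 130] = Phi(2) ≈ 0.977
print(Phi(z) + Phi(-z))  # symmetry: Phi(z) + Phi(-z) = 1
```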

Pairs of Random Variables

Joint Probability Mass Function (PMF)

$P_{X,Y}(x,y)=P[X=x,Y=y]$

A joint PMF is often presented as a table of $P_{X,Y}(x,y)$ values.

Joint CDF

$F_{X,Y}(x,y)=P[X\le x,Y\le y]$

Joint PDF

$f_{X,Y}(x,y)=\displaystyle\frac{\partial^2F_{X,Y}(x,y)}{\partial x\partial y}$

Marginal PMF

$P_X(x)=\displaystyle\sum_{y\in S_Y}P_{X,Y}(x,y)$

$P_Y(y)=\displaystyle\sum_{x\in S_X}P_{X,Y}(x,y)$
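
Marginalizing is just summing the joint table along one axis. A sketch with a hypothetical joint PMF (the four values below are an illustrative assumption, chosen to sum to 1):

```python
from fractions import Fraction

joint = {  # hypothetical joint PMF P_{X,Y}(x, y)
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8),
}

P_X, P_Y = {}, {}
for (x, y), p in joint.items():
    P_X[x] = P_X.get(x, 0) + p  # sum over y -> marginal of X
    P_Y[y] = P_Y.get(y, 0) + p  # sum over x -> marginal of Y

print(P_X)  # {0: Fraction(1, 2), 1: Fraction(1, 2)}
print(P_Y)  # {0: Fraction(3, 8), 1: Fraction(5, 8)}
```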

Marginal PDF

$f_X(x)=\displaystyle\int_{-\infty}^{\infty}f_{X,Y}(x,y)\mathrm{d}y$

Covariance

$Cov[X,Y]=E[(X-\mu_X)(Y-\mu_Y)]$

  • $Cov[X,Y]=E[XY]-\mu_X\mu_Y$ (a numeric sketch follows the list below)

If two variables tend to show

  • similar behaviour, the covariance is positive
  • opposite behaviour, the covariance is negative
  • uncorrelated behaviour, the covariance is zero
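
Computing the covariance of the same hypothetical joint PMF used in the marginal example shows the sign convention in action:

```python
from fractions import Fraction

joint = {  # same hypothetical joint PMF as in the marginal example
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8),
}

mu_X = sum(x * p for (x, y), p in joint.items())
mu_Y = sum(y * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

cov = E_XY - mu_X * mu_Y  # Cov[X,Y] = E[XY] - mu_X mu_Y
print(cov)                # 1/16 > 0: X and Y tend to move together
```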

Correlation

The correlation coefficient: $\rho_{X,Y}=\displaystyle\frac{Cov[X,Y]}{\sigma_X\sigma_Y}$

Independence

$X$ and $Y$ are independent if and only if $P_{X,Y}(x,y)=P_X(x)P_Y(y)$ for all $x,y$ (for continuous pairs, $f_{X,Y}(x,y)=f_X(x)f_Y(y)$).
