
Independence (probability theory)

In probability theory, two events are independent, statistically independent, or stochastically independent if the occurrence of one does not affect the probability of occurrence of the other (equivalently, does not affect the odds). Similarly, two random variables are independent if the realization of one does not affect the probability distribution of the other. The concept of independence extends to collections of more than two events or random variables, in which case the events are pairwise independent if each pair is independent of each other, and mutually independent if each event is independent of every combination of the other events.

Two events $A$ and $B$ are independent (often written as $A \perp B$ or $A \perp\!\!\!\perp B$) if and only if their joint probability equals the product of their probabilities:

$$\mathrm{P}(A \cap B) = \mathrm{P}(A)\,\mathrm{P}(B)$$     (Eq.1)

A finite set of events $\{A_i\}$ is pairwise independent if every pair of events satisfies

$$\mathrm{P}(A_m \cap A_k) = \mathrm{P}(A_m)\,\mathrm{P}(A_k)$$     (Eq.2)

and mutually independent if every finite subcollection of events $B_1, \ldots, B_k$ satisfies

$$\mathrm{P}\!\left(\bigcap_{i=1}^{k} B_i\right) = \prod_{i=1}^{k} \mathrm{P}(B_i)$$     (Eq.3)

Two random variables $X$ and $Y$ are independent if and only if their joint cumulative distribution function factors into the product of the marginals:

$$F_{X,Y}(x,y) = F_X(x)\,F_Y(y) \quad \text{for all } x, y$$     (Eq.4)

and $n$ random variables $X_1, \ldots, X_n$ are mutually independent if and only if

$$F_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = F_{X_1}(x_1) \cdot \ldots \cdot F_{X_n}(x_n) \quad \text{for all } x_1, \ldots, x_n$$     (Eq.5)

The same factorization defines independence of two random vectors $\mathbf{X}$ and $\mathbf{Y}$:

$$F_{\mathbf{X},\mathbf{Y}}(\mathbf{x},\mathbf{y}) = F_{\mathbf{X}}(\mathbf{x}) \cdot F_{\mathbf{Y}}(\mathbf{y}) \quad \text{for all } \mathbf{x}, \mathbf{y}$$     (Eq.6)

A stochastic process $\{X_t\}$ has mutually independent values if, for every choice of times $t_1, \ldots, t_n$,

$$F_{X_{t_1},\ldots,X_{t_n}}(x_1,\ldots,x_n) = F_{X_{t_1}}(x_1) \cdot \ldots \cdot F_{X_{t_n}}(x_n) \quad \text{for all } x_1, \ldots, x_n$$     (Eq.7)

and two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are independent if, for every choice of times $t_1, \ldots, t_n$,

$$F_{X_{t_1},\ldots,X_{t_n},Y_{t_1},\ldots,Y_{t_n}}(x_1,\ldots,x_n,y_1,\ldots,y_n) = F_{X_{t_1},\ldots,X_{t_n}}(x_1,\ldots,x_n) \cdot F_{Y_{t_1},\ldots,Y_{t_n}}(y_1,\ldots,y_n) \quad \text{for all } x_1, \ldots, x_n, y_1, \ldots, y_n$$     (Eq.8)

Why Eq.1 defines independence is made clear by rewriting it with conditional probabilities: when $\mathrm{P}(B) > 0$,

$$\mathrm{P}(A \mid B) = \frac{\mathrm{P}(A \cap B)}{\mathrm{P}(B)} = \mathrm{P}(A),$$

so conditioning on $B$ leaves the probability of $A$ unchanged.
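The product rule of Eq.1 can be checked exactly on a small finite sample space. The sketch below (an illustration not taken from the article) uses a single fair six-sided die, with $A$ = "the roll is even" and $B$ = "the roll is at most 4"; exact fractions avoid floating-point comparison issues.

```python
from fractions import Fraction

# Sample space of a fair six-sided die.
omega = set(range(1, 7))

# Hypothetical events chosen for illustration:
A = {w for w in omega if w % 2 == 0}   # even rolls: {2, 4, 6}
B = {w for w in omega if w <= 4}       # rolls at most 4: {1, 2, 3, 4}

def prob(event):
    """Uniform probability of an event: |event| / |omega|, kept exact."""
    return Fraction(len(event), len(omega))

p_joint = prob(A & B)            # P(A ∩ B) = 2/6 = 1/3
p_product = prob(A) * prob(B)    # (1/2) * (2/3) = 1/3

# Eq.1 holds, so A and B are independent.
print(p_joint == p_product)  # True

# The conditional-probability form agrees: P(A | B) = P(A ∩ B) / P(B) = P(A).
print(p_joint / prob(B) == prob(A))  # True
```

Changing either event (for instance, taking $B$ = "the roll is at most 3") breaks the factorization, showing that independence is a property of the specific pair of events, not of the sample space.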
