Independent and identically distributed random variables

In probability theory and statistics, a collection of random variables is independent and identically distributed (i.i.d.) if each random variable has the same probability distribution as the others and all are mutually independent. This property is usually abbreviated as i.i.d., iid, or IID. Herein, i.i.d. is used because it is the most prevalent.

In statistics, it is commonly assumed that observations in a sample are effectively i.i.d. The assumption (or requirement) that observations be i.i.d. tends to simplify the underlying mathematics of many statistical methods (see mathematical statistics and statistical theory). In practical applications of statistical modeling, however, the assumption may or may not be realistic. To partially test how realistic the assumption is on a given data set, the autocorrelation can be computed, lag plots drawn, or a turning point test performed; the simulation sketch below illustrates two of these checks. The generalization to exchangeable random variables is often sufficient and more easily met.

The i.i.d. assumption is important in the classical form of the central limit theorem, which states that the probability distribution of the sum (or average) of i.i.d. variables with finite variance approaches a normal distribution.

Often the i.i.d. assumption arises in the context of sequences of random variables. There, "independent and identically distributed" implies that an element in the sequence is independent of the random variables that came before it. In this way, an i.i.d. sequence differs from a Markov sequence, where the probability distribution for the n-th random variable is a function of the previous random variable in the sequence (for a first-order Markov sequence). An i.i.d. sequence does not imply that the probabilities for all elements of the sample space or event space must be the same. For example, repeated throws of loaded dice will produce a sequence that is i.i.d., despite the outcomes being biased.
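As a concrete illustration of the claims above, here is a minimal sketch in Python (assuming NumPy; the die probabilities, sample size, and batch sizes are illustrative choices, not from the source). It simulates throws of a loaded die, which are biased yet i.i.d., runs the lag-1 autocorrelation and turning point diagnostics mentioned earlier, and checks the central limit theorem claim on standardized batch means.

```python
import numpy as np

rng = np.random.default_rng(0)

# A loaded die: the outcomes are biased, yet repeated throws are still i.i.d.
faces = np.arange(1, 7)
probs = np.array([0.05, 0.05, 0.10, 0.10, 0.20, 0.50])
x = rng.choice(faces, size=5000, p=probs).astype(float)

# Diagnostic 1: lag-1 autocorrelation. For an i.i.d. sequence it is
# approximately Normal(0, 1/n), so |r1| far beyond 2/sqrt(n) is suspicious.
xc = x - x.mean()
r1 = (xc[:-1] * xc[1:]).sum() / (xc**2).sum()
print(f"lag-1 autocorrelation: {r1:+.4f}  (2/sqrt(n) = {2 / np.sqrt(len(x)):.4f})")

# Diagnostic 2: turning point test. For an i.i.d. sequence of n observations,
# the number of turning points T is asymptotically normal with
# E[T] = 2(n - 2)/3 and Var[T] = (16n - 29)/90. The test is stated for
# continuous data, so ties are dropped here as a crude workaround for a die.
d = np.diff(x)
d = d[d != 0]
n_eff = len(d) + 1
t = np.count_nonzero(d[:-1] * d[1:] < 0)
z = (t - 2 * (n_eff - 2) / 3) / np.sqrt((16 * n_eff - 29) / 90)
print(f"turning points: {t}, z-score under the i.i.d. hypothesis: {z:+.2f}")

# Central limit theorem: standardized means of i.i.d. throws should be
# approximately standard normal, so about 95% should fall in [-1.96, 1.96].
mu = (faces * probs).sum()
sigma = np.sqrt(((faces - mu) ** 2 * probs).sum())
means = x.reshape(100, 50).mean(axis=1)  # 100 batches of 50 throws each
zscores = (means - mu) / (sigma / np.sqrt(50))
print(f"batch means within +/-1.96: {np.mean(np.abs(zscores) <= 1.96):.2f}")
```

For these independent but biased throws, both diagnostics should come back unremarkable; feeding in a first-order Markov sequence instead would typically inflate the lag-1 autocorrelation well past the 2/sqrt(n) band.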
Definition

Suppose that the random variables $X$ and $Y$ are defined to assume values in $I \subseteq \mathbb{R}$. Let $F_X(x) = \operatorname{P}(X \leq x)$ and $F_Y(y) = \operatorname{P}(Y \leq y)$ be the cumulative distribution functions of $X$ and $Y$, respectively, and denote their joint cumulative distribution function by $F_{X,Y}(x,y) = \operatorname{P}(X \leq x \land Y \leq y)$.

Two random variables $X$ and $Y$ are identically distributed if and only if $F_X(x) = F_Y(x)$ for all $x \in I$.

Two random variables $X$ and $Y$ are independent if and only if $F_{X,Y}(x,y) = F_X(x) \cdot F_Y(y)$ for all $x, y \in I$. (See further Independence (probability theory) § Two random variables.)

Two random variables $X$ and $Y$ are i.i.d. if they are independent and identically distributed, i.e. if and only if

$$
\begin{aligned}
&F_X(x) = F_Y(x) && \forall x \in I \\
&F_{X,Y}(x,y) = F_X(x) \cdot F_Y(y) && \forall x, y \in I
\end{aligned}
\qquad \text{(Eq. 1)}
$$

The definition extends naturally to more than two random variables. We say that $n$ random variables $X_1, \ldots, X_n$ are i.i.d. if they are independent (see further Independence (probability theory) § More than two random variables) and identically distributed, i.e. if and only if

$$
\begin{aligned}
&F_{X_1}(x) = F_{X_k}(x) && \forall k \in \{1, \ldots, n\} \text{ and } \forall x \in I \\
&F_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdot \ldots \cdot F_{X_n}(x_n) && \forall x_1, \ldots, x_n \in I
\end{aligned}
\qquad \text{(Eq. 2)}
$$
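As a numerical sanity check on the definition, the following sketch (again assuming NumPy; the exponential distribution, sample size, and evaluation grid are arbitrary illustrative choices) estimates both conditions of Eq. 1 from simulated data: the empirical marginal CDFs of $X$ and $Y$ should agree, and the empirical joint CDF should factor into the product of the marginals.

```python
import numpy as np

rng = np.random.default_rng(1)

# Independent draws from one distribution, so X and Y are i.i.d. by design.
n = 20_000
x = rng.exponential(scale=2.0, size=n)
y = rng.exponential(scale=2.0, size=n)

def ecdf(sample, t):
    """Empirical CDF of `sample` evaluated at the points `t`."""
    return np.searchsorted(np.sort(sample), t, side="right") / len(sample)

grid = np.linspace(0.0, 10.0, 21)  # evaluation points within I

# Identically distributed: F_X(t) = F_Y(t) for all t (first line of Eq. 1).
fx, fy = ecdf(x, grid), ecdf(y, grid)
gap_marginal = np.max(np.abs(fx - fy))

# Independent: F_{X,Y}(s, t) = F_X(s) * F_Y(t) (second line of Eq. 1).
joint = np.array([[np.mean((x <= s) & (y <= t)) for t in grid] for s in grid])
gap_joint = np.max(np.abs(joint - np.outer(fx, fy)))

print(f"max |F_X - F_Y| on the grid:        {gap_marginal:.4f}")
print(f"max |F_XY - F_X * F_Y| on the grid: {gap_joint:.4f}")
# Both gaps are sampling noise of order 1/sqrt(n) and shrink as n grows.
```

If y were instead constructed from x (say, x plus noise), the joint gap would stop shrinking with n, flagging the failure of independence.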

[ "Random variable", "Sum of normally distributed random variables", "Algebra of random variables", "Maxwell's theorem", "Continuous mapping theorem", "Circular law" ]