II. Sample Space (S). A simple event has one outcome; a compound event has more than one. The complement of event A is denoted A′. Union is denoted A∪B, read "A or B"; intersection is denoted A∩B, read "A and B". When A∩B = ∅, A and B are mutually exclusive ("disjoint") events, and then P(A∪B) = P(A) + P(B). Always P(S) = 1.
EX) Battery testing: simple events E1 = {S}, E2 = {FS}, E3 = {FFS}, ... where F = fail and S = success. If the probability of a particular battery being satisfactory is .99, then P(S) = P(E1) + P(E2) + P(E3) + ... = .99(1 + .01 + .01² + .01³ + ...), a geometric series summing to .99/(1 − .01) = 1.
Complement rule: for any event A there is the complement, the probability it won't happen, so P(A) = 1 − P(A′). For any event A, P(A) ≤ 1.
Probability of a union of two events: P(A∪B) = P(A) + P(B) − P(A∩B). For three events: P(A∪B∪C) = P(A) + P(B) + P(C) − P(A∩B) − P(A∩C) − P(B∩C) + P(A∩B∩C).
Equally likely outcomes: P(A) = N(A)/N. EX) 6 mystery books and 6 sci-fi books; the first three of each are hardcover, the other three paperback. Selecting one mystery and one sci-fi gives 36 possible outcomes; selecting a paperback of each can happen in 3·3 = 9 ways, so P(A) = 9/36 = .25.
Product rule: if the first object can be chosen in n1 ways and, for each such choice, the second object can be chosen in n2 ways, then the number of pairs is n1·n2.
Permutations (ordered subsets), denoted P_{k,n} = n!/(n−k)!, where n is the group size and k the subset size; e.g. P_{4,10} = 10!/(10−4)! = 10!/6! = 10·9·8·7 = 5040.
Combinations (unordered subsets), denoted (n choose k), read "n choose k": (n choose k) = P_{k,n}/k! = n!/(k!(n−k)!).
Conditional probability: the probability of A given B is P(A|B) = P(A∩B)/P(B).
Multiplication rule: P(A∩B) = P(A|B)·P(B).
Law of total probability: with A1, ..., Ak mutually exclusive and exhaustive events, P(B) = P(B|A1)·P(A1) + ... + P(B|Ak)·P(Ak).
Bayes' theorem (posterior probability): P(Aj|B) = P(Aj∩B)/P(B) = P(B|Aj)·P(Aj) / Σᵢ P(B|Ai)·P(Ai), for j = 1, ..., k.
Independence: A and B are independent when P(A|B) = P(A). In general P(B|A) = P(A|B)·P(B)/P(A). If A and B are independent, P(A∩B) = P(A)·P(B).
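The counting rules and Bayes' theorem in this section are easy to check with Python's math module; a minimal sketch, where the Bayes priors and likelihoods are illustrative numbers I made up, not values from the notes:

```python
from math import comb, perm, isclose

# Permutations: P_{k,n} = n!/(n-k)!; the notes' example P_{4,10} = 5040
assert perm(10, 4) == 5040

# Combinations: (n choose k) = P_{k,n} / k!
assert comb(10, 4) == perm(10, 4) // perm(4, 4)   # perm(4, 4) = 4! = 24

# Equally likely outcomes: 6 mystery and 6 sci-fi books, 3 paperbacks of each.
# Choosing one of each type: N = 6*6 = 36 outcomes; N(A) = 3*3 = 9 both-paperback.
p_both_paperback = (3 * 3) / (6 * 6)               # P(A) = N(A)/N = .25

# Bayes' theorem with hypothetical events A1..A3 (illustrative numbers only)
prior = [0.5, 0.3, 0.2]                 # P(A_i), mutually exclusive, exhaustive
likelihood = [0.1, 0.6, 0.9]            # P(B | A_i)
p_b = sum(p * l for p, l in zip(prior, likelihood))       # total probability
posterior = [p * l / p_b for p, l in zip(prior, likelihood)]
```

Since the A_i are exhaustive, the posteriors must sum to 1, which makes a handy sanity check on any Bayes calculation.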
Events A1, ..., An are mutually independent if for every subset, P(A_{i1} ∩ A_{i2} ∩ ... ∩ A_{ik}) = P(A_{i1})·P(A_{i2})·...·P(A_{ik}).
III. Random Variable (rv): a function with domain S and range the real numbers.
Bernoulli random variable: any rv whose only possible values are 0 and 1.
Discrete random variable: an rv whose possible values constitute a finite set or can be listed in an infinite sequence (countably infinite).
Continuous random variable: an rv whose possible values consist of all numbers in a single interval of the number line or in a disjoint union of such intervals, e.g. [0,10] ∪ [20,30], and for which no single value has positive probability: P(X = c) = 0.
Probability distribution, aka probability mass function (pmf), of a discrete rv: defined for every number x by p(x) = P(X = x) = P(all s in S such that X(s) = x); i.e., for every possible value x of the rv, the pmf gives the probability of observing that value when the experiment is performed. Conditions: p(x) ≥ 0 and Σ over all possible x of p(x) = 1. Example pmf table:
x     1   2   3   4   5   6
p(x)  .1  .1  .1  .1  .1  .5
Cumulative distribution function (cdf): F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y); for any number x, F(x) is the probability that X is at most x. A cdf table looks like the pmf table with cumulative entries; for the example above, F = .1, .2, .3, .4, .5, 1.
For any two numbers a and b with a ≤ b, P(a ≤ X ≤ b) = F(b) − F(a−), where "a−" is the largest possible X value strictly less than a.
Expected (mean) value of X, denoted E(X), μ_X, or just μ: E(X) = μ_X = Σ_{x in D} x·p(x).
Rules of expected value: E(aX + b) = a·E(X) + b, i.e. a·μ_X + b.
Variance: V(X) = Σ_{x in D} (x − μ)²·p(x) = E[(X − μ)²]. Standard deviation: σ_X = sqrt(σ²_X). Shortcut formula: V(X) = σ² = [Σ_{x in D} x²·p(x)] − μ² = E(X²) − [E(X)]².
Binomial experiment: consists of a sequence of n smaller trials where n is fixed, each trial results in one of the same two possible outcomes, the trials are independent, and the probability of success P(S) is constant from trial to trial, denoted p.
Binomial random variable X: X = number of successes among the n trials. Theorem: b(x; n, p) = (n choose x)·p^x·(1 − p)^(n−x) for x = 0, ..., n; 0 otherwise.
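The pmf, cdf, and moment formulas above can be worked through on the example table; a minimal sketch using that table's values:

```python
# pmf from the example table: x = 1..6, p(x) = .1, .1, .1, .1, .1, .5
pmf = {1: .1, 2: .1, 3: .1, 4: .1, 5: .1, 6: .5}
assert abs(sum(pmf.values()) - 1.0) < 1e-12        # pmf condition: Σ p(x) = 1

def cdf(x):
    """F(x) = P(X <= x) = Σ_{y: y <= x} p(y)."""
    return sum(p for y, p in pmf.items() if y <= x)

mu = sum(x * p for x, p in pmf.items())            # E(X) = Σ x·p(x)
ex2 = sum(x * x * p for x, p in pmf.items())       # E(X²)
var = ex2 - mu ** 2                                # shortcut: E(X²) - [E(X)]²

# P(2 <= X <= 4) = F(4) - F(2-) = F(4) - F(1)
p_2_to_4 = cdf(4) - cdf(1)
```

For this table E(X) works out to 4.5 and V(X) to 3.25, which illustrates how the heavy weight on x = 6 pulls the mean above the midpoint.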
The cdf of a binomial rv is B(x; n, p) = P(X ≤ x) = Σ_{y=0}^{x} b(y; n, p) for x = 0, ..., n. If X ~ Bin(n, p), then E(X) = np, V(X) = np(1 − p) = npq, and σ_X = sqrt(npq), where q = 1 − p.
Hypergeometric distribution: P(X = x) = h(x; n, M, N) = (M choose x)·(N−M choose n−x) / (N choose n), for integer x satisfying max(0, n − N + M) ≤ x ≤ min(n, M). Assume the population or set sampled consists of N individuals/objects, each characterized as a success or failure, with M successes; a sample of n individuals is selected without replacement in such a way that each subset of size n is equally likely to be chosen. E(X) = n·(M/N); V(X) = ((N − n)/(N − 1))·n·(M/N)·(1 − M/N).
Negative binomial distribution: based on an experiment satisfying the following: a sequence of independent trials, each resulting in either a success or a failure; the probability of success constant from trial to trial, so P(S on trial i) = p for i = 1, 2, 3, ...; and the experiment continues until a total of r successes is observed, where r is a specified positive integer. Pmf of a negative binomial rv X with parameters r = number of successes and p = P(success): nb(x; r, p) = (x + r − 1 choose r − 1)·p^r·(1 − p)^x for x = 0, 1, 2, .... E(X) = r(1 − p)/p; V(X) = r(1 − p)/p².
Poisson distribution: p(x; μ) = e^(−μ)·μ^x / x! for x = 0, 1, 2, 3, .... A discrete rv X has this distribution with parameter μ (μ > 0) if it has this pmf. For the Poisson distribution the mean equals the variance: E(X) = V(X) = μ.
Poisson process: applies to events that occur over time, under these assumptions: there exists a parameter α > 0 such that for any short interval Δt, the probability that exactly one event occurs is α·Δt + o(Δt); the probability of more than one event occurring during Δt is o(Δt); and the number of events occurring during Δt is independent of the number that occur prior to the interval. Then P_k(t) = e^(−αt)·(αt)^k / k!.
IV. Probability density functions. The pdf of X is a function f(x) such that for any two numbers a and b with a ≤ b, P(a ≤ X ≤ b) = ∫ from a to b of f(x) dx.
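Each discrete pmf in this section is a one-line function; a sketch using the standard formulas, with illustrative parameter values chosen so the stated mean formulas (np for the binomial, n·M/N for the hypergeometric) can be confirmed numerically:

```python
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    """b(x; n, p) = (n choose x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def hypergeom_pmf(x, n, M, N):
    """h(x; n, M, N) = (M choose x)(N-M choose n-x) / (N choose n)."""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

def poisson_pmf(x, mu):
    """p(x; mu) = e^(-mu) mu^x / x!."""
    return exp(-mu) * mu**x / factorial(x)

# Bin(10, .3): mean computed from the pmf should equal n*p = 3
b_mean = sum(x * binom_pmf(x, 10, 0.3) for x in range(11))

# Hypergeometric, n=5, M=10, N=20: support is max(0, n-N+M)=0 .. min(n, M)=5,
# and the mean should equal n*M/N = 2.5
h_mean = sum(x * hypergeom_pmf(x, 5, 10, 20) for x in range(6))
```

Summing x·p(x) over the support is exactly the E(X) = Σ x·p(x) definition from the previous section, so agreement with the closed-form means is a good cross-check.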
I.e., the probability that X takes on a value in [a, b] is the area above this interval and under the graph of the density function.
A continuous rv X is said to have the uniform distribution on interval [A, B] if the pdf of X is f(x; A, B) = 1/(B − A) for A ≤ x ≤ B, and 0 otherwise.
Cumulative distribution function: for a continuous rv X, defined for every number x by F(x) = P(X ≤ x) = ∫ from −∞ to x of f(y) dy.
Using F(x) to compute probabilities: let X be a continuous rv with pdf f(x) and cdf F(x); then for any number a, P(X > a) = 1 − F(a), and for any two numbers a and b with a < b, P(a ≤ X ≤ b) = F(b) − F(a).
Weibull distribution: parameters α, β (α > 0, β > 0); the pdf of X is f(x; α, β) = (α/β^α)·x^(α−1)·e^(−(x/β)^α) for x ≥ 0.
Lognormal distribution: the rv Y = ln(X) has a normal distribution. The resulting pdf of a lognormal rv with parameters μ and σ is f(x; μ, σ) = (1/(σ·x·sqrt(2π)))·e^(−(ln(x) − μ)²/(2σ²)) for x ≥ 0. E(X) = e^(μ + σ²/2); V(X) = e^(2μ + σ²)·(e^(σ²) − 1).
Beta distribution: parameters α, β, A, and B; the pdf of X is f(x; α, β, A, B) = (1/(B − A))·(Γ(α + β)/(Γ(α)·Γ(β)))·((x − A)/(B − A))^(α−1)·((B − x)/(B − A))^(β−1) for A ≤ x ≤ B.
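The defining property P(a ≤ X ≤ b) = ∫ f(x) dx can be checked numerically for any of these densities; a sketch for the Weibull pdf, where the shape α = 2 and scale β = 1.5 are arbitrary illustrative values and the closed-form cdf F(x) = 1 − e^(−(x/β)^α) (a standard identity, not stated in the notes) is used only as the check:

```python
from math import exp

def weibull_pdf(x, a, b):
    """f(x; α, β) = (α/β^α) x^(α-1) e^(-(x/β)^α), for x >= 0."""
    return (a / b**a) * x ** (a - 1) * exp(-((x / b) ** a))

def weibull_cdf(x, a, b):
    """Closed-form F(x) = 1 - e^(-(x/β)^α), used here only as a cross-check."""
    return 1 - exp(-((x / b) ** a))

alpha, beta = 2.0, 1.5          # illustrative shape/scale parameters
lo, hi, n = 0.5, 2.0, 100_000   # estimate P(0.5 <= X <= 2) by midpoint rule
h = (hi - lo) / n
integral = sum(weibull_pdf(lo + (i + 0.5) * h, alpha, beta) for i in range(n)) * h
# integral should agree with F(hi) - F(lo) to several decimal places
```

The same midpoint-rule check works for the uniform, lognormal, or beta pdfs; only the integrand changes.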
Posted on: Wed, 15 Oct 2014 02:45:58 +0000
