Probability and Statistics Assignment Solution - University

Added on 2022/09/26
Homework Assignment
Summary
This document presents a solution to a probability and statistics homework assignment. The solution covers mutually exclusive events, indicator random variables, cumulative distribution functions, probability density functions, geometric and Poisson distributions, and maximum likelihood estimation, and addresses independence, joint probability, conditional probability, moment-generating functions, lognormal distributions, and method-of-moments estimation, with detailed calculations and derivations for each problem.
1.
a. P(A) = P(B) = P(C) = 1/4

Since A and B are mutually exclusive,
P(A ∪ B) = P(A) + P(B) = 1/4 + 1/4 = 1/2
C is independent of A and B. Let D = A ∪ B, so P(D) = 1/2. Then, since C is independent of D,
P(D ∪ C) = P(D) + P(C) − P(D)P(C)
P(D ∪ C) = 1/2 + 1/4 − (1/2)(1/4) = 5/8
Hence
P(A ∪ B ∪ C) = 5/8
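The answer in part (a) can be sanity-checked with a short Monte Carlo sketch; the event probabilities and the disjoint/independent structure are taken from the problem, while the simulation itself is illustrative only:

```python
import random

def trial():
    # A, B, C each have probability 1/4; A and B are made mutually
    # exclusive by carving them out of one uniform draw, and C is
    # generated independently.
    u = random.random()
    a = u < 0.25
    b = 0.25 <= u < 0.50   # disjoint from A by construction
    c = random.random() < 0.25
    return a or b or c

random.seed(0)
n = 200_000
est = sum(trial() for _ in range(n)) / n
print(est)  # close to 5/8 = 0.625
```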
b. Joint probability of X and Y, where X = (A ∪ B) and Y = (B ∩ C):
P(X) = P(A ∪ B) = P(A) + P(B) = 1/4 + 1/4 = 1/2
P(Y) = P(B ∩ C) = P(B)P(C) = (1/4)(1/4) = 1/16
Joint probability function:
P(X, Y) = P(X)P(Y) = (1/2)(1/16) = 1/32
c. Independence of X and Y
X and Y are independent because the occurrence of one event does not affect the occurrence of the other, meaning they can occur at the same time.
2. CDF
F(x) = { c₁eˣ      if x < 0
         c₂        if 0 ≤ x < 1
         c₃e^{−x}  otherwise }

a.
f(x) = F′(x) = c₁eˣ for x < 0; setting ∫_{−∞}^0 c₁eˣ dx = 1 gives
c₁ = 1
f(x) = F′(x) = 0 for 0 ≤ x < 1, so
c₂ = 0
c₃ = 0
b. P(−1 < X ≤ 1/2)
With f(x) = eˣ:
P(−1 < X ≤ 1/2) = ∫_{−1}^{1/2} eˣ dx
= [eˣ]_{−1}^{1/2}
= e^{1/2} − e^{−1}
≈ 1.281
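A quick numerical check of the integral arithmetic above, using a midpoint-rule quadrature (the quadrature itself is an illustrative addition):

```python
import math

# Midpoint-rule check of ∫_{-1}^{1/2} e^x dx = e^{1/2} - e^{-1}
n = 100_000
a, b = -1.0, 0.5
h = (b - a) / n
approx = sum(math.exp(a + (i + 0.5) * h) for i in range(n)) * h
exact = math.exp(0.5) - math.exp(-1)
print(round(exact, 3))  # 1.281
assert abs(approx - exact) < 1e-6
```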
c. P(X = 1)
∫_{−∞}^{1} eˣ dx
= [eˣ]_{−∞}^{1}
= e¹ − e^{−∞}
= e ≈ 2.718
d. E[min(X, 1)]
F(x) = { c₁eˣ      if x < 0
         c₂        if 0 ≤ x < 1
         c₃e^{−x}  otherwise }
The expectation of the minimum requires the distribution of the minimum, which for n i.i.d. variables is given by
F_min(x) = P(min Xᵢ ≤ x) = 1 − P(min Xᵢ > x)
= 1 − P(X₁ > x, X₂ > x, …, Xₙ > x)
By independence,
P(X₁ > x, X₂ > x, …, Xₙ > x) = P(X₁ > x)P(X₂ > x)⋯P(Xₙ > x)
so
F_min(x) = 1 − [P(X₁ > x)]ⁿ
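The formula F_min(x) = 1 − [P(X₁ > x)]ⁿ can be verified by simulation; the choice of Exponential(1) draws below is an illustrative assumption, not part of the assignment:

```python
import random, math

# Simulation check of F_min(x) = 1 - [P(X1 > x)]^n for the minimum
# of n i.i.d. draws, illustrated with Exponential(1) variables.
random.seed(1)
n, x, trials = 5, 0.3, 100_000
hits = sum(min(random.expovariate(1.0) for _ in range(n)) <= x
           for _ in range(trials))
empirical = hits / trials
theory = 1 - math.exp(-x) ** n   # P(X1 > x) = e^{-x} for Exponential(1)
assert abs(empirical - theory) < 0.01
```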
3.
a. PDF
f(x) = k e^{−2x}, x ≥ 0
The constant k must satisfy
1 = ∫_0^∞ k e^{−2x} dx
1 = [−(k/2) e^{−2x}]_0^∞ = k/2
Thus k = 2
b. Probability
P(2 < X < 5) = ∫_2^5 2e^{−2x} dx
= [−e^{−2x}]_2^5
= e^{−2×2} − e^{−2×5}
≈ 0.0183
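A quick check of this probability, assuming the normalized density 2e^{−2x} (an exponential distribution with rate 2), for which P(2 < X < 5) = e^{−4} − e^{−10}:

```python
import math

# P(2 < X < 5) for the rate-2 exponential density 2e^{-2x}
exact = math.exp(-4) - math.exp(-10)
print(round(exact, 4))  # 0.0183
```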
c. PDF
Y = √X
F_Y(y) = P(Y ≤ y) = P(√X ≤ y) = P(X ≤ y²) = F_X(y²) for y > 0
Differentiating by the chain rule:
f_Y(y) = (d/dy) F_Y(y) = (d/dy) F_X(y²) = f_X(y²) · 2y
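The change-of-variables result F_Y(y) = F_X(y²) can be checked by simulation; using a properly normalized Exponential(rate 2) for X is an illustrative assumption:

```python
import random, math

# Check of F_Y(y) = F_X(y^2) for Y = sqrt(X), with X ~ Exponential(2)
# as a hypothetical concrete choice of distribution.
random.seed(2)
y = 0.8
draws = [math.sqrt(random.expovariate(2.0)) for _ in range(100_000)]
empirical = sum(d <= y for d in draws) / len(draws)
theory = 1 - math.exp(-2 * y * y)   # F_X(y^2) for the rate-2 exponential
assert abs(empirical - theory) < 0.01
```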
4.
a. i.i.d. random variables
Let Y count the failures before the first success in i.i.d. Bernoulli(p) trials, so Y is geometric with
P(Y = k) = p(1 − p)ᵏ, k = 0, 1, 2, …
E(Y) = Σ_k k·P(Y = k) = Σ_k k·p(1 − p)ᵏ
Let q = 1 − p:
E(Y) = p Σ_{k≥1} k qᵏ
= pq Σ_{k≥1} k q^{k−1}
= pq · 1/(1 − q)²
= pq/p² = q/p
Var(Y) = E(Y²) − [E(Y)]²
= E[Y(Y − 1)] + E(Y) − [E(Y)]²
= 2q²/p² + q/p − q²/p²
= q²/p² + q/p = q(q + p)/p² = q/p²
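Both moments can be confirmed by simulating the failures-before-first-success count; p = 0.4 is an arbitrary illustrative choice:

```python
import random

# Simulation check of E(Y) = q/p and Var(Y) = q/p^2 for the
# geometric count of failures before the first success.
random.seed(3)
p = 0.4
q = 1 - p

def draw():
    k = 0
    while random.random() >= p:   # each failure increments the count
        k += 1
    return k

ys = [draw() for _ in range(200_000)]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
assert abs(mean - q / p) < 0.05      # q/p = 1.5
assert abs(var - q / p ** 2) < 0.15  # q/p^2 = 3.75
```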
b. P(Y = 0)
P(Y = 0) = 1 − P(Y ≥ 1)
= 1 − Σ_{k≥1} p qᵏ
= 1 − q = p
c. Cov(X, N)
Cov(X, N) = E[XN] − E(X)E(N)
Since X is a random variable while N is a fixed number (the number of observations in the data set), N is constant, so there is no covariance between X and N: Cov(X, N) = 0.
5. Joint probability density function
f(x, y) = e^{−y}, 0 < x < 1, y > 0

a.
P(Y ≤ 2X) = ∫_0^1 ∫_0^{2x} f_{XY}(x, y) dy dx
= ∫_0^1 ∫_0^{2x} e^{−y} dy dx
= ∫_0^1 [−e^{−y}]_0^{2x} dx
= ∫_0^1 (1 − e^{−2x}) dx
= [x + (1/2)e^{−2x}]_0^1
= 1/2 + (1/2)e^{−2} ≈ 0.568
b. PDF of U
Here we need two random variables U and W, where we define W = X.
Thus a function g is given by:
g: U = X + Y, W = X
Inverting the function, we get:
X = W, Y = U − W
Using the Jacobian, we get
|J| = |det [ 0  1
             1 −1 ]| = |−1| = 1
Thus,
f_{UW}(u, w) = f_{XY}(w, u − w)
and the PDF of U follows by integrating out w:
f_U(u) = ∫ f_{XY}(w, u − w) dw
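As a sketch of how this transformation is used, with hypothetical independent Exponential(1) inputs (an illustrative assumption) integrating f_{XY}(w, u − w) = e^{−u} over 0 < w < u makes U = X + Y a Gamma(2, 1) variable, which a simulation confirms:

```python
import random, math

# Simulation check: U = X + Y for independent Exponential(1) X, Y
# has CDF P(U <= u) = 1 - e^{-u}(1 + u), the Gamma(2,1) CDF obtained
# by integrating f_UW(u, w) = f_XY(w, u - w) over w.
random.seed(4)
us = [random.expovariate(1.0) + random.expovariate(1.0)
      for _ in range(100_000)]
u0 = 2.0
empirical = sum(u <= u0 for u in us) / len(us)
theory = 1 - math.exp(-u0) * (1 + u0)
assert abs(empirical - theory) < 0.01
```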
c. Correlation between X and Y
ρ_XY = σ_XY / √(Var(X)·Var(Y))
d. Conditional probability density
f(x, y) = e^{−y}
f(x|y) = f(x, y) / f(y), given Y = y
f(x|y) = e^{−y} / e^{−y} = 1
6.
a. Moment generating function
f_X(x) = λe^{−λx}, x ≥ 0
M_X(t) = E(e^{tX})
E(e^{tX}) = ∫_0^∞ e^{tx} f_X(x) dx
= ∫_0^∞ e^{tx} λe^{−λx} dx
= λ ∫_0^∞ e^{−x(λ−t)} dx
= λ [−e^{−x(λ−t)} / (λ − t)]_0^∞
= λ [0 − (−1/(λ − t))]
= λ/(λ − t), for t < λ
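The MGF formula can be checked against the sample average of e^{tX}; λ = 2 and t = 0.5 are illustrative values:

```python
import random, math

# Numeric check of M_X(t) = λ/(λ - t), t < λ, via the sample
# average of e^{tX} with X ~ Exponential(λ).
random.seed(5)
lam, t = 2.0, 0.5
xs = [random.expovariate(lam) for _ in range(200_000)]
mgf_hat = sum(math.exp(t * x) for x in xs) / len(xs)
assert abs(mgf_hat - lam / (lam - t)) < 0.02
```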
b. Probability density function
F_X(x) = { 1 − e^{−λx} if x ≥ 0
           0            otherwise }
f(x) = F_X′(x) = λe^{−λx}
c. Maximum likelihood estimation
f_X(x) = λe^{−λx}
f(x₁, x₂, …, xₙ | λ) = ∏_{i=1}^n λe^{−λxᵢ}
L(x₁, x₂, …, xₙ | λ) = λⁿ e^{−λ Σ_{i=1}^n xᵢ}
Taking the log:
log L(x₁, x₂, …, xₙ | λ) = n·log λ − λ·Σ_{i=1}^n xᵢ
Differentiating with respect to λ and setting the derivative to zero:
d(n·log λ − λ·Σ_{i=1}^n xᵢ)/dλ = n/λ − Σ_{i=1}^n xᵢ = 0
Therefore,
λ̂ = n / Σ_{i=1}^n xᵢ
λ̂ is therefore the maximum likelihood estimator of λ.
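A quick simulation check that λ̂ = n/Σxᵢ recovers the true rate; rate 3 is an illustrative choice:

```python
import random

# Check that the MLE λ̂ = n / Σxᵢ recovers the rate of exponential data.
random.seed(6)
true_lam = 3.0
xs = [random.expovariate(true_lam) for _ in range(200_000)]
lam_hat = len(xs) / sum(xs)
assert abs(lam_hat - true_lam) < 0.05
```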
d. Normalizing constants
To normalize the function
f_X(x) = λe^{−λx}
take
∫_{−∞}^∞ f_X(x) dx = ∫_{−∞}^∞ λe^{−λx} dx = √(2π)
We therefore define a function
φ(x) = (1/√(2π)) f_X(x) = (1/√(2π)) λe^{−λx}
such that
∫_{−∞}^∞ φ(x) dx = ∫_{−∞}^∞ (1/√(2π)) λe^{−λx} dx = 1
Our normalizing factor is 1/√(2π), while the second normalizing factor becomes √(2π), and
(1/√(2π)) · (n / Σ_{i=1}^n xᵢ) / √(2π)
converges to a normal random variable.
7. Lognormal distribution
a.
Yᵢ = 1/(√(2π)·σxᵢ) · exp(−(ln(xᵢ) − μ)² / (2σ²))
T = ∏_{i=1}^n Yᵢ = ∏_{i=1}^n 1/(√(2π)·σxᵢ) · exp(−(ln(xᵢ) − μ)² / (2σ²))
T = ∏_{i=1}^n ( (2πσ²)^{−1/2} · xᵢ^{−1} · exp(−(ln(xᵢ) − μ)² / (2σ²)) )
T = (2πσ²)^{−n/2} · ∏_{i=1}^n xᵢ^{−1} · exp(−Σ_{i=1}^n (ln(xᵢ) − μ)² / (2σ²))
b.
E(T) = e^{μ + σ²/2}
Var(T) = e^{2μ + σ²}(e^{σ²} − 1)
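Both lognormal moment formulas can be confirmed by simulation; μ = 0.2 and σ = 0.5 are illustrative values:

```python
import random, math

# Simulation check of E(Y) = e^{μ + σ²/2} and
# Var(Y) = e^{2μ + σ²}(e^{σ²} - 1) for a lognormal variable.
random.seed(7)
mu, sigma = 0.2, 0.5
ys = [math.exp(random.gauss(mu, sigma)) for _ in range(200_000)]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
assert abs(mean - math.exp(mu + sigma**2 / 2)) < 0.01
assert abs(var - math.exp(2*mu + sigma**2) * (math.exp(sigma**2) - 1)) < 0.05
```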
c.
The successive geometric average converges to e^{μ + σ²/2}.
Suppose x₁, …, xₙ is a sequence of i.i.d. draws with E(Xᵢ) = μ and Var(Xᵢ) = σ² < ∞ for all i. Then for any ε > 0, the sample average satisfies
lim_{n→∞} P(|x̄ₙ − μ| > ε) = 0
so x̄ₙ converges in probability to μ.
In our situation, applying this to the geometric average (∏_{i=1}^n Yᵢ)^{1/n} gives
lim_{n→∞} P(|x̄ₙ − μ| > ε) = 0
so the geometric average of the Yᵢ converges in probability to e^{μ + σ²/2}.
d. Maximum Likelihood Estimator
T = (2πσ²)^{−n/2} · ∏_{i=1}^n xᵢ^{−1} · exp(−Σ_{i=1}^n (ln(xᵢ) − μ)² / (2σ²))
L(T) = ln( (2πσ²)^{−n/2} · ∏_{i=1}^n xᵢ^{−1} · exp(−Σ_{i=1}^n (ln(xᵢ) − μ)² / (2σ²)) )
= −(n/2)·ln(2πσ²) − Σ_{i=1}^n ln(xᵢ) − Σ_{i=1}^n (ln(xᵢ) − μ)² / (2σ²)
= −(n/2)·ln(2πσ²) − Σ_{i=1}^n ln(xᵢ) − Σ_{i=1}^n [ (ln(xᵢ))² − 2μ·ln(xᵢ) + μ² ] / (2σ²)
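Setting the derivatives of this log-likelihood with respect to μ and σ² to zero yields the closed-form estimates μ̂ = (1/n)Σ ln(xᵢ) and σ̂² = (1/n)Σ(ln(xᵢ) − μ̂)²; a simulation check with illustrative parameters:

```python
import random, math

# Check that the lognormal MLEs mu_hat = mean(ln x_i) and
# var_hat = mean((ln x_i - mu_hat)^2) recover the generating parameters.
random.seed(8)
mu, sigma = 1.0, 0.3
xs = [math.exp(random.gauss(mu, sigma)) for _ in range(200_000)]
logs = [math.log(x) for x in xs]
mu_hat = sum(logs) / len(logs)
var_hat = sum((l - mu_hat) ** 2 for l in logs) / len(logs)
assert abs(mu_hat - mu) < 0.01
assert abs(var_hat - sigma**2) < 0.01
```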