
Probability & Statistics


Added on  2023-04-20

Student’s Name:
University Affiliate:
Course:
Problem 6.1.7

L(\mu, \sigma^2 \mid x_1, \ldots, x_n) = (2\pi\sigma^2)^{-n/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2\right)

= (2\pi\sigma^2)^{-n/2} \exp\!\left(-\frac{n}{2\sigma^2}(\bar{x} - \mu)^2\right) \exp\!\left(-\frac{n-1}{2\sigma^2}s^2\right)

Hence (\bar{x}, s^2) is a sufficient statistic.

When \mu is fixed at \bar{x}, we get

L((\bar{x}, \sigma^2) \mid x_1, \ldots, x_n) = (2\pi\sigma^2)^{-n/2} \exp\!\left(-\frac{n-1}{2\sigma^2}s^2\right),

which is maximized as a function of \sigma^2. Therefore,

\frac{\partial \ln L((\bar{x}, \sigma^2) \mid x)}{\partial \sigma^2} = \frac{\partial}{\partial\sigma^2}\left(-\frac{n}{2}\ln\sigma^2 - \frac{n-1}{2\sigma^2}s^2\right) = -\frac{n}{2\sigma^2} + \frac{n-1}{2\sigma^4}s^2.

Setting this derivative to zero gives \hat{\sigma}^2 = \frac{n-1}{n}s^2.
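The zero of this derivative, \hat{\sigma}^2 = (n-1)s^2/n, can be checked numerically. A minimal sketch with a small hypothetical sample (the data below are illustrative, not part of the problem):

```python
import math

# Hypothetical sample, used only to check the maximization numerically.
x = [2.1, 3.4, 1.8, 2.9, 3.0]
n = len(x)
xbar = sum(x) / n
s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)  # sample variance s^2

def profile_loglik(sigma2):
    # ln L((xbar, sigma2) | x), dropping the additive constant -(n/2) ln(2*pi)
    return -(n / 2) * math.log(sigma2) - (n - 1) * s2 / (2 * sigma2)

# The zero of the derivative: sigma2_hat = (n - 1) s^2 / n
sigma2_hat = (n - 1) * s2 / n

# Nearby values of sigma^2 should give a strictly smaller profile log-likelihood.
assert profile_loglik(sigma2_hat) > profile_loglik(1.1 * sigma2_hat)
assert profile_loglik(sigma2_hat) > profile_loglik(0.9 * sigma2_hat)
```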
Problem 6.1.19
T(x) is a minimal sufficient statistic if and only if

f_\theta(x)/f_\theta(y) is independent of \theta \iff T(x) = T(y)

(the Lehmann–Scheffé minimal sufficiency criterion, which refines Fisher's factorization theorem).

For the Gamma model, f_\theta(x) = \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x}, so

\frac{f_\theta(x_1, \ldots, x_n)}{f_\theta(y_1, \ldots, y_n)} = \frac{\prod_{i=1}^{n} \frac{\beta^\alpha}{\Gamma(\alpha)} x_i^{\alpha-1} e^{-\beta x_i}}{\prod_{i=1}^{n} \frac{\beta^\alpha}{\Gamma(\alpha)} y_i^{\alpha-1} e^{-\beta y_i}}

= \left(\frac{\prod_{i=1}^{n} x_i}{\prod_{i=1}^{n} y_i}\right)^{\alpha-1} e^{\beta\left(\sum y_i - \sum x_i\right)}.

This ratio is free of \theta = (\alpha, \beta) exactly when \prod x_i = \prod y_i and \sum x_i = \sum y_i, so the minimal sufficient statistic for (\alpha, \beta) is \left(\prod_{i=1}^{n} x_i, \sum_{i=1}^{n} x_i\right).
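The criterion can be illustrated numerically: two different samples with the same product and the same sum give identical Gamma likelihoods for every (\alpha, \beta). A sketch with hypothetical data (log Γ is supplied by `math.lgamma`):

```python
import math

def gamma_loglik(x, alpha, beta):
    # Sum of Gamma log-densities:
    # alpha*log(beta) - log Gamma(alpha) + (alpha-1)*log(x_i) - beta*x_i
    return sum(alpha * math.log(beta) - math.lgamma(alpha)
               + (alpha - 1) * math.log(xi) - beta * xi for xi in x)

# Two different samples with the same sum (13) and the same product (36):
x = [1.0, 6.0, 6.0]
y = [2.0, 2.0, 9.0]

# The likelihood ratio depends on the data only through (prod x_i, sum x_i),
# so here it equals 1 for every choice of (alpha, beta).
for alpha, beta in [(0.5, 1.0), (2.0, 3.0), (7.5, 0.2)]:
    assert abs(gamma_loglik(x, alpha, beta) - gamma_loglik(y, alpha, beta)) < 1e-9
```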
Problem 6.1.11
L(\theta \mid x_0) = 1 for every \theta \in [0, 1], so

\int_0^1 L(\theta \mid x_0)\, d\theta = \int_0^1 1\, d\theta = 1.

The likelihood is constant in \theta, and the integral of the constant 1 over [0, 1] equals 1.
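As a quick sanity check, a midpoint-rule quadrature of the constant likelihood over [0, 1] returns 1 (illustrative code, not part of the problem):

```python
# Midpoint-rule quadrature of the constant likelihood L(theta | x0) = 1.
def integrate(f, a, b, steps=1000):
    h = (b - a) / steps
    return sum(f(a + (k + 0.5) * h) for k in range(steps)) * h

likelihood = lambda theta: 1.0  # constant in theta
assert abs(integrate(likelihood, 0.0, 1.0) - 1.0) < 1e-9
```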
Problem 6.2.4
i) l(\theta \mid x_1, \ldots, x_n) = \ln\!\left(\prod_{j=1}^{n} e^{-\theta}\frac{\theta^{x_j}}{x_j!}\right)

= \sum_{j=1}^{n}\left[\ln(e^{-\theta}) - \ln(x_j!) + \ln(\theta^{x_j})\right]

= \sum_{j=1}^{n}\left[-\theta - \ln(x_j!) + x_j\ln\theta\right]

= -n\theta - \sum_{j=1}^{n}\ln(x_j!) + (\ln\theta)\sum_{j=1}^{n} x_j
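The closed form above can be checked against a direct sum of Poisson log-probability masses. The counts below are hypothetical, and \ln(x_j!) is computed as lgamma(x_j + 1):

```python
import math

# Hypothetical Poisson counts, used only to check the algebra.
x = [3, 0, 2, 4, 1]
n = len(x)

def poisson_logpmf(k, theta):
    # log of e^{-theta} * theta^k / k!
    return -theta - math.lgamma(k + 1) + k * math.log(theta)

theta = 1.7
direct = sum(poisson_logpmf(k, theta) for k in x)
closed = -n * theta - sum(math.lgamma(k + 1) for k in x) + math.log(theta) * sum(x)
assert abs(direct - closed) < 1e-9
```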
ii) Setting \frac{\partial l}{\partial\theta} = -n + \frac{1}{\theta}\sum_{j=1}^{n} x_j = 0 gives \hat{\theta} = \bar{x}.

By the invariance of maximum likelihood estimation: if \hat{\psi} is the MLE of \psi = \theta^2, then \sqrt{\hat{\psi}} is the MLE of \theta; equivalently, \hat{\psi} = \hat{\theta}^2 = \bar{x}^2.
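A quick numerical sketch of the invariance claim, with hypothetical counts: \bar{x} maximizes the log-likelihood over a grid of \theta values, and the plug-in MLE of \theta^2 is then \bar{x}^2.

```python
import math

x = [3, 0, 2, 4, 1]  # hypothetical counts
n, xbar = len(x), sum(x) / len(x)

def loglik(theta):
    # Poisson log-likelihood, dropping the constant -sum ln(x_j!)
    return -n * theta + math.log(theta) * sum(x)

# theta_hat = xbar maximizes the log-likelihood ...
for t in [0.5, 1.0, 1.5, 2.5, 3.0]:
    assert loglik(xbar) >= loglik(t)

# ... and by invariance the MLE of psi = theta^2 is simply xbar squared.
psi_hat = xbar ** 2
```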
Problem 6.2.5
a)
p(x_n) = \frac{\beta^\alpha}{\Gamma(\alpha)} x_n^{\alpha-1} e^{-\beta x_n}

Using maximum likelihood estimation,

\log p(x_n) = \alpha\log\beta - \log\Gamma(\alpha) + (\alpha - 1)\log x_n - \beta x_n

and the log-likelihood is

L(x; \alpha, \beta) = \sum_{n=0}^{N-1}\log p(x_n) = N\alpha\log\beta - N\log\Gamma(\alpha) + (\alpha - 1)\sum_{n=0}^{N-1}\log x_n - \beta\sum_{n=0}^{N-1} x_n.

The MLE (\hat{\alpha}, \hat{\beta}) maximizes L(x; \alpha, \beta), so

\frac{\partial}{\partial\alpha}L(x; \alpha, \beta) = N\log\beta - N\frac{\Gamma'(\alpha)}{\Gamma(\alpha)} + \sum_{n=0}^{N-1}\log x_n = 0

\frac{\partial}{\partial\beta}L(x; \alpha, \beta) = \frac{N\alpha}{\beta} - \sum_{n=0}^{N-1} x_n = 0

\hat{\beta} = \frac{\hat{\alpha}}{\frac{1}{N}\sum_{n=0}^{N-1} x_n} = \frac{\hat{\alpha}}{\bar{x}}.
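The two score equations can be solved numerically by profiling out \beta via \beta = \alpha/\bar{x} and bisecting on \alpha. In this sketch the data are hypothetical, and the digamma function \Gamma'(\alpha)/\Gamma(\alpha) is approximated by a finite difference of `math.lgamma` — an assumption made to keep the example dependency-free; in practice one would use `scipy.special.digamma`.

```python
import math

def digamma(a, h=1e-6):
    # Finite-difference approximation of Gamma'(a) / Gamma(a) via lgamma.
    return (math.lgamma(a + h) - math.lgamma(a - h)) / (2 * h)

x = [0.8, 1.9, 2.4, 0.5, 3.1, 1.2]  # hypothetical positive observations
N = len(x)
xbar = sum(x) / N
mean_log = sum(math.log(v) for v in x) / N

def score_alpha(a):
    # (1/N) * dL/dalpha after substituting beta = a / xbar:
    # log(a / xbar) - digamma(a) + mean(log x_n)
    return math.log(a / xbar) - digamma(a) + mean_log

# The score is strictly decreasing in alpha, so bisection finds the root.
lo, hi = 1e-3, 100.0
for _ in range(200):
    mid = (lo + hi) / 2
    if score_alpha(mid) > 0:
        lo = mid
    else:
        hi = mid
alpha_hat = (lo + hi) / 2
beta_hat = alpha_hat / xbar  # beta_hat = alpha_hat / xbar, as derived above
assert abs(score_alpha(alpha_hat)) < 1e-6
```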
b)
