Bayesian Statistics: Negative Binomial Regression Model and Proof

BAYESIAN STATISTICS
Student Name
Class name
Date
Assignment – Bayesian statistics
Assume a negative binomial regression model as follows:
1. Explain an algorithm for extracting β = (β_1, β_2, β_3) and κ.
To obtain an algorithm for extracting β = (β_1, β_2, β_3) and κ, we first write down the probability mass function of the zero-inflated negative binomial distribution:
\Pr(y_i = j) = \begin{cases} \pi + (1 - \pi)\, g(y_i = 0), & \text{if } j = 0 \\ (1 - \pi)\, g(y_i), & \text{if } j > 0 \end{cases}

where

g(y_i) = \Pr(Y = y_i \mid \mu_i, \alpha) = \frac{\Gamma(y_i + \alpha^{-1})}{\Gamma(\alpha^{-1})\, \Gamma(y_i + 1)} \left( \frac{1}{1 + \alpha \mu_i} \right)^{\alpha^{-1}} \left( \frac{\alpha \mu_i}{1 + \alpha \mu_i} \right)^{y_i}
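As a concreteness check, the pmf above can be coded directly. The following is a minimal sketch, assuming the α parameterisation shown above; the helper names negbin_pmf and zinb_pmf are hypothetical and use log-gamma functions for numerical stability.

```python
import numpy as np
from scipy.special import gammaln

def negbin_pmf(y, mu, alpha):
    """g(y): negative binomial pmf with mean mu and overdispersion alpha."""
    inv_alpha = 1.0 / alpha
    log_p = (gammaln(y + inv_alpha) - gammaln(inv_alpha) - gammaln(y + 1)
             + inv_alpha * np.log(1.0 / (1.0 + alpha * mu))
             + y * np.log(alpha * mu / (1.0 + alpha * mu)))
    return np.exp(log_p)

def zinb_pmf(j, mu, alpha, pi):
    """Zero-inflated negative binomial: extra point mass pi at zero mixed with g."""
    if j == 0:
        return pi + (1.0 - pi) * negbin_pmf(0, mu, alpha)
    return (1.0 - pi) * negbin_pmf(j, mu, alpha)

# sanity check: probabilities over a long truncated support should sum to about one
print(sum(zinb_pmf(j, mu=2.0, alpha=0.5, pi=0.3) for j in range(200)))
```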
The negative binomial component includes an exposure time t_i and k regressor values x_{1i}, \ldots, x_{ki}. Thus,
\mu_i = \exp\{ \ln(t_i) + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_k x_{ki} \}
Then calculate the MLE of \mu_i. Taking the natural logarithm of both sides, \ln(\exp(\cdot)) cancels and

\ln(\mu_i) = \ln(t_i) + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_k x_{ki}.

The term \ln(t_i) is a known constant (the exposure offset), so the part of \ln(\mu_i) that has to be estimated is

\beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_k x_{ki}.

Hence the parameters to be extracted are (\beta_1, \beta_2, \beta_3), together with the dispersion \kappa, and they are obtained by maximising the likelihood built from the pmf above, as sketched below.
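In practice this extraction step is carried out numerically. The following is a minimal sketch, assuming a plain negative binomial likelihood with a log link and exposure offset (the zero-inflation weight π would enter as one more parameter); the helper names negbin_negloglik and fit_negbin are hypothetical, not part of the assignment brief.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def negbin_negloglik(params, y, X, log_t):
    """Negative log-likelihood of a negative binomial regression.

    params = (beta_1, ..., beta_k, log_kappa), mu_i = exp(log t_i + x_i' beta),
    and kappa > 0 is the dispersion (variance = mu + mu**2 / kappa).
    """
    beta, kappa = params[:-1], np.exp(params[-1])   # exp keeps kappa positive
    mu = np.exp(log_t + X @ beta)
    ll = (gammaln(y + kappa) - gammaln(kappa) - gammaln(y + 1)
          + kappa * np.log(kappa / (kappa + mu))
          + y * np.log(mu / (kappa + mu)))
    return -np.sum(ll)

def fit_negbin(y, X, t):
    """Extract (beta, kappa) by numerically maximising the likelihood."""
    start = np.zeros(X.shape[1] + 1)                # beta = 0, log kappa = 0
    res = minimize(negbin_negloglik, start, args=(y, X, np.log(t)),
                   method="L-BFGS-B")
    return res.x[:-1], np.exp(res.x[-1])            # (beta_hat, kappa_hat)
```

A quasi-Newton optimiser such as L-BFGS-B is sufficient here because the log-likelihood is smooth in (β, log κ).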
2. Proof
y_i \mid \lambda_i \sim \text{Poisson}(\lambda_i), \qquad \lambda_i \mid x_i \sim \text{Gamma}(\kappa, \kappa \mu_i),
\log(\mu_i) = \beta_1 + \beta_2 x_i + \beta_3 x_i^2
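One way to make the Poisson–Gamma connection explicit is to integrate the Poisson likelihood against the Gamma mixing density. The following is a sketch assuming the second argument of \text{Gamma}(\kappa, \kappa\mu_i) is a rate parameter; under a scale convention the algebra is identical with \kappa\mu_i replaced by its reciprocal:

p(y_i \mid x_i) = \int_0^{\infty} \frac{e^{-\lambda_i} \lambda_i^{y_i}}{y_i!} \cdot \frac{(\kappa\mu_i)^{\kappa}}{\Gamma(\kappa)} \lambda_i^{\kappa - 1} e^{-\kappa\mu_i \lambda_i} \, d\lambda_i
= \frac{(\kappa\mu_i)^{\kappa}}{y_i!\, \Gamma(\kappa)} \int_0^{\infty} \lambda_i^{\, y_i + \kappa - 1} e^{-(1 + \kappa\mu_i)\lambda_i} \, d\lambda_i
= \frac{\Gamma(y_i + \kappa)}{y_i!\, \Gamma(\kappa)} \left( \frac{\kappa\mu_i}{1 + \kappa\mu_i} \right)^{\kappa} \left( \frac{1}{1 + \kappa\mu_i} \right)^{y_i},

which is a negative binomial pmf of the same functional form as g(y_i) in part 1.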
The same link can also be expressed through cumulative distribution functions. The Poisson cumulative distribution function is

P(X \le n) = e^{-\lambda} \sum_{k=0}^{n} \frac{\lambda^k}{k!}.
\Gamma(a, b) = \int_b^{\infty} t^{a-1} e^{-t}\, dt, where \Gamma(a, b) denotes the upper incomplete gamma function.
Then, for integer n > 0,

\Gamma(n, b) = (n - 1)!\, e^{-b} \sum_{k=0}^{n-1} \frac{b^k}{k!}.
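This identity is easy to verify numerically. A minimal check, assuming SciPy's convention that gammaincc(n, b) returns the regularised upper incomplete gamma \Gamma(n, b)/\Gamma(n):

```python
import math
from scipy.special import gammaincc

n, b = 4, 1.7

# left-hand side: unregularised upper incomplete gamma, Gamma(n, b)
lhs = gammaincc(n, b) * math.gamma(n)

# right-hand side: (n-1)! * e^{-b} * sum_{k=0}^{n-1} b^k / k!
rhs = math.factorial(n - 1) * math.exp(-b) * sum(b**k / math.factorial(k)
                                                 for k in range(n))

print(lhs, rhs)  # the two values agree to floating-point precision
```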
The cumulative distribution function of the Gamma distribution (the regularised lower incomplete gamma function) is

P(X \le \lambda) = \frac{1}{\Gamma(\alpha)} \int_0^{\lambda} t^{\alpha - 1} e^{-t}\, dt.
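Put together, these cumulative forms are what tie the Poisson and Gamma distributions together: the Poisson CDF at n equals the survival probability at λ of a Gamma variable with integer shape n + 1 and unit scale. A small numerical check, assuming the standard scipy.stats parameterisation:

```python
from scipy.stats import poisson, gamma

lam, n = 3.2, 5

# P(Poisson(lam) <= n) = e^{-lam} * sum_{k=0}^{n} lam^k / k!
print(poisson.cdf(n, lam))

# P(Gamma(shape=n+1, scale=1) > lam) = 1/n! * int_lam^inf t^n e^{-t} dt
print(gamma.sf(lam, n + 1))
```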
But, as in part 1,

\mu_i = \exp\{ \ln(t_i) + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_k x_{ki} \}.

Then calculate the MLE of \mu_i. Taking the natural logarithm of both sides and recalling that \ln(t_i) is a known constant offset,

\ln(\mu_i) = \ln(t_i) + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_k x_{ki},

so the part of \log(\mu_i) that depends on the unknown parameters is the linear predictor \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_k x_{ki}; in the model of part 2 the regressors are (1, x_i, x_i^2).
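Since \log(\mu_i) = \beta_1 + \beta_2 x_i + \beta_3 x_i^2 is linear in the columns (1, x_i, x_i^2), the extraction algorithm from part 1 carries over once the design matrix is assembled. A short usage sketch, reusing the hypothetical fit_negbin helper from part 1 with placeholder data in place of the real observations (unit exposure is used because part 2 states no offset):

```python
import numpy as np

# placeholder covariates and counts; replace with the observed data
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1, 2, 2, 4, 7, 9])

# design matrix with columns (1, x_i, x_i^2), matching log(mu_i) = b1 + b2*x + b3*x^2
X = np.column_stack([np.ones_like(x), x, x**2])

beta_hat, kappa_hat = fit_negbin(y, X, t=np.ones_like(y))
print(beta_hat, kappa_hat)
```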