STAT330 Lecture Notes - Likelihood-Ratio Test, Bias Of An Estimator, Delta Method
Document Summary
Last lecture: Newton-Raphson algorithm, observed information $I(\theta)$, Fisher information $J(\theta)$, and the Fisher-scoring algorithm.

Example. Suppose the score and information functions are
$$S(\theta) = \frac{15}{\theta} - 5, \qquad I(\theta) = \frac{15}{\theta^2}, \qquad J(\theta) = \frac{5}{\theta}.$$
(These arise, for instance, from a Poisson sample of size $n = 5$ with $\sum x_i = 15$.) Suppose $\theta^{(0)} = 5$; find the updated value of $\theta$ after 3 steps, $\theta^{(3)}$, using each of the two algorithms:
Newton-Raphson: $\theta^{(k+1)} = \theta^{(k)} + S(\theta^{(k)})/I(\theta^{(k)})$;
Fisher scoring: $\theta^{(k+1)} = \theta^{(k)} + S(\theta^{(k)})/J(\theta^{(k)})$.

Before we introduce the nice properties of the ML estimator, we need to define a concept.

Unbiased estimator. If $E(T) = \tau(\theta)$ for all $\theta$, then the statistic $T$ is an unbiased estimator of $\tau(\theta)$. If not, $T$ is a biased estimator of $\tau(\theta)$, and $E(T) - \tau(\theta)$ is called the bias.

Fisher information. If $X_1, \ldots, X_n$ are iid with density $f(x; \theta)$, then the Fisher information is
$$J(\theta) = -nE\!\left[\frac{d^2 \log f(X_1; \theta)}{d\theta^2}\right] = n J_1(\theta), \qquad \text{with } J_1(\theta) = -E\!\left[\frac{d^2 \log f(X_1; \theta)}{d\theta^2}\right].$$

Cramér-Rao (CR) lower bound for the variance of an unbiased estimator. If $T$ is an unbiased estimator of $\tau(\theta)$, then
$$\mathrm{Var}(T) \ge \frac{[\tau'(\theta)]^2}{J(\theta)}.$$
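The three-step exercise above can be checked numerically. The sketch below (my own illustration, not part of the notes) iterates both updates from $\theta^{(0)} = 5$ using the given $S$, $I$, and $J$:

```python
# Worked example: Newton-Raphson vs. Fisher scoring for
# S(theta) = 15/theta - 5, I(theta) = 15/theta^2, J(theta) = 5/theta.

def S(theta):
    return 15.0 / theta - 5.0

def I(theta):  # observed information
    return 15.0 / theta**2

def J(theta):  # expected (Fisher) information
    return 5.0 / theta

def iterate(update, theta0, steps=3):
    """Apply the update map `steps` times starting from theta0."""
    theta = theta0
    for _ in range(steps):
        theta = update(theta)
    return theta

newton  = iterate(lambda t: t + S(t) / I(t), 5.0)  # theta^(3), Newton-Raphson
scoring = iterate(lambda t: t + S(t) / J(t), 5.0)  # theta^(3), Fisher scoring

print(newton, scoring)
```

Note that for this particular model the Fisher-scoring map is $\theta + S(\theta)\,\theta/5 = \theta + 3 - \theta = 3$ for every $\theta$, so scoring lands on the MLE $\hat\theta = 3$ in a single step, while Newton-Raphson approaches 3 gradually ($5/3$, then $65/27 \approx 2.407$, then $\approx 2.883$).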
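The bias and CR-bound definitions can also be illustrated by simulation. The sketch below (my own example, not from the notes) uses a Poisson($\theta$) sample, for which $\bar X$ is unbiased for $\theta$ and $\mathrm{Var}(\bar X) = \theta/n$ attains the CR bound $[\tau'(\theta)]^2 / J(\theta) = 1/(n/\theta) = \theta/n$ when $\tau(\theta) = \theta$:

```python
import numpy as np

# Illustration: for X_1,...,X_n iid Poisson(theta), T = X-bar satisfies
# E(T) = theta (zero bias) and Var(T) = theta/n, which equals the
# CR lower bound since J(theta) = n/theta and tau'(theta) = 1.
rng = np.random.default_rng(0)
theta, n, reps = 3.0, 20, 200_000

# One sample mean per replication
xbars = rng.poisson(theta, size=(reps, n)).mean(axis=1)

bias     = xbars.mean() - theta   # Monte Carlo estimate of E(T) - tau(theta)
variance = xbars.var()            # Monte Carlo estimate of Var(T)
cr_bound = theta / n              # analytic bound: theta/n = 0.15 here

print(bias, variance, cr_bound)
```

The simulated bias should be near 0 and the simulated variance near the bound 0.15, consistent with $\bar X$ being an efficient unbiased estimator in this model.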