STAT312 Lecture : Least squares; eigenvalues and eigenvectors.pdf
Document Summary
Another application of the QR decomposition is in regression. Write $X = Q_1 R_1$, where $Q_1$ ($n \times p$) has orthonormal columns ($Q_1'Q_1 = I$) and $R_1$ ($p \times p$) is upper triangular and non-singular; note that $X'X = R_1'R_1$. Apply Gram--Schmidt once again to $n - p$ independent columns of a basis for $\mathcal{C}(X)^{\perp}$, which is the column space of $I - H$, to obtain $Q_2$ ($n \times (n-p)$) whose columns are orthonormal ($Q_2'Q_2 = I$) and orthogonal to those of $Q_1$. Then $Q = (Q_1 \; Q_2)$ has orthonormal columns and is square, hence is an orthogonal matrix ($QQ' = Q'Q = I$).
Least squares estimation in terms of the hat matrix $H = X(X'X)^{-1}X' = Q_1 Q_1'$. Decomposition of the norm of the residuals: note that if $x \perp y$ then $\|x + y\|^2 = \|x\|^2 + \|y\|^2$. Hence for any $\beta$, $\|y - X\beta\|^2 = \|Hy - X\beta\|^2 + \|(I - H)y\|^2 \ge \|(I - H)y\|^2$, with equality if and only if $X\beta = Hy$. Thus the fitted values $\hat{y} = X\hat{\beta} = Hy$ minimize the residual sum of squares, and they are orthogonal to the residuals $e = y - \hat{y} = (I - H)y$.
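The identities above can be checked numerically. The following is a minimal NumPy sketch, not part of the lecture itself: the data $X$ and $y$ are synthetic, and `np.linalg.qr` supplies both the thin factor $Q_1$ and (via `mode='complete'`) the full orthogonal matrix $Q = (Q_1 \; Q_2)$.

```python
import numpy as np

# Synthetic full-rank design matrix and response (illustrative only).
rng = np.random.default_rng(0)
n, p = 8, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

# Thin QR: X = Q1 R1 with Q1'Q1 = I and R1 upper triangular, non-singular.
Q1, R1 = np.linalg.qr(X)  # mode='reduced' is the default
assert np.allclose(X.T @ X, R1.T @ R1)  # X'X = R1'R1

# Complete QR also returns Q2, whose columns span C(X)-perp.
Q, _ = np.linalg.qr(X, mode="complete")
assert np.allclose(Q @ Q.T, np.eye(n))  # Q = (Q1 Q2) is orthogonal

# Least squares via back-substitution R1 beta = Q1'y (no (X'X)^{-1} needed).
beta_hat = np.linalg.solve(R1, Q1.T @ y)

# Hat matrix H = Q1 Q1' projects y onto the column space of X.
H = Q1 @ Q1.T
y_hat = H @ y           # fitted values: y_hat = X beta_hat = H y
e = y - y_hat           # residuals: e = (I - H) y

assert np.isclose(y_hat @ e, 0)                  # fitted values perp residuals
assert np.isclose(y @ y, y_hat @ y_hat + e @ e)  # ||y||^2 = ||Hy||^2 + ||(I-H)y||^2
```

Solving $R_1\hat{\beta} = Q_1'y$ by back-substitution is numerically preferable to forming and inverting $X'X$, which squares the condition number of $X$.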