I am trying to understand the standard errors of regression coefficients, starting from the definitions of each term.
- As I understand it, the standard error is a measure of the statistical accuracy of an estimate, equal to the standard deviation of the theoretical distribution of a large population of such estimates.
- Now, the standard deviation is the square root of the variance, so if we get the variance of the coefficients, we will get the standard errors. As I understand it, the variance of the coefficients is V[b], where b is the vector of all estimated coefficients and X is the matrix of independent variables (the design matrix), including the intercept column X0 = 1.
- But when I search for the equation for Var[b], I find that Var[b] is actually a variance-covariance matrix: the variances of the individual coefficients sit on the diagonal of this matrix, and the standard errors are obtained by taking the square root of that diagonal. (I have written out the equation I found below.)
- That puzzles me: if the diagonal holds the variances of the coefficients, why is the whole variance-covariance matrix what gets defined as V[b]? I assume I have misunderstood some of the terms. Any help here? I am a novice in statistics, so please help me with details. To make the question concrete, I have added the equation and a small numerical sketch below.
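
For concreteness, the equation I keep finding is the following (writing β for the true coefficient vector and σ² for the error variance; this is my own transcription of what I read, so it may contain my misunderstanding):

$$
V[b] \;=\; E\!\left[(b - \beta)(b - \beta)^{\top}\right] \;=\; \sigma^{2}\,(X^{\top}X)^{-1}
$$

If I read this correctly, the outer product of the vector (b − β) with itself is a matrix, which would explain why V[b] is a matrix rather than a single number, but I would like that confirmed.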
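
And here is a minimal numerical sketch of my understanding, assuming ordinary least squares; the made-up data and the choice of numpy and statsmodels are purely for illustration:

```python
import numpy as np
import statsmodels.api as sm

# Toy data: one regressor plus an intercept column (X0 = 1).
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])    # design matrix with X0 = 1
y = 2.0 + 3.0 * x + rng.normal(size=n)  # true intercept 2, true slope 3

# OLS coefficient estimates: b = (X'X)^{-1} X'y
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y

# Unbiased estimate of the error variance: s^2 = RSS / (n - k)
resid = y - X @ b
s2 = resid @ resid / (n - X.shape[1])

# Variance-covariance matrix of b: Var[b] = s^2 (X'X)^{-1}
vcov = s2 * XtX_inv

# Variances on the diagonal; their square roots are the standard errors.
se_by_hand = np.sqrt(np.diag(vcov))
print("by hand:    ", se_by_hand)

# Cross-check against a standard library fit.
fit = sm.OLS(y, X).fit()
print("statsmodels:", fit.bse)
```

If I have done this right, the two printed rows should agree, so mechanically the "square root of the diagonal" recipe works; my question is purely about why the whole matrix is what V[b] denotes.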