Covariance matrix - Wikipedia


In probability theory and statistics, a covariance matrix (also known as auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix) is a square matrix giving the covariance between each pair of elements of a given random vector. Any covariance matrix is symmetric and positive semi-definite, and its main diagonal contains variances (i.e., the covariance of each element with itself). As an example, the variation in a collection of random points in two-dimensional space cannot be characterized fully by a single number, nor would the variances in the x and y directions contain all of the necessary information; a 2 × 2 matrix would be necessary to fully characterize the two-dimensional variation. Some statisticians, following the probabilist William Feller in his two-volume book An Introduction to Probability Theory and Its Applications,[2] call the matrix K_XX the variance of the random vector X, because it is the natural generalization to higher dimensions of the one-dimensional variance. Others call it the covariance matrix, because it is the matrix of covariances between the scalar components of the vector X.
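The two-dimensional example above can be sketched numerically. The snippet below is a minimal illustration, not part of the original article: it draws a hypothetical cloud of 2-D points (the mean and covariance passed to the sampler are arbitrary choices for demonstration), estimates the 2 × 2 covariance matrix with NumPy, and checks the symmetry and positive semi-definiteness properties stated in the text.

```python
import numpy as np

# Hypothetical 2-D point cloud: 500 samples of a random vector X = (x, y).
# The mean and covariance used to generate the data are arbitrary.
rng = np.random.default_rng(0)
points = rng.multivariate_normal(mean=[0.0, 0.0],
                                 cov=[[2.0, 0.8],
                                      [0.8, 1.0]],
                                 size=500)

# np.cov with rowvar=False treats each column as a variable (x, y)
# and each row as one observation.
K = np.cov(points, rowvar=False)

# K is the estimated 2x2 covariance matrix: its diagonal holds the
# variances of x and y, the off-diagonal entries the covariance
# between them.
print(K)

# The properties from the text: K is symmetric and positive
# semi-definite (all eigenvalues are nonnegative).
assert np.allclose(K, K.T)
assert np.all(np.linalg.eigvalsh(K) >= 0)
```

A single variance in either direction would miss the off-diagonal entry, which records how x and y vary together; this is why the full 2 × 2 matrix is needed.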
