Well, I think it is really difficult to present a visual explanation of Canonical correlation analysis (CCA) vis-à-vis Principal component analysis (PCA) or linear regression. The latter two are often explained and compared by means of 2D or 3D data scatterplots, but I doubt that is possible with CCA. Below I've drawn pictures which might explain the essence of, and the differences between, the three procedures; but even with these pictures, which are vector representations in the "subject space", there are problems with capturing CCA adequately. (For the algebra/algorithm of canonical correlation analysis, look in here.)

Drawing individuals as points in a space where the axes are variables, the usual scatterplot, is a variable space. If you draw the opposite way, variables as points and individuals as axes, that will be a subject space. Drawing the many axes is actually needless, because the space has a number of non-redundant dimensions equal to the number of non-collinear variables. Variable points are connected with the origin and form vectors, arrows, spanning the subject space; so here we are (see also). In a subject space, if the variables have been centered, the cosine of the angle between their vectors is the Pearson correlation between them, and the vectors' squared lengths are their variances. On the pictures below the variables displayed are centered (so no constant term is needed).

Variables $X_1$ and $X_2$ correlate positively: the angle between them is acute. Principal components $P_1$ and $P_2$ lie in the same space, "plane X", spanned by the two variables. The components are variables too, only mutually orthogonal (uncorrelated). The direction of $P_1$ is such as to maximize the sum of the two squared loadings of this component, and $P_2$, the remaining component, goes orthogonally to $P_1$ in plane X. The squared lengths of all four vectors are their variances (the variance of a component is the aforementioned sum of its squared loadings). Component loadings are the coordinates of the variables on the components, the $a$'s shown on the left picture.
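These subject-space facts can be checked numerically. Below is a minimal sketch (assuming NumPy; the two-variable data set is made up for illustration): for centered variables the cosine of the angle between their vectors equals Pearson's $r$, and each principal component's variance (eigenvalue) equals the sum of its squared loadings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: two positively correlated variables, then centered.
n = 200
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)
X = np.column_stack([x1, x2])
X -= X.mean(axis=0)

def cosine(u, v):
    """Cosine of the angle between two vectors in subject space."""
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# For centered variables: cosine of the angle = Pearson correlation.
r = np.corrcoef(X[:, 0], X[:, 1])[0, 1]
assert np.isclose(cosine(X[:, 0], X[:, 1]), r)

# PCA via eigendecomposition of the covariance matrix.
C = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]          # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Loadings a_jk: unit eigenvectors scaled by sqrt(eigenvalue).
loadings = eigvecs * np.sqrt(eigvals)

# Variance of a component = sum of its squared loadings (its eigenvalue).
assert np.allclose((loadings ** 2).sum(axis=0), eigvals)

# Variance of a variable = sum of its squared loadings across components.
assert np.allclose((loadings ** 2).sum(axis=1), np.diag(C))
```

The assertions pass because the loading matrix $A = V\Lambda^{1/2}$ reproduces the covariance matrix as $AA^\top = C$, so its squared entries partition the variances both by component (columns) and by variable (rows).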