PCA (principal component analysis) is a descriptive method used to analyse numeric questions (metrics). It converts a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. This technique reduces the number of correlated variables and removes redundant information. It provides a representation in a new space, in which the first axis carries the largest possible share of the information.
Data to process: PCA uses an N x p matrix in which, at the intersection of row i and column j, we set the value of observation i for question j.
To compare observations independently of scale and unit, we centre and scale (standardise) each column of the data matrix. We can then define an average “imaginary” point from which the differences between individuals are measured using Euclidean distance. The results of a PCA are usually discussed in terms of component scores, sometimes called factor scores (the transformed variable values corresponding to a particular data point), and loadings (the weight by which each standardised original variable should be multiplied to get the component score).
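As an illustration of these definitions, here is a minimal numpy sketch (the data values are invented for the example; the software performs this computation internally). It standardises a small interview matrix, takes the loadings as eigenvectors of the correlation matrix, and multiplies to obtain the component scores, i.e. the interviews' coordinates on the new axes.

```python
import numpy as np

# Invented example data: 5 interviews (rows) x 3 numeric questions (columns).
X = np.array([[7., 3., 6.],
              [4., 6., 3.],
              [8., 2., 5.],
              [5., 1., 2.],
              [6., 4., 9.]])

# Standardise: centre each question on its mean and divide by its
# standard deviation, removing scale and unit differences.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Loadings: eigenvectors of the correlation matrix of the standardised
# data, sorted by decreasing eigenvalue (axis 1 first).
R = (Z.T @ Z) / len(Z)
eigvals, loadings = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, loadings = eigvals[order], loadings[:, order]

# Component scores: each interview's coordinates on the principal axes.
scores = Z @ loadings
```

The scores are uncorrelated by construction: the covariance matrix of the score columns is the diagonal matrix of eigenvalues.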
By projecting each point on each axis, we obtain the coordinates of the points. We then calculate the correlation of each question with each axis. The result allows a simultaneous representation of interviews and questions:

Note that PCA is sensitive to the relative scaling of the original variables.
To perform a principal component analysis:
1. Click principal components.
2. Place the questions to analyse in the active tab (for example, drag and drop them from the questionnaire list).
3. Leave the questions to exclude in the inactive tab.
4. Select the calculations to perform (correlation matrix, eigen values, etc.).
5. Click results to view the analysis.

One page of results is generated for each calculation you selected. A summary of the calculations follows.
Correlation matrix: Describes the correlations among the p variables. It is a square symmetric p x p matrix whose (i,j)th element is the correlation coefficient rij between the ith and the jth variable. The diagonal elements (correlations of variables with themselves) are always equal to 1.00, and every correlation value lies between –1.00 and +1.00. For example:

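A sketch of how such a matrix can be computed with numpy (the data values are invented for illustration):

```python
import numpy as np

# Invented example data: 5 interviews x 3 numeric questions.
X = np.array([[7., 3., 6.],
              [4., 6., 3.],
              [8., 2., 5.],
              [5., 1., 2.],
              [6., 4., 9.]])

# p x p correlation matrix: element (i, j) holds r_ij, the correlation
# between question i and question j.
R = np.corrcoef(X, rowvar=False)
```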
Eigen values: The eigenvalues (λ) of the correlation matrix, a special set of scalars associated with a linear system of equations; there is one per principal axis. For example:

Information: The percentage of information carried by each axis, calculated as follows: \(I_k = \frac{\lambda_k}{\sum_{i=1}^{p} \lambda_i}\). The first axis always has the highest percentage.
Cumulated information: The cumulative percentage of information up to and including each axis.
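The three quantities above can be reproduced from the correlation matrix; a numpy sketch with invented data:

```python
import numpy as np

# Invented example data: 5 interviews x 3 numeric questions.
X = np.array([[7., 3., 6.],
              [4., 6., 3.],
              [8., 2., 5.],
              [5., 1., 2.],
              [6., 4., 9.]])
R = np.corrcoef(X, rowvar=False)

# Eigenvalues of the correlation matrix, one per axis, largest first.
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]

# Information: percentage of the total carried by each axis.
info = 100 * eigvals / eigvals.sum()

# Cumulated information: running total of those percentages.
cumulated = np.cumsum(info)
```

For a correlation matrix the eigenvalues sum to p (the trace), so the percentages always total 100%.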
Question/axis correlation: Describes the correlation between each question and each axis. For example:

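For standardised data, the correlation between question j and axis k equals the eigenvector entry scaled by the square root of the axis eigenvalue; a numpy sketch with invented data:

```python
import numpy as np

# Invented example data: 5 interviews x 3 numeric questions.
X = np.array([[7., 3., 6.],
              [4., 6., 3.],
              [8., 2., 5.],
              [5., 1., 2.],
              [6., 4., 9.]])
Z = (X - X.mean(axis=0)) / X.std(axis=0)

eigvals, eigvecs = np.linalg.eigh((Z.T @ Z) / len(Z))
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
scores = Z @ eigvecs

# Row j, column k: correlation between question j and axis k.
question_axis_corr = eigvecs * np.sqrt(eigvals)
```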
Interview coordinates: Gives the coordinates per interview on each axis. For example:

Interview representation quality: Gives the quality of representation of each interview on each axis. The sum per interview (percentage across) is equal to 100%. For example:

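This quality is the share of each interview's squared distance to the centre that falls on a given axis, which is why each row sums to 100%; a numpy sketch with invented data:

```python
import numpy as np

# Invented example data: 5 interviews x 3 numeric questions.
X = np.array([[7., 3., 6.],
              [4., 6., 3.],
              [8., 2., 5.],
              [5., 1., 2.],
              [6., 4., 9.]])
Z = (X - X.mean(axis=0)) / X.std(axis=0)

eigvals, eigvecs = np.linalg.eigh((Z.T @ Z) / len(Z))
coords = Z @ eigvecs[:, np.argsort(eigvals)[::-1]]

# Representation quality: share of each interview's squared distance
# to the centre carried by each axis; rows sum to 100%.
quality = 100 * coords**2 / (coords**2).sum(axis=1, keepdims=True)
```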
Interview contribution: Gives the contribution of each interview to each axis. The sum per axis (percentage down) is equal to 100%. For example:

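Contribution is the share of an axis's total squared coordinate that comes from each interview, which is why each column sums to 100%; a numpy sketch with invented data:

```python
import numpy as np

# Invented example data: 5 interviews x 3 numeric questions.
X = np.array([[7., 3., 6.],
              [4., 6., 3.],
              [8., 2., 5.],
              [5., 1., 2.],
              [6., 4., 9.]])
Z = (X - X.mean(axis=0)) / X.std(axis=0)

eigvals, eigvecs = np.linalg.eigh((Z.T @ Z) / len(Z))
coords = Z @ eigvecs[:, np.argsort(eigvals)[::-1]]

# Contribution: share of each axis's total squared coordinate that
# comes from each interview; columns sum to 100%.
contribution = 100 * coords**2 / (coords**2).sum(axis=0, keepdims=True)
```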
You can change the specific settings used for the analysis. To do so, click analysis options.... For details, see analysis options.
It is possible to create a variable based on your analysis. To do so, click create variable.... For details, see creating a variable.