
Partial Least Squares

Partial least squares (PLS) is closely related to PCA, but with the difference that the resulting vectors do not aim for maximum variance; instead they aim for the best fitting (in a least squares sense) regression onto the second input matrix Y. Like PCA, this method reduces the dimensionality of X via the resulting projection matrix.

Definition in Wikipedia:

Partial least squares regression (PLS regression) is a statistical method that bears some relation to principal components regression; instead of finding hyperplanes of maximum variance between the response and independent variables, it finds a linear regression model by projecting the predicted variables and the observable variables to a new space. Because both the X and Y data are projected to new spaces, the PLS family of methods are known as bilinear factor models. Partial least squares Discriminant Analysis (PLS-DA) is a variant used when the Y is categorical. PLS is used to find the fundamental relations between two matrices (X and Y), i.e. a latent variable approach to modeling the covariance structures in these two spaces. A PLS model will try to find the multidimensional direction in the X space that explains the maximum multidimensional variance direction in the Y space. PLS regression is particularly suited when the matrix of predictors has more variables than observations, and when there is multicollinearity among X values. By contrast, standard regression will fail in these cases (unless it is regularized).

Interface Description

   procedure Plsr(Xreg, Yreg: TDoubleMatrix; NInput, NComp: Integer); overload;
   procedure Plsr(Xreg, Yreg: TDoubleMatrix; NComp: Integer); overload;

Executes the partial least squares regression on the matrices Xreg and Yreg. Xreg is of the form (width = dimension, height = num examples) and Yreg is the target (response) matrix of the form (width = num examples, height >= NInput). NInput is optional and defines the number of components used for the Yreg projection; the overloaded version without this parameter assumes the height of Yreg. NComp is the number of remaining projection vectors (analogous to the number of retained principal components in PCA).
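
A minimal calling sketch. The class and unit names TPLS and PLS are assumptions based on the library layout, so check your copy of the sources; Random simply stands in for real data:

   uses Matrix, PLS;

   procedure PlsExample;
   var xreg, yreg : TDoubleMatrix;
       pls : TPLS;
       i, j : Integer;
   begin
        // 100 examples with 5 input dimensions -> width = dimension, height = num examples
        xreg := TDoubleMatrix.Create(5, 100);
        // one response value per example -> width = num examples
        yreg := TDoubleMatrix.Create(100, 1);

        for i := 0 to 99 do
        begin
             for j := 0 to 4 do
                 xreg[j, i] := Random;

             // a noise free linear response built from the first two dimensions
             yreg[i, 0] := 2*xreg[0, i] + 0.5*xreg[1, i];
        end;

        pls := TPLS.Create;
        try
           // keep 3 projection vectors - akin to retaining 3 principal components
           pls.Plsr(xreg, yreg, 3);

           // ... use the trained model here, e.g. via Project (see below)
        finally
           pls.Free;
           xreg.Free;
           yreg.Free;
        end;
   end;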

   function Project(x : TDoubleMatrix) : TDoubleMatrix;

Projects the data point(s) in x onto the PLS model. In a simple linear regression this corresponds to evaluating the regression line at x.
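
Given a trained instance (pls from the sketch above), projecting a new sample is a one-liner. The names sample and yHat are hypothetical, and the assumption here is that the caller owns the returned matrix; verify against the sources:

   var sample, yHat : TDoubleMatrix;
   begin
        // a single new example: one row of width = dimension
        sample := TDoubleMatrix.Create(5, 1);
        sample[0, 0] := 0.3;   // ... fill the remaining dimensions accordingly

        yHat := pls.Project(sample);
        try
           WriteLn('prediction for the sample: ', yHat[0, 0]:0:3);
        finally
           yHat.Free;
           sample.Free;
        end;
   end;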

   property SSQDif: TDoubleMatrix read fSSQDif;     // sum of squares differences
   property CW: TDoubleMatrix read fCW;
   property W: TDoubleMatrix read fW;               // weight matrix
   property Beta: TDoubleMatrix read fBeta;         // regression coefficients
   property YRes: TDoubleMatrix read fYRes;         // residual when calculating plsr
   property Theta: TDoubleMatrix read fTheta;       // projection matrix used in Project

For completeness, the properties that may be used for further calculations are listed above, e.g. the projection matrix Theta or the projected residual matrix YRes.
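
As an example, the residual gives a quick impression of how well the chosen number of components fits the training data. A sketch, assuming ElementwiseNorm2 (the Frobenius norm) is available on TDoubleMatrix as in the rest of the library:

   // after a successful call to Plsr:
   if Assigned(pls.YRes) then
      WriteLn('residual norm: ', pls.YRes.ElementwiseNorm2:0:4);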

Note that the class supports persistence, so it can easily be stored to and loaded from disk.
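
A persistence sketch, assuming the TBinaryReaderWriter helper and the ReadObjFromFile routine from the library's BinaryReaderWriter unit (the exact calls may differ between versions):

   uses BinaryReaderWriter, BaseMathPersistence;

   // store the trained model ...
   pls.SaveToFile('pls.dat', TBinaryReaderWriter);

   // ... and later restore it
   pls := ReadObjFromFile('pls.dat') as TPLS;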
