
LWPR tutorial

This page demonstrates the basic usage of our LWPR library, based on the Matlab implementation. We'll fit an LWPR model to a simple one-dimensional toy dataset. This tutorial directly corresponds to the Matlab script "test_lwpr_1D.m". Please note that our library contains two further demonstration scripts for higher-dimensional data ("test_lwpr_2D.m" and "test_lwpr_nD.m"), which should be self-explanatory once you have gone through this tutorial.

% Simple script to demonstrate the LWPR algorithm
% We train a model on toy data, and later retrieve
% predictions and confidence bounds from the model

Step 0: Generate training data

% In the next lines, we create a toy dataset with Ntr=500 samples
Ntr = 500;
Xtr = 10*rand(Ntr,1);

% The function underlying our toy data
testfunc = @(x) 10*sin(7.8*log(1+x))./(1+0.1*x.^2);

% Add noise whose standard deviation grows with the input magnitude
Ytr = 5 + testfunc(Xtr) + 0.1*randn(Ntr,1).*Xtr;

% Plot the training data
plot(Xtr,Ytr,'r.');
hold on

A possible resulting plot, showing our training data, is depicted as follows. Note how the noise variance increases with the magnitude of the input data.


Step 1: Set up LWPR model, choose tuning parameters

% Now we create a new LWPR model using "lwpr_init".
% The parameters (1,1) mean 1 input dimension, 1 output dimension.
% Using "lwpr_set", we set appropriate values for model parameters.
% The most important parameter is "init_D", the initial distance
% metric that is assigned to new receptive fields (local models).
% In general, init_D is a square matrix matching the input
% dimensionality; for 1-D data it is just a scalar.
% Smaller values for init_D yield wider receptive fields, which
% might oversmooth at the start of training and lead to slow
% convergence.
model = lwpr_init(1,1);
model = lwpr_set(model,'init_D',20);
model = lwpr_set(model,'update_D',1);
model = lwpr_set(model,'diag_only',1);
model = lwpr_set(model,'penalty',0.0001);
model = lwpr_set(model,'init_alpha', 40);
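
If you are unsure about a good value for init_D, a quick experiment often helps: train throwaway models with a few candidate values on the same data and compare how many receptive fields get allocated. The following sketch is not part of the library's demo scripts, and the candidate values are arbitrary choices for this toy problem:

% Probe the effect of init_D on the number of allocated receptive
% fields (the candidate values are arbitrary choices)
for initD = [1 20 400]
   m = lwpr_init(1,1);
   m = lwpr_set(m,'init_D',initD);
   for i=1:Ntr
      m = lwpr_update(m,Xtr(i),Ytr(i));
   end
   fprintf(1,'init_D = %5.1f  =>  #RFs: %3d\n', initD, lwpr_num_rfs(m));
end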

By default, LWPR uses Gaussian kernels as the activation function of all receptive fields. You may alternatively employ the bisquare kernel with the following line:

model = lwpr_set(model,'kernel','BiSquare');

Now, "model" is a Matlab data structure that contains the above settings, and default values for any other LWPR parameter.

>> model

model = 

              nIn: 1
             nOut: 1
           n_data: 0
           mean_x: 0
            var_x: 0
        diag_only: 1
         update_D: 1
             meta: 0
        meta_rate: 250
          penalty: 1.0000e-04
       init_alpha: 40
          norm_in: 1
         norm_out: 1
           init_D: 20
           init_M: 4.4721
             name: ''
            w_gen: 0.1000
          w_prune: 1
      init_lambda: 0.9990
     final_lambda: 1.0000
       tau_lambda: 0.9999
          init_S2: 1.0000e-10
    add_threshold: 0.5000
           kernel: 'Gaussian'
              sub: [1x1 struct]
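
Two of the displayed fields are related: "init_M" is the Cholesky factor of the initial distance metric (LWPR internally parameterises D = M'*M to keep D positive definite), so in our scalar case init_M = sqrt(20) ≈ 4.4721. As long as "model" is a plain struct, any field can be read directly:

>> model.init_M^2    % recovers init_D (up to rounding)

ans =

   20.0000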

Step 1a: Optional, but recommended speed-up

% With the following line, we transfer the LWPR model (which has been
% a Matlab struct until now) into MEX-internal memory. This drastically
% speeds up predictions and updates, but has the slight disadvantage
% that its parameters cannot be directly accessed anymore.
% After the call to "lwpr_storage", "model" is effectively a
% C pointer (int32 or int64 within Matlab). Never change that variable,
% or you may lose the LWPR model!
model = lwpr_storage('Store',model);

Step 2: Training

% Now we start training the model. We just call "lwpr_update" with one
% input/output tuple after another. The function also returns the
% current prediction for the input sample, which we use to keep track
% of the MSE on the training data.
% To print how the training proceeds, we also use
%  a) lwpr_num_data   to report the number of training samples the
%                     model has seen
%  b) lwpr_num_rfs    to report the number of receptive fields that
%                     have been allocated so far
for k=1:20
   ind = randperm(Ntr);
   mse = 0;

   for i=1:Ntr
      [model,yp] = lwpr_update(model,Xtr(ind(i)),Ytr(ind(i)));
      mse = mse + (Ytr(ind(i))-yp)^2;
   end

   nMSE = mse/Ntr/var(Ytr,1);
   fprintf(1,'#Data: %5d  #RFs: %3d  nMSE=%5.3f\n', ...
           lwpr_num_data(model), lwpr_num_rfs(model), nMSE);
end


A typical Matlab output for these few lines is printed below. In order to detect a poor choice of initial parameters, it is always useful to keep track of how many receptive fields are allocated, and how the normalised mean square error (nMSE) evolves:

#Data:   500  #RFs:  15  nMSE=0.301
#Data:  1000  #RFs:  19  nMSE=0.116
#Data:  1500  #RFs:  21  nMSE=0.041
#Data:  2000  #RFs:  21  nMSE=0.037
#Data:  2500  #RFs:  21  nMSE=0.034
#Data:  3000  #RFs:  21  nMSE=0.032
#Data:  3500  #RFs:  21  nMSE=0.031
#Data:  4000  #RFs:  21  nMSE=0.030
#Data:  4500  #RFs:  21  nMSE=0.029
#Data:  5000  #RFs:  21  nMSE=0.029
#Data:  5500  #RFs:  21  nMSE=0.029
#Data:  6000  #RFs:  21  nMSE=0.028
#Data:  6500  #RFs:  21  nMSE=0.028
#Data:  7000  #RFs:  21  nMSE=0.028
#Data:  7500  #RFs:  21  nMSE=0.028
#Data:  8000  #RFs:  21  nMSE=0.028
#Data:  8500  #RFs:  21  nMSE=0.027
#Data:  9000  #RFs:  21  nMSE=0.027
#Data:  9500  #RFs:  21  nMSE=0.027
#Data: 10000  #RFs:  21  nMSE=0.027
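
If you prefer training until convergence instead of the fixed 20 passes above, the nMSE can also drive a simple stopping rule. This is just a sketch, not part of the library; the tolerance tol is an arbitrary choice:

% Train until a full pass over the data no longer improves the nMSE
% noticeably (tol is an arbitrary choice)
tol = 1e-3;
lastNMSE = Inf;
for k=1:100    % safety bound on the number of passes
   ind = randperm(Ntr);
   mse = 0;
   for i=1:Ntr
      [model,yp] = lwpr_update(model,Xtr(ind(i)),Ytr(ind(i)));
      mse = mse + (Ytr(ind(i))-yp)^2;
   end
   nMSE = mse/Ntr/var(Ytr,1);
   if lastNMSE - nMSE < tol
      break;
   end
   lastNMSE = nMSE;
end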

Step 3: Obtain predictions for unseen data

% In the next lines, we retrieve predictions from the LWPR model
% on the regularly sampled interval [0,10].
% This is achieved by calling "lwpr_predict".
% The first output is the prediction, the second is a
% one-standard-deviation confidence bound.
Ntest = 500;
Xtest = linspace(0,10,Ntest);
Ytest = zeros(Ntest,1);
Conf = zeros(Ntest,1);
for k=1:Ntest
   [Ytest(k), Conf(k)] = lwpr_predict(model,Xtest(k));
end

% Plot the predictions and confidence bounds
plot(Xtest, Ytest, 'b-');
hold on
plot(Xtest, Ytest+Conf, 'c-');
plot(Xtest, Ytest-Conf, 'c-');

We sampled the input space regularly and evaluated the LWPR regression function and its confidence bounds; the resulting plot is shown below. The prediction is indicated by a blue line, and the one-standard-deviation confidence interval by the cyan lines.
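
For large models, predictions can be accelerated further: in the versions of the library we are aware of, "lwpr_predict" accepts an optional third argument, a cutoff below which receptive field activations are ignored. Treat this signature as an assumption and verify it with "help lwpr_predict" in your installation:

% Sketch: skip receptive fields whose activation falls below "cutoff"
% (optional third argument of lwpr_predict; verify for your version)
cutoff = 0.001;
[yp, conf] = lwpr_predict(model, 3.0, cutoff);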


Step 4a: Retrieve Matlab structure from MEX storage (cf. Step 1a)

% With the following line, we transform the MEX-internal LWPR model
% back into a Matlab struct, and also free the internal memory.
% After this, we can still use "model" to calculate predictions,
% but this will be slower than before. However, we can now access all
% internal variables, which we exploit to visualize the receptive
% fields.
% Note that this step is also necessary if you wish to save the
% LWPR model together with the rest of your Matlab workspace, since
% Matlab does not "know" about the internal storage scheme.
model = lwpr_storage('GetFree',model);
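
A typical pattern for persisting a trained model (assuming it is still in MEX-internal storage) follows directly from the two storage calls used in this tutorial; the filename below is only an example. In this tutorial we keep the plain struct afterwards, since Step 4b accesses its fields directly.

% Retrieve the struct from MEX memory and save it to disk
model = lwpr_storage('GetFree', model);
save('my_lwpr_model.mat', 'model');    % example filename

% ... later, e.g. in a new Matlab session ...
load('my_lwpr_model.mat');             % restores the struct "model"
model = lwpr_storage('Store', model);  % back to the fast MEX-internal form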

Step 4b: Inspect LWPR model

% Any receptive field can be accessed by
%   model.sub(outDim).rfs(index)
% where outDim is the output dimension (here only =1) and index
% selects the RF within that output dimension.
% Thus, model.sub(1).rfs(5).D corresponds to the distance metric
% of the 5th receptive field. The field "c" denotes the center
% of that field, beta0 is the offset, and beta the PLS regression
% parameter (in 1-D equivalent to +/- the slope of the local model).
t = linspace(0,2*pi,50);
ct = cos(t);
st = sin(t);

for k=1:length(model.sub(1).rfs)
   r = 1/sqrt(model.sub(1).rfs(k).D);    % kernel width ("radius") of the RF
   x = model.sub(1).rfs(k).c;            % center of the receptive field
   y = model.sub(1).rfs(k).beta0;        % offset of the local model
   b = model.sub(1).rfs(k).beta * model.sub(1).rfs(k).U;   % slope

   % Illustrative plot commands (one possible visualization): draw each
   % receptive field as an ellipse around its center, and its local
   % model as a line segment with slope b
   plot(x + r*ct, y + st, 'g-');
   plot([x-r x+r], [y-b*r y+b*r], 'k-');
end

The resulting plot looks like this:
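
Since all internal variables are accessible now, we can also reconstruct by hand how LWPR forms its global prediction: each receptive field contributes the prediction of its local linear model, weighted by its kernel activation. The sketch below assumes Gaussian kernels, a single PLS projection per receptive field (as in this 1-D example), and no input/output normalisation (norm_in and norm_out are 1 here); the field names are taken from the struct above, but compare with "lwpr_predict.m" if your version differs:

% Hand-rolled prediction, for illustration only (use lwpr_predict in
% practice). Assumes Gaussian kernels, one PLS projection per RF,
% and no input/output normalisation.
xq = 3.0;    % some query point
sumwy = 0;
sumw  = 0;
for k=1:length(model.sub(1).rfs)
   rf = model.sub(1).rfs(k);
   w  = exp(-0.5*(xq-rf.c)'*rf.D*(xq-rf.c));   % kernel activation
   s  = rf.U'*(xq - rf.mean_x);                % PLS projection
   yk = rf.beta0 + rf.beta'*s;                 % local linear prediction
   sumwy = sumwy + w*yk;
   sumw  = sumw  + w;
end
yq = sumwy/sumw;   % should closely match lwpr_predict(model,xq)

In our 1-D case, U is a scalar of magnitude 1, which is why beta*U in the visualization above is exactly the slope of the local model.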
