% stats/ComputePredictionsandRegressionLossforTestDataExample.m
%% Compute Predictions and Regression Loss for Test Data
%%
% Generate example training data.
rng(1) % For reproducibility
n = 100000;
X = linspace(0,1,n)';
X = [X,X.^2];
y = 1 + X*[1;2] + sin(20*X*[1;-2]) + 0.2*randn(n,1);
%%
% Train a GPR model using the subset of regressors (|'sr'|) approximation
% method for fitting and the subset of data (|'sd'|) method for prediction.
% Use 50 points in the active set and the sparse greedy matrix approximation
% (|'sgma'|) method for active set selection. Because the scales of the first
% and second predictors differ, it is good practice to standardize the data.
gprMdl = fitrgp(X,y,'KernelFunction','squaredExponential','FitMethod',...
    'sr','PredictMethod','sd','Basis','none','ActiveSetSize',50,...
    'ActiveSetMethod','sgma','Standardize',1,'KernelParameters',[1;1]);
%%
% <docid:stats_ug.butnn96> accepts any combination of fitting, prediction,
% and active set selection methods. In some cases it might not be possible
% to compute the standard deviations of the predicted responses, and hence
% the prediction intervals. See <docid:stats_ug.butpfw6-3>. Also, in some
% cases, using the exact method might be expensive because of the size of
% the training data.
%%
% Create a compact GPR object.
cgprMdl = compact(gprMdl);
%%
% Generate the test data.
n = 4000;
Xnew = linspace(0,1,n)';
Xnew = [Xnew,Xnew.^2];
ynew = 1 + Xnew*[1;2] + sin(20*Xnew*[1;-2]) + 0.2*randn(n,1);
%%
% Use the compact object to predict the response for the test data, along
% with the prediction intervals.
[ypred,~,yci] = predict(cgprMdl,Xnew);
%%
% Plot the true response, predicted response, and prediction intervals.
figure;
plot(ynew,'r');
hold on;
plot(ypred,'b')
plot(yci(:,1),'k--');
plot(yci(:,2),'k--');
legend('Data','Pred','Lower 95%','Upper 95%','Location','Best');
xlabel('x');
ylabel('y');
hold off
%%
% Compute the mean squared error loss on the test data using the trained
% GPR model.
L = loss(cgprMdl,Xnew,ynew)
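% For readers working outside MATLAB, a rough Python analogue of the same
% workflow is sketched below using scikit-learn. This is an assumption-laden
% translation, not the MATLAB implementation: scikit-learn has no 'sr'/'sd'
% sparse approximations or |compact| objects, so it trains an exact Gaussian
% process on a subsample instead, and the dataset size is reduced to keep the
% exact method tractable.

```python
# Hedged sketch: same synthetic data and GPR workflow as the MATLAB example,
# translated to scikit-learn. Subsampling stands in for the 'sr'/'sd'
# approximations, which scikit-learn does not provide.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
n = 2000  # smaller than the MATLAB example so exact GPR stays tractable
x = np.linspace(0, 1, n)
X = np.column_stack([x, x**2])
y = 1 + X @ [1, 2] + np.sin(20 * X @ [1, -2]) + 0.2 * rng.standard_normal(n)

# Standardize the predictors (mirrors 'Standardize',1 in fitrgp).
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# Fit an exact GPR on every 4th point; RBF ~ squared exponential kernel,
# WhiteKernel lets the model learn the observation noise level.
gpr = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.04),
                               normalize_y=True)
gpr.fit(Xs[::4], y[::4])

# Predict on held-out points with standard deviations, then form
# approximate 95% prediction intervals and the mean squared error loss.
ypred, ystd = gpr.predict(Xs[1::4], return_std=True)
lo, hi = ypred - 1.96 * ystd, ypred + 1.96 * ystd
mse = np.mean((y[1::4] - ypred) ** 2)
```

% With the noise standard deviation at 0.2, the test MSE should land near the
% noise variance of 0.04 when the fit succeeds, analogous to the |loss| value
% reported by the MATLAB example.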