%% Best Point of an Optimized KNN Classifier
% This example shows how to obtain the best point of an optimized
% classifier.
%%
% Optimize a KNN classifier for the |ionosphere| data, meaning find
% parameters that minimize the cross-validation loss. Minimize over
% nearest-neighborhood sizes from 1 to 30, and over the distance functions
% |'chebychev'|, |'euclidean'|, and |'minkowski'|.
%
% For reproducibility, set the random seed, and set the
% |AcquisitionFunctionName| option to |'expected-improvement-plus'|.
load ionosphere
rng default
num = optimizableVariable('n',[1,30],'Type','integer');
dst = optimizableVariable('dst',{'chebychev','euclidean','minkowski'},'Type','categorical');
c = cvpartition(351,'Kfold',5);
fun = @(x)kfoldLoss(fitcknn(X,Y,'CVPartition',c,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive'));
results = bayesopt(fun,[num,dst],'Verbose',0,...
    'AcquisitionFunctionName','expected-improvement-plus');
%%
% Obtain the best point according to the default
% |'min-visited-upper-confidence-interval'| criterion.
x = bestPoint(results)
%%
% The lowest estimated cross-validation loss occurs for ten nearest
% neighbors and |'chebychev'| distance.
%
% Careful examination of the objective function model plot shows a point
% with one nearest neighbor and |'chebychev'| distance that has a lower
% objective function value. Find this point using a different criterion.
x = bestPoint(results,'Criterion','min-observed')
%%
% Also find the minimum observed objective function value, and the
% iteration number at which it was observed.
[x,CriterionValue,iteration] = bestPoint(results,'Criterion','min-observed')
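%%
% As an illustrative extension (a sketch, not part of the original
% example): having obtained a best point |x| from |bestPoint|, you can
% retrain a final KNN classifier on the full data set with those
% parameters. |x| is a table, so extract |x.n| for the neighborhood size
% and convert the categorical |x.dst| to a character vector for the
% |Distance| argument, mirroring the objective function above.
Mdl = fitcknn(X,Y,'NumNeighbors',x.n,...
    'Distance',char(x.dst),'NSMethod','exhaustive')
% The resulting model uses the optimized hyperparameters and can be used
% for prediction, for example: label = predict(Mdl,X(1,:));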