%% Compare Classifiers Using Cross Validation
%%
% Load Fisher's iris data set.

% Copyright 2015 The MathWorks, Inc.

load fisheriris
X = meas;
Y = species;
rng(1); % For reproducibility
%%
% Train and cross-validate a naive Bayes classifier using the default
% options and _k_-fold cross-validation. It is good practice to specify
% the class order.
CVMdl1 = fitcnb(X,Y,...
    'ClassNames',{'setosa','versicolor','virginica'},...
    'CrossVal','on');
%%
% By default, the software models the predictor distribution within each
% class as a Gaussian with some mean and standard deviation. |CVMdl1| is a
% |ClassificationPartitionedModel| model.
%%
% Create a default naive Bayes binary classifier template, and train an
% error-correcting output codes (ECOC) multiclass model.
t = templateNaiveBayes();
CVMdl2 = fitcecoc(X,Y,'CrossVal','on','Learners',t);
%%
% |CVMdl2| is a |ClassificationPartitionedECOC| model. You can specify
% options for the naive Bayes binary learners using the same name-value
% pair arguments as for |fitcnb|.
%%
% Compare the out-of-sample _k_-fold classification errors (proportion of
% misclassified observations).
classErr1 = kfoldLoss(CVMdl1,'LossFun','ClassifErr')
classErr2 = kfoldLoss(CVMdl2,'LossFun','ClassifErr')
%%
% |CVMdl2| has a lower generalization error.
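% For readers without MATLAB, the procedure that |kfoldLoss| performs for
% |CVMdl1| can be sketched outside the toolbox. The Python block below is a
% minimal, stdlib-only illustration (not the toolbox implementation): it fits
% a Gaussian naive Bayes classifier on each training fold and reports the
% out-of-fold misclassification rate. The synthetic two-class data, the fold
% count, and the variance floor are illustrative assumptions, not taken from
% this example.

```python
import math
import random

def gnb_fit(X, y):
    """Per-class feature means/variances and class priors (Gaussian NB)."""
    params = {}
    for c in set(y):
        rows = [x for x, lab in zip(X, y) if lab == c]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        # Small floor keeps the variance positive for degenerate folds.
        var = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-9
               for col, m in zip(zip(*rows), means)]
        params[c] = (means, var, len(rows) / len(X))
    return params

def gnb_predict(params, x):
    """Class maximizing Gaussian log-likelihood plus log-prior."""
    best, best_score = None, -math.inf
    for c, (means, var, prior) in params.items():
        score = math.log(prior) + sum(
            -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
            for xi, m, v in zip(x, means, var))
        if score > best_score:
            best, best_score = c, score
    return best

def kfold_error(X, y, k=5, seed=1):
    """Out-of-fold misclassification proportion, pooled over k folds."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    wrong = 0
    for fold in folds:
        held_out = set(fold)
        train = [i for i in idx if i not in held_out]
        params = gnb_fit([X[i] for i in train], [y[i] for i in train])
        wrong += sum(gnb_predict(params, X[i]) != y[i] for i in fold)
    return wrong / len(X)

# Two well-separated synthetic classes: the CV error should be near zero.
rng = random.Random(0)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(40)] + \
    [[rng.gauss(5, 1), rng.gauss(5, 1)] for _ in range(40)]
y = [0] * 40 + [1] * 40
print(kfold_error(X, y))
```

% Each observation is scored exactly once, by a model that never saw it
% during training, which is why the pooled error estimates generalization.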