




edit svmtrain
edit svmclassify
edit svmpredict

function [svm_struct, svIndex] = svmtrain(training, groupnames, varargin)
%SVMTRAIN trains a support vector machine classifier
%
%   SVMStruct = SVMTRAIN(TRAINING,GROUP) trains a support vector machine
%   classifier using data TRAINING taken from two groups given by GROUP.
%   SVMStruct contains information about the trained classifier that is
%   used by SVMCLASSIFY for classification. GROUP is a column vector of
%   values of the same length as TRAINING that defines two groups. Each
%   element of GROUP specifies the group the corresponding row of TRAINING
%   belongs to. GROUP can be a numeric vector, a string array, or a cell
%   array of strings. SVMTRAIN treats NaNs or empty strings in GROUP as
%   missing values and ignores the corresponding rows of TRAINING.
%
%   SVMTRAIN(...,'KERNEL_FUNCTION',KFUN) allows you to specify the kernel
%   function KFUN used to map the training data into kernel space. The
%   default kernel function is the dot product. KFUN can be one of the
%   following strings or a function handle:
%
%       'linear'      Linear kernel or dot product
%       'quadratic'   Quadratic kernel
%       'polynomial'  Polynomial kernel (default order 3)
%       'rbf'         Gaussian Radial Basis Function kernel
%       'mlp'         Multilayer Perceptron kernel (default scale 1)
%       function      A kernel function specified using @
%                     (for example @KFUN) or an anonymous function
%
%   A kernel function must be of the form
%
%       function K = KFUN(U, V)
%
%   The returned value, K, is a matrix of size M-by-N, where U and V have
%   M and N rows respectively. If KFUN is parameterized, you can use
%   anonymous functions to capture the problem-dependent parameters. For
%   example, suppose that your kernel function is
%
%       function k = kfun(u,v,p1,p2)
%       k = tanh(p1*(u*v')+p2);
%
%   You can set values for p1 and p2 and then use an anonymous function:
%   @(u,v) kfun(u,v,p1,p2).
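%   For illustration only (not part of the original help): assuming the
%   parameterized kfun above is saved on the MATLAB path and TRAINING and
%   GROUP hold your data, the handle could be passed to SVMTRAIN like this:
%
%       p1 = 2; p2 = -1;                     % problem-dependent parameters
%       kernel = @(u,v) kfun(u,v,p1,p2);     % capture p1 and p2 in the handle
%       svmStruct = svmtrain(training, group, 'kernel_function', kernel);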
%
%   SVMTRAIN(...,'POLYORDER',ORDER) allows you to specify the order of a
%   polynomial kernel. The default order is 3.
%
%   SVMTRAIN(...,'MLP_PARAMS',[P1 P2]) allows you to specify the
%   parameters of the Multilayer Perceptron (mlp) kernel. The mlp kernel
%   requires two parameters, P1 and P2, where K = tanh(P1*U*V' + P2) and
%   P1 > 0 and P2 < 0. Default values are P1 = 1 and P2 = -1.
%
%   SVMTRAIN(...,'METHOD',METHOD) allows you to specify the method used
%   to find the separating hyperplane. Options are
%
%       'QP'   Use quadratic programming (requires the Optimization Toolbox)
%       'LS'   Use least-squares method
%
%   If you have the Optimization Toolbox, then the QP method is the
%   default method. If not, the only available method is LS.
%
%   SVMTRAIN(...,'QUADPROG_OPTS',OPTIONS) allows you to pass an OPTIONS
%   structure created using OPTIMSET to the QUADPROG function when using
%   the QP method. See help optimset for more details.
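%   As a sketch of the mlp kernel formula above (illustrative only; the
%   toolbox's private mlp_kernel implementation is not shown in this
%   excerpt), K could be computed as:
%
%       function K = mlp_sketch(U, V, P1, P2)
%       % U is M-by-D and V is N-by-D, so K is M-by-N
%       K = tanh(P1*(U*V') + P2);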
%
%   SVMTRAIN(...,'SHOWPLOT',true), when used with two-dimensional data,
%   creates a plot of the grouped data and plots the separating line for
%   the classifier.
%
%   Example:
%       % Load the data and select features for classification
%       load fisheriris
%       data = [meas(:,1), meas(:,2)];
%       % Extract the Setosa class
%       groups = ismember(species,'setosa');
%       % Randomly select training and test sets
%       [train, test] = crossvalind('holdOut',groups);
%       cp = classperf(groups);
%       % Use a linear support vector machine classifier
%       svmStruct = svmtrain(data(train,:),groups(train),'showplot',true);
%       classes = svmclassify(svmStruct,data(test,:),'showplot',true);
%       % See how well the classifier performed
%       classperf(cp,classes,test);
%       cp.CorrectRate
%
%   See also CLASSIFY, KNNCLASSIFY, QUADPROG, SVMCLASSIFY.

%   Copyright 2004 The MathWorks, Inc.
%   $Revision: 1.1.12.1 $  $Date: 2004/12/24 20:43:35 $

%   References:
%     [1] Kecman, V., Learning and Soft Computing,
%         MIT Press, Cambridge, MA, 2001.
%     [2] Suykens, J.A.K., Van Gestel, T., De Brabanter, J., De Moor, B.,
%         Vandewalle, J., Least Squares Support Vector Machines,
%         World Scientific, Singapore, 2002.
%     [3] Scholkopf, B., Smola, A.J., Learning with Kernels,
%         MIT Press, Cambridge, MA, 2002.
%
%   SVMTRAIN(...,'KFUNARGS',ARGS) allows you to pass additional
%   arguments to kernel functions.
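%   A combined usage sketch of the documented name/value options
%   (illustrative only, not part of the original help; it reuses the data,
%   train and groups variables from the Example above):
%
%       svmStruct = svmtrain(data(train,:), groups(train), ...
%                            'kernel_function', 'rbf', ...
%                            'method', 'ls', 'showplot', true);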
% set defaults
plotflag = false;
qp_opts = [];
kfunargs = {};
setPoly = false; usePoly = false;
setMLP = false; useMLP = false;
% use quadratic programming only if the Optimization Toolbox is installed
if ~isempty(which('quadprog'))
    useQuadprog = true;
else
    useQuadprog = false;
end
% set default kernel function
kfun = @linear_kernel;

% check inputs
% ... (the excerpt omits the nargin check, the grp2idx conversion of GROUP
% into the index vector g and the label list groupString, and the input
% size checks) ...

% NaNs in GROUP are treated as missing values; drop the corresponding rows
nans = find(isnan(g));
if length(nans) > 0
    training(nans,:) = [];
    g(nans) = [];
end
ngroups = length(groupString);
if ngroups > 2
    error('Bioinfo:svmtrain:TooManyGroups',...
        'SVMTRAIN only supports classification into two groups.\nGROUP contains %d different groups.',ngroups)
end
% convert the group indices to +1 / -1 labels
g = 1 - (2*(g-1));
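% Illustration (not part of the original source): grp2idx numbers the two
% groups 1 and 2, and the conversion above maps index 1 -> +1 and index
% 2 -> -1, e.g. g = [1; 2; 2; 1] becomes g = [1; -1; -1; 1].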
% handle optional arguments
if numoptargs >= 1
    if rem(numoptargs,2) == 1
        error('Bioinfo:svmtrain:IncorrectNumberOfArguments',...
            'Incorrect number of arguments to %s.',mfilename);
    end
    okargs = {'kernel_function','method','showplot','kfunargs',...
        'quadprog_opts','polyorder','mlp_params'};
    for j = 1:2:numoptargs
        pname = optargs{j};
        pval = optargs{j+1};
        k = strmatch(lower(pname), okargs); %#ok
        if isempty(k)
            error('Bioinfo:svmtrain:UnknownParameterName',...
                'Unknown parameter name: %s.',pname);
        elseif length(k) > 1
            error('Bioinfo:svmtrain:AmbiguousParameterName',...
                'Ambiguous parameter name: %s.',pname);
        else
            switch(k)
                case 1 % kernel_function
                    if ischar(pval)
                        okfuns = {'linear','quadratic',...
                            'radial','rbf','polynomial','mlp'};
                        funNum = strmatch(lower(pval), okfuns); %#ok
                        if isempty(funNum)
                            funNum = 0;
                        end
                        switch funNum  % maybe make this less strict in the future
                            case 1
                                kfun = @linear_kernel;
                            case 2
                                kfun = @quadratic_kernel;
                            case {3,4}
                                kfun = @rbf_kernel;
                            case 5
                                kfun = @poly_kernel;
                                usePoly = true;
                            case 6
                                kfun = @mlp_kernel;
                                useMLP = true;
                            otherwise
                                error('Bioinfo:svmtrain:UnknownKernelFunction',...
                                    'Unknown Kernel Function %s.',kfun);
                        end
                    elseif isa(pval, 'function_handle')
                        kfun = pval;
                    else
                        error('Bioinfo:svmtrain:BadKernelFunction',...
                            'The kernel function input does not appear to be a function handle\nor valid function name.')
                    end
                case 2 % method
                    if strncmpi(pval,'qp',2)
                        useQuadprog = true;
                        if isempty(which('quadprog'))
                            warning('Bioinfo:svmtrain:NoOptim',...
                                'The Optimization Toolbox is required to use the quadratic programming method.')
                            useQuadprog = false;
                        end
                    elseif strncmpi(pval,'ls',2)
                        useQuadprog = false;
                    else
                        error('Bioinfo:svmtrain:UnknownMethod',...
                            'Unknown method option %s. Valid methods are ''QP'' and ''LS''.',pval);
                    end
                case 3 % display
                    if pval ~= 0
                        if size(training,2) == 2
                            plotflag = true;
                        else
                            warning('Bioinfo:svmtrain:OnlyPlot2D',...
                                'The display option can only plot 2D training data.')
                        end
                    end
                case 4 % kfunargs
                    if iscell(pval)
                        kfunargs = pval;
                    else
                        kfunargs = {pval};
                    end
                case 5 % quadprog_opts
                    if isstruct(pval)
                        qp_opts = pval;
                    elseif iscell(pval)
                        qp_opts = optimset(pval{:});
                    else
                        error('Bioinfo:svmtrain:BadQuadprogOpts',...
                            'QUADPROG_OPTS must be an opts structure.');
                    end
                case 6 % polyorder
                    if ~isscalar(pval) || ~isnumeric(pval)
                        error('Bioinfo:svmtrain:BadPolyOrder',...
                            'POLYORDER must be a scalar value.');
                    end
                    if pval ~= floor(pval) || pval < 1

% ... (the listing breaks off here; the rest of the argument handling,
% the mlp_params case, the plotting code and the QP / least-squares solver
% that produces the coefficients alpha and the kernel matrix kx are not
% included in this excerpt) ...

% the support vectors are the training rows whose coefficient exceeds
% a numerical tolerance
svIndex = find(abs(alpha) > sqrt(eps));
sv = training(svIndex,:);

% calculate the parameters of the separating line from the support
% vectors.
alphaHat = g(svIndex).*alpha(svIndex);

% Calculate the bias by applying the indicator function to the support
% vector with largest alpha.
[maxAlpha,maxPos] = max(alpha); %#ok
bias = g(maxPos) - sum(alphaHat.*kx(svIndex,maxPos));
% an alternative method is to average the values over all support vectors
% bias = mean(g(sv) - sum(alphaHat(:,ones(1,numSVs)).*kx(sv,sv)));
% An alternative way
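The quantities computed above (the support vectors sv, the signed weights
alphaHat, the bias, and the kernel function kfun) are what SVMCLASSIFY
needs to label new observations. As a minimal sketch of how they define
the decision rule (illustrative only, not the toolbox's svmclassify
implementation; Xnew is a hypothetical test matrix with the same number of
columns as TRAINING, and any kernel parameters in kfunargs would be passed
along as well):

    Ktest  = kfun(sv, Xnew);                   % numSV-by-numTest kernel values
    f      = (alphaHat(:)' * Ktest)' + bias;   % signed decision values
    labels = sign(f);                          % +1 / -1 group assignment

The bias is chosen so that f equals the true label (+1 or -1) exactly at
the support vector with the largest alpha, which is what the indicator
trick in the listing implements.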