Support Vector Machine: MATLAB Implementation Source Code
Note: only five of the original document's nine pages were captured; the listing below is therefore an excerpt, and several sections of the file are missing.


To inspect the toolbox sources, open them in the MATLAB editor:

edit svmtrain
edit svmclassify
edit svmpredict

What follows is the (partially excerpted) source of svmtrain:

function [svm_struct, svIndex] = svmtrain(training, groupnames, varargin)
%SVMTRAIN trains a support vector machine classifier
%   SVMStruct = SVMTRAIN(TRAINING,GROUP) trains a support vector machine
%   classifier using data TRAINING taken from two groups given by GROUP.
%   SVMStruct contains information about the trained classifier that is
%   used by SVMCLASSIFY for classification. GROUP is a column vector of
%   values of the same length as TRAINING that defines two groups. Each
%   element of GROUP specifies the group the corresponding row of TRAINING
%   belongs to. GROUP can be a numeric vector, a string array, or a cell
%   array of strings. SVMTRAIN treats NaNs or empty strings in GROUP as
%   missing values and ignores the corresponding rows of TRAINING.
%
%   SVMTRAIN(...,'KERNEL_FUNCTION',KFUN) allows you to specify the kernel
%   function KFUN used to map the training data into kernel space. The
%   default kernel function is the dot product. KFUN can be one of the
%   following strings or a function handle:
%
%       'linear'      Linear kernel or dot product
%       'quadratic'   Quadratic kernel
%       'polynomial'  Polynomial kernel (default order 3)
%       'rbf'         Gaussian Radial Basis Function kernel
%       'mlp'         Multilayer Perceptron kernel (default scale 1)
%       function      A kernel function specified using @,
%                     for example @KFUN, or an anonymous function
%
%   A kernel function must be of the form
%
%       function K = KFUN(U, V)
%
%   The returned value, K, is a matrix of size M-by-N, where U and V have M
%   and N rows respectively. If KFUN is parameterized, you can use
%   anonymous functions to capture the problem-dependent parameters. For
%   example, suppose that your kernel function is
%
%       function k = kfun(u,v,p1,p2)
%       k = tanh(p1*(u*v')+p2);
%
%   You can set values for p1 and p2 and then use an anonymous function:
%   @(u,v) kfun(u,v,p1,p2).
%
%   SVMTRAIN(...,'POLYORDER',ORDER) allows you to specify the order of a
%   polynomial kernel. The default order is 3.
%
%   SVMTRAIN(...,'MLP_PARAMS',[P1 P2]) allows you to specify the
%   parameters of the Multilayer Perceptron (mlp) kernel. The mlp kernel
%   requires two parameters, P1 and P2, where K = tanh(P1*U*V' + P2) and
%   P1 > 0 and P2 < 0. Default values are P1 = 1 and P2 = -1.
%
%   SVMTRAIN(...,'METHOD',METHOD) allows you to specify the method used
%   to find the separating hyperplane. Options are
%
%       'QP'  Use quadratic programming (requires the Optimization Toolbox)
%       'LS'  Use least-squares method
%
%   If you have the Optimization Toolbox, then the QP method is the default
%   method. If not, the only available method is LS.
%
%   SVMTRAIN(...,'QUADPROG_OPTS',OPTIONS) allows you to pass an OPTIONS
%   structure created using OPTIMSET to the QUADPROG function when using
%   the QP method. See help optimset for more details.
%
%   SVMTRAIN(...,'SHOWPLOT',true), when used with two-dimensional data,
%   creates a plot of the grouped data and plots the separating line for
%   the classifier.
%
%   Example:
%       % Load the data and select features for classification
%       load fisheriris
%       data = [meas(:,1), meas(:,2)];
%       % Extract the Setosa class
%       groups = ismember(species,'setosa');
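To make the kernel options above concrete, here is a minimal NumPy sketch of what the main kernel strings compute. The linear and mlp forms follow the help text directly; the exact offset and scale conventions of the toolbox's polynomial and rbf kernels are assumptions, not taken from this listing:

```python
import numpy as np

def linear_kernel(u, v):
    """Dot-product kernel, the svmtrain default: K(i,j) = u_i . v_j."""
    return u @ v.T

def poly_kernel(u, v, order=3):
    """Polynomial kernel; default order 3, as with the POLYORDER option.
    The (1 + u.v)^order form is an assumed convention."""
    return (1.0 + u @ v.T) ** order

def rbf_kernel(u, v, sigma=1.0):
    """Gaussian radial basis function kernel; unit scale is an assumption."""
    sq_dist = (np.sum(u**2, axis=1)[:, None]
               + np.sum(v**2, axis=1)[None, :] - 2.0 * (u @ v.T))
    return np.exp(-sq_dist / (2.0 * sigma**2))

def mlp_kernel(u, v, p1=1.0, p2=-1.0):
    """Multilayer perceptron kernel, K = tanh(P1*U*V' + P2), with the
    documented defaults P1 = 1 and P2 = -1."""
    return np.tanh(p1 * (u @ v.T) + p2)

# U has M = 3 rows and V has N = 2 rows, so every kernel returns M-by-N,
# exactly the K = KFUN(U, V) contract described in the help text.
U = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
V = np.array([[1.0, 1.0], [2.0, 0.0]])
K = linear_kernel(U, V)
print(K.shape)  # (3, 2)
```

The parameterized mlp_kernel also shows why the help text suggests anonymous functions: a two-argument closure such as `lambda u, v: mlp_kernel(u, v, 2.0, -0.5)` is the Python analogue of `@(u,v) kfun(u,v,p1,p2)`.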

%       % Randomly select training and test sets
%       [train, test] = crossvalind('holdOut',groups);
%       cp = classperf(groups);
%       % Use a linear support vector machine classifier
%       svmStruct = svmtrain(data(train,:),groups(train),'showplot',true);
%       classes = svmclassify(svmStruct,data(test,:),'showplot',true);
%       % See how well the classifier performed
%       classperf(cp,classes,test);
%       cp.CorrectRate
%
%   See also CLASSIFY, KNNCLASSIFY, QUADPROG, SVMCLASSIFY.

%   Copyright 2004 The MathWorks, Inc.
%   $Revision: 1.1.12.1 $  $Date: 2004/12/24 20:43:35 $

%   References:
%     [1] Kecman, V., Learning and Soft Computing,
%         MIT Press, Cambridge, MA, 2001.
%     [2] Suykens, J.A.K., Van Gestel, T., De Brabanter, J., De Moor, B.,
%         Vandewalle, J., Least Squares Support Vector Machines,
%         World Scientific, Singapore, 2002.
%     [3] Scholkopf, B., Smola, A.J., Learning with Kernels,
%         MIT Press, Cambridge, MA, 2002.
%
%   SVMTRAIN(...,'KFUNARGS',ARGS) allows you to pass additional
%   arguments to kernel functions.

% set defaults
plotflag = false;
qp_opts = [];
kfunargs = {};
setPoly = false; usePoly = false;
setMLP = false; useMLP = false;
if ~isempty(which('quadprog'))
    useQuadprog = true;
else
    useQuadprog = false;
end
% set default kernel function
kfun = @linear_kernel;

% check inputs
if nargin < 2
    error(nargchk(2, Inf, nargin))
end

% ... (argument bookkeeping, grouping via grp2idx, and size checks are
% missing from this excerpt) ...

% rows with missing (NaN) group values are dropped
if length(nans) > 0
    training(nans,:) = [];
    g(nans) = [];
end
ngroups = length(groupString);
if ngroups > 2
    error('Bioinfo:svmtrain:TooManyGroups',...
        ['SVMTRAIN only supports classification into two groups.\n',...
         'GROUP contains %d different groups.'],ngroups)
end
% convert the group indices (1 or 2) to labels 1 and -1
g = 1 - (2* (g-1));

% handle optional arguments
if numoptargs >= 1
    if rem(numoptargs,2) == 1
        error('Bioinfo:svmtrain:IncorrectNumberOfArguments',...
            'Incorrect number of arguments to %s.',mfilename);
    end
    okargs = {'kernel_function','method','showplot','kfunargs',...
        'quadprog_opts','polyorder','mlp_params'};
    for j = 1:2:numoptargs
        pname = optargs{j};
        pval = optargs{j+1};
        k = strmatch(lower(pname), okargs); %#ok
        if isempty(k)
            error('Bioinfo:svmtrain:UnknownParameterName',...
                'Unknown parameter name: %s.',pname);
        elseif length(k) > 1
            error('Bioinfo:svmtrain:AmbiguousParameterName',...
                'Ambiguous parameter name: %s.',pname);
        else
            switch(k)
                case 1 % kernel_function
                    if ischar(pval)
                        okfuns = {'linear','quadratic',...
                            'radial','rbf','polynomial','mlp'};
                        funNum = strmatch(lower(pval), okfuns); %#ok
                        if isempty(funNum)
                            funNum = 0;
                        end
                        switch funNum % maybe make this less strict in the future
                            case 1
                                kfun = @linear_kernel;
                            case 2
                                kfun = @quadratic_kernel;
                            case {3,4}
                                kfun = @rbf_kernel;
                            case 5
                                kfun = @poly_kernel;
                                usePoly = true;
                            case 6
                                kfun = @mlp_kernel;
                                useMLP = true;
                            otherwise
                                error('Bioinfo:svmtrain:UnknownKernelFunction',...
                                    'Unknown Kernel Function %s.',kfun);
                        end
                    elseif isa(pval, 'function_handle')
                        kfun = pval;
                    else
                        error('Bioinfo:svmtrain:BadKernelFunction',...
                            ['The kernel function input does not appear to be ',...
                             'a function handle\nor valid function name.'])
                    end
                case 2 % method
                    if strncmpi(pval,'qp',2)
                        useQuadprog = true;
                        if isempty(which('quadprog'))
                            warning('Bioinfo:svmtrain:NoOptim',...
                                ['The Optimization Toolbox is required ',...
                                 'to use the quadratic programming method.'])
                            useQuadprog = false;
                        end
                    elseif strncmpi(pval,'ls',2)
                        useQuadprog = false;
                    else
                        error('Bioinfo:svmtrain:UnknownMethod',...
                            ['Unknown method option %s. ',...
                             'Valid methods are ''QP'' and ''LS''.'],pval);
                    end
                case 3 % display
                    if pval ~= 0
                        if size(training,2) == 2
                            plotflag = true;
                        else
                            warning('Bioinfo:svmtrain:OnlyPlot2D',...
                                'The display option can only plot 2D training data.')
                        end
                    end
                case 4 % kfunargs
                    if iscell(pval)
                        kfunargs = pval;
                    else
                        kfunargs = {pval};
                    end
                case 5 % quadprog_opts
                    if isstruct(pval)
                        qp_opts = pval;
                    elseif iscell(pval)
                        qp_opts = optimset(pval{:});
                    else
                        error('Bioinfo:svmtrain:BadQuadprogOpts',...
                            'QUADPROG_OPTS must be an opts structure.');
                    end
                case 6 % polyorder
                    if ~isscalar(pval) || ~isnumeric(pval)
                        error('Bioinfo:svmtrain:BadPolyOrder',...
                            'POLYORDER must be a scalar value.');
                    end
                    if pval ~= floor(pval) || pval < 1
                        % ... (the rest of this check, the mlp_params case,
                        % and the entire QP/LS solver section are missing
                        % from this excerpt) ...
                    end
            end
        end
    end
end

% the excerpt resumes where the support vectors are extracted from the
% dual solution alpha
svIndex = find(alpha > sqrt(eps));
sv = training(svIndex,:);

% calculate the parameters of the separating line from the support
% vectors.
alphaHat = g(svIndex).*alpha(svIndex);

% Calculate the bias by applying the indicator function to the support
% vector with largest alpha.
[maxAlpha,maxPos] = max(alpha); %#ok
bias = g(maxPos) - sum(alphaHat.*kx(svIndex,maxPos));
% an alternative method is to average the values over all support vectors
% bias = mean(g(sv) - sum(alphaHat(:,ones(1,numSVs)).*kx(sv,sv)));
% An alternative way

(The excerpt breaks off here, mid-comment; the remainder of the file, including the plotting code and the construction of svm_struct, is not included.)
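The option-parsing loop above resolves parameter names by unambiguous case-insensitive prefix; that is what `strmatch(lower(pname), okargs)` does. A small Python sketch of the same lookup rule, using the names from okargs (the `match_param` helper itself is illustrative, not part of the toolbox):

```python
OK_ARGS = ['kernel_function', 'method', 'showplot', 'kfunargs',
           'quadprog_opts', 'polyorder', 'mlp_params']

def match_param(pname):
    """Case-insensitive prefix match, like strmatch(lower(pname), okargs):
    exactly one candidate must start with the given prefix."""
    prefix = pname.lower()
    hits = [i for i, name in enumerate(OK_ARGS) if name.startswith(prefix)]
    if not hits:
        raise ValueError('Unknown parameter name: %s.' % pname)
    if len(hits) > 1:
        raise ValueError('Ambiguous parameter name: %s.' % pname)
    return hits[0]

print(match_param('meth'))  # 1 -> 'method'
print(match_param('poly'))  # 5 -> 'polyorder'
```

Note that a bare prefix like 'm' raises the "ambiguous" error, since both 'method' and 'mlp_params' match, mirroring the Bioinfo:svmtrain:AmbiguousParameterName branch.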
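The closing lines recover the bias from the dual solution. Below is the same arithmetic in NumPy on a small hand-worked toy problem; the dual coefficients `alpha` are supplied directly because the QP/LS solve is among the sections missing from the excerpt:

```python
import numpy as np

# Toy 2-D problem, labels already converted to {+1, -1} as in svmtrain.
X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [2.0, 3.0]])
g = np.array([-1.0, -1.0, 1.0, 1.0])

# Optimal dual coefficients for this data with a linear kernel,
# worked out by hand (the QP solve itself is not shown).
alpha = np.array([0.0, 0.4, 0.4, 0.0])

kx = X @ X.T  # linear kernel matrix over the training data
sv_index = np.flatnonzero(alpha > np.sqrt(np.finfo(float).eps))
alpha_hat = g[sv_index] * alpha[sv_index]  # alphaHat = g(svIndex).*alpha(svIndex)

# Bias from the support vector with the largest alpha ...
max_pos = np.argmax(alpha)
bias = g[max_pos] - alpha_hat @ kx[sv_index, max_pos]

# ... or averaged over all support vectors (the commented-out alternative).
# Both come out at -1.4 for this data.
bias_avg = np.mean(g[sv_index] - alpha_hat @ kx[np.ix_(sv_index, sv_index)])

# Sanity check: the margins at the two support vectors are exactly -1 and +1.
f = alpha_hat @ kx[np.ix_(sv_index, sv_index)] + bias
```

The averaged form is numerically the safer choice when the solver returns an approximate alpha, which is presumably why the original author left it in as a comment.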
