To open the relevant source files in the MATLAB editor:

    >> edit svmtrain
    >> edit svmclassify
    >> edit svmpredict

The svmtrain listing follows:

function [svm_struct, svIndex] = svmtrain(training, groupnames, varargin)
%SVMTRAIN trains a support vector machine classifier
%
%   SVMStruct = SVMTRAIN(TRAINING,GROUP) trains a support vector machine
%   classifier using data TRAINING taken from two groups given by GROUP.
%   SVMStruct contains information about the trained classifier that is
%   used by SVMCLASSIFY for classification. GROUP is a column vector of
%   values of the same length as TRAINING that defines two groups. Each
%   element of GROUP specifies the group the corresponding row of TRAINING
%   belongs to. GROUP can be a numeric vector, a string array, or a cell
%   array of strings. SVMTRAIN treats NaNs or empty strings in GROUP as
%   missing values and ignores the corresponding rows of TRAINING.
%
%   SVMTRAIN(...,'KERNEL_FUNCTION',KFUN) allows you to specify the kernel
%   function KFUN used to map the training data into kernel space. The
%   default kernel function is the dot product. KFUN can be one of the
%   following strings or a function handle:
%
%       'linear'      Linear kernel or dot product
%       'quadratic'   Quadratic kernel
%       'polynomial'  Polynomial kernel (default order 3)
%       'rbf'         Gaussian Radial Basis Function kernel
%       'mlp'         Multilayer Perceptron kernel (default scale 1)
%       function      A kernel function specified using @,
%                     for example @KFUN, or an anonymous function
%
%   A kernel function must be of the form
%
%       function K = KFUN(U, V)
%
%   The returned value, K, is a matrix of size M-by-N, where U and V have
%   M and N rows respectively. If KFUN is parameterized, you can use
%   anonymous functions to capture the problem-dependent parameters. For
%   example, suppose that your kernel function is
%
%       function k = kfun(u,v,p1,p2)
%       k = tanh(p1*(u*v')+p2);
%
%   You can set values for p1 and p2 and then use an anonymous function:
%       @(u,v) kfun(u,v,p1,p2).
%
%   SVMTRAIN(...,'POLYORDER',ORDER) allows you to specify the order of a
%   polynomial kernel. The default order is 3.
%
%   SVMTRAIN(...,'MLP_PARAMS',[P1 P2]) allows you to specify the
%   parameters of the Multilayer Perceptron (mlp) kernel. The mlp kernel
%   requires two parameters, P1 and P2, where K = tanh(P1*U*V' + P2) and
%   P1 > 0 and P2 < 0. Default values are P1 = 1 and P2 = -1.
%
%   SVMTRAIN(...,'METHOD',METHOD) allows you to specify the method used
%   to find the separating hyperplane. Options are
%
%       'QP'  Use quadratic programming (requires the Optimization Toolbox)
%       'LS'  Use least-squares method
%
%   If you have the Optimization Toolbox, then the QP method is the
%   default method. If not, the only available method is LS.
%
%   SVMTRAIN(...,'QUADPROG_OPTS',OPTIONS) allows you to pass an OPTIONS
%   structure created using OPTIMSET to the QUADPROG function when using
%   the 'QP' method. See help optimset for more details.
%
%   SVMTRAIN(...,'SHOWPLOT',true), when used with two-dimensional data,
%   creates a plot of the grouped data and plots the separating line for
%   the classifier.
%
%   Example:
%       % Load the data and select features for classification
%       load fisheriris
%       data = [meas(:,1), meas(:,2)];
%       % Extract the Setosa class
%       groups = ismember(species,'setosa');
%       % Randomly select training and test sets
%       [train, test] = crossvalind('holdOut',groups);
%       cp = classperf(groups);
%       % Use a linear support vector machine classifier
%       svmStruct = svmtrain(data(train,:),groups(train),'showplot',true);
%       classes = svmclassify(svmStruct,data(test,:),'showplot',true);
%       % See how well the classifier performed
%       classperf(cp,classes,test);
%       cp.CorrectRate
%
%   See also CLASSIFY, KNNCLASSIFY, QUADPROG, SVMCLASSIFY.

%   Copyright 2004 The MathWorks, Inc.
%   $Revision: 1.1.12.1 $  $Date: 2004/12/24 20:43:35 $

%   References:
%     [1] Kecman, V, Learning and Soft Computing,
%         MIT Press, Cambridge, MA. 2001.
%     [2] Suykens, J.A.K., Van Gestel, T., De Brabanter, J., De Moor, B.,
%         Vandewalle, J., Least Squares Support Vector Machines,
%         World Scientific, Singapore, 2002.
%     [3] Scholkopf, B., Smola, A.J., Learning with Kernels,
%         MIT Press, Cambridge, MA. 2002.

%   SVMTRAIN(...,'KFUNARGS',ARGS) allows you to pass additional
%   arguments to kernel functions.

% set defaults
plotflag = false;
qp_opts = [];
kfunargs = {};
setPoly = false; usePoly = false;
setMLP = false; useMLP = false;
if ~isempty(which('quadprog'))
    useQuadprog = true;
else
    useQuadprog = false;
end
% set default kernel function
kfun = @linear_kernel;

% check inputs
if nargin < 2
    error(nargchk(2,Inf,nargin))
end

numoptargs = nargin - 2;
optargs = varargin;

% grp2idx sorts a numeric grouping var ascending, and a string grouping
% var by order of first occurrence
[g,groupString] = grp2idx(groupnames);

% check group is a vector -- though char input is special.
if ~isvector(groupnames) && ~ischar(groupnames)
    error('Bioinfo:svmtrain:GroupNotVector',...
        'Group must be a vector.');
end

% make sure that the data is correctly oriented.
if size(groupnames,1) == 1
    groupnames = groupnames';
end

% make sure data is the right size
n = length(groupnames);
if size(training,1) ~= n
    if size(training,2) == n
        training = training';
    else
        error('Bioinfo:svmtrain:DataGroupSizeMismatch',...
            'GROUP and TRAINING must have the same number of rows.')
    end
end

% NaNs are treated as unknown classes and are removed from the training
% data
nans = find(isnan(g));
if length(nans) > 0
    training(nans,:) = [];
    g(nans) = [];
end
ngroups = length(groupString);

if ngroups > 2
    error('Bioinfo:svmtrain:TooManyGroups',...
        'SVMTRAIN only supports classification into two groups.\nGROUP contains %d different groups.',ngroups)
end
% convert to 1, -1.
g = 1 - (2* (g-1));

% handle optional arguments
if numoptargs >= 1
    if rem(numoptargs,2) == 1
        error('Bioinfo:svmtrain:IncorrectNumberOfArguments',...
            'Incorrect number of arguments to %s.',mfilename);
    end
    okargs = {'kernel_function','method','showplot','kfunargs',...
        'quadprog_opts','polyorder','mlp_params'};
    for j = 1:2:numoptargs
        pname = optargs{j};
        pval = optargs{j+1};
        k = strmatch(lower(pname), okargs); %#ok
        if isempty(k)
            error('Bioinfo:svmtrain:UnknownParameterName',...
                'Unknown parameter name: %s.',pname);
        elseif length(k) > 1
            error('Bioinfo:svmtrain:AmbiguousParameterName',...
                'Ambiguous parameter name: %s.',pname);
        else
            switch(k)
                case 1 % kernel_function
                    if ischar(pval)
                        okfuns = {'linear','quadratic',...
                            'radial','rbf','polynomial','mlp'};
                        funNum = strmatch(lower(pval), okfuns); %#ok
                        if isempty(funNum)
                            funNum = 0;
                        end
                        switch funNum %maybe make this less strict in the future
                            case 1
                                kfun = @linear_kernel;
                            case 2
                                kfun = @quadratic_kernel;
                            case {3,4}
                                kfun = @rbf_kernel;
                            case 5
                                kfun = @poly_kernel;
                                usePoly = true;
                            case 6
                                kfun = @mlp_kernel;
                                useMLP = true;
                            otherwise
                                error('Bioinfo:svmtrain:UnknownKernelFunction',...
                                    'Unknown Kernel Function %s.',kfun);
                        end
                    elseif isa(pval, 'function_handle')
                        kfun = pval;
                    else
                        error('Bioinfo:svmtrain:BadKernelFunction',...
                            'The kernel function input does not appear to be a function handle\nor valid function name.')
                    end
                case 2 % method
                    if strncmpi(pval,'qp',2)
                        useQuadprog = true;
                        if isempty(which('quadprog'))
                            warning('Bioinfo:svmtrain:NoOptim',...
                                'The Optimization Toolbox is required to use the quadratic programming method.')
                            useQuadprog = false;
                        end
                    elseif strncmpi(pval,'ls',2)
                        useQuadprog = false;
                    else
                        error('Bioinfo:svmtrain:UnknownMethod',...
                            'Unknown method option %s. Valid methods are ''QP'' and ''LS''',pval);
                    end
                case 3 % display
                    if pval ~= 0
                        if size(training,2) == 2
                            plotflag = true;
                        else
                            warning('Bioinfo:svmtrain:OnlyPlot2D',...
                                'The display option can only plot 2D training data.')
                        end
                    end
                case 4 % kfunargs
                    if iscell(pval)
                        kfunargs = pval;
                    else
                        kfunargs = {pval};
                    end
                case 5 % quadprog_opts
                    if isstruct(pval)
                        qp_opts = pval;
                    elseif iscell(pval)
                        qp_opts = optimset(pval{:});
                    else
                        error('Bioinfo:svmtrain:BadQuadprogOpts',...
                            'QUADPROG_OPTS must be an opts structure.');
                    end
                case 6 % polyorder
                    if ~isscalar(pval) || ~isnumeric(pval)
                        error('Bioinfo:svmtrain:BadPolyOrder',...
                            'POLYORDER must be a scalar value.');
                    end
                    if pval ~= floor(pval) || pval < 1
                        error('Bioinfo:svmtrain:PolyOrderNotInt',...
                            'The order of the polynomial kernel must be a positive integer.')
                    end
                    kfunargs = {pval};
                    setPoly = true;
                case 7 % mlpparams
                    if numel(pval) ~= 2
                        error('Bioinfo:svmtrain:BadMLPParams',...
                            'MLP_PARAMS must be a two element array.');
                    end
                    if ~isscalar(pval(1)) || ~isscalar(pval(2))
                        error('Bioinfo:svmtrain:MLPParamsNotScalar',...
                            'The parameters of the multi-layer perceptron kernel must be scalar.');
                    end
                    kfunargs = {pval(1),pval(2)};
                    setMLP = true;
            end
        end
    end
end

if setPoly && ~usePoly
    warning('Bioinfo:svmtrain:PolyOrderNotPolyKernel',...
        'You specified a polynomial order but not a polynomial kernel');
end
if setMLP && ~useMLP
    warning('Bioinfo:svmtrain:MLPParamNotMLPKernel',...
        'You specified MLP parameters but not an MLP kernel');
end

% plot the data if requested
if plotflag
    [hAxis,hLines] = svmplotdata(training,g);
    legend(hLines,cellstr(groupString));
end

% calculate kernel function
try
    kx = feval(kfun,training,training,kfunargs{:});
    % ensure function is symmetric
    kx = (kx+kx')/2;
catch
    error('Bioinfo:svmtrain:UnknownKernelFunction',...
        'Error calculating the kernel function:\n%s\n', lasterr);
end

% create Hessian
% add small constant eye to force stability
H = ((g*g').*kx) + sqrt(eps(class(training)))*eye(n);

if useQuadprog
    % The large scale solver cannot handle this type of problem, so turn
    % it off.
    qp_opts = optimset(qp_opts,'LargeScale','Off');
    % X = QUADPROG(H,f,A,b,Aeq,beq,LB,UB,X0,opts)
    alpha = quadprog(H,-ones(n,1),[],[],...
        g',0,zeros(n,1),inf*ones(n,1),zeros(n,1),qp_opts);

    % The support vectors are the non-zeros of alpha
    svIndex = find(alpha > sqrt(eps));
    sv = training(svIndex,:);

    % calculate the parameters of the separating line from the support
    % vectors.
    alphaHat = g(svIndex).*alpha(svIndex);

    % Calculate the bias by applying the in
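The preprocessing steps in the listing (grp2idx, NaN pruning, and the "g = 1 - (2*(g-1))" conversion to +1/-1 labels) can be sketched in Python for readers without MATLAB. This is a minimal re-implementation of the idea, not the toolbox code: groups are indexed purely by first occurrence, whereas grp2idx additionally sorts numeric labels ascending.

```python
import numpy as np

def encode_groups(groupnames):
    # Sketch of grp2idx + NaN pruning + the "g = 1 - (2*(g-1))" step:
    # first-seen group -> +1, second group -> -1; missing labels are
    # dropped along with their rows (like the NaN handling in svmtrain).
    seen, idx, keep = [], [], []
    for i, name in enumerate(groupnames):
        if name is None or (isinstance(name, float) and np.isnan(name)):
            continue  # missing label: skip this row
        if name not in seen:
            seen.append(name)
        idx.append(seen.index(name) + 1)  # 1-based group index, as grp2idx returns
        keep.append(i)
    if len(seen) > 2:
        raise ValueError("SVMTRAIN only supports classification into two groups.")
    g = 1 - 2 * (np.asarray(idx) - 1)     # group 1 -> +1, group 2 -> -1
    return g, keep

g, keep = encode_groups(['setosa', 'other', None, 'setosa'])
```

`keep` carries the surviving row indices so the caller can prune TRAINING to match, mirroring `training(nans,:) = [];` in the listing.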
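The kernels named in the help text can likewise be sketched in NumPy. The exact toolbox formulas for the quadratic/polynomial kernels (e.g. whether a +1 offset is included) and the RBF scaling are assumptions here; only the mlp form K = tanh(P1*U*V' + P2) and the default order 3 are stated in the help text above.

```python
import numpy as np

def linear_kernel(u, v):
    # Dot product: K[i, j] = u_i . v_j
    return u @ v.T

def poly_kernel(u, v, order=3):
    # Polynomial kernel, default order 3 per the help text
    # (the +1 offset is an assumption, not confirmed by the listing)
    return (u @ v.T + 1.0) ** order

def rbf_kernel(u, v, sigma=1.0):
    # Gaussian RBF: exp(-||u_i - v_j||^2 / (2*sigma^2))
    sq = np.sum(u**2, 1)[:, None] + np.sum(v**2, 1)[None, :] - 2.0 * (u @ v.T)
    return np.exp(-sq / (2.0 * sigma**2))

def mlp_kernel(u, v, p1=1.0, p2=-1.0):
    # K = tanh(P1*U*V' + P2), P1 > 0 and P2 < 0, defaults per the help text
    return np.tanh(p1 * (u @ v.T) + p2)

# Parameterized kernel via a closure, mirroring "@(u,v) kfun(u,v,p1,p2)":
kfun = lambda u, v: mlp_kernel(u, v, p1=2.0, p2=-0.5)
```

The closure at the end plays the role of the MATLAB anonymous function: it fixes the problem-dependent parameters so the solver only ever sees a two-argument kernel.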
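Finally, the training step itself: the listing builds the Hessian H = (g*g').*kx plus a tiny diagonal jitter, then hands it to quadprog. Since the copy breaks off in the QP branch, here is a sketch of the 'LS' alternative the help text mentions: a least-squares SVM (after the Suykens et al. reference [2]) solves a single linear KKT system instead of a quadratic program. The value of gamma is my own choice for the toy problem, not a toolbox default.

```python
import numpy as np

# Toy two-group 2-D problem: +1 group on the right, -1 group on the left.
X = np.array([[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]])
g = np.array([1.0, 1.0, -1.0, -1.0])
n = len(g)

K = X @ X.T                                    # linear kernel matrix (kx)
H = np.outer(g, g) * K                         # (g*g').*kx, as in the listing
H += np.sqrt(np.finfo(float).eps) * np.eye(n)  # small diagonal jitter for stability

# LS-SVM KKT system:  [ 0   g'        ] [b]     [0]
#                     [ g   H + I/gam ] [a]  =  [1]
gamma = 10.0
A = np.zeros((n + 1, n + 1))
A[0, 1:] = g
A[1:, 0] = g
A[1:, 1:] = H + np.eye(n) / gamma
rhs = np.concatenate(([0.0], np.ones(n)))
sol = np.linalg.solve(A, rhs)
bias, alpha = sol[0], sol[1:]

# Decision values: f_j = sum_i alpha_i * g_i * K(i, j) + bias
# (alpha * g plays the role of alphaHat in the listing above).
f = (alpha * g) @ K + bias
pred = np.sign(f)
```

Unlike the QP dual, every training point gets a nonzero alpha here, so LS-SVM has no sparse support-vector set; that is the main trade-off against the quadprog route in the listing.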