Central South University
Pattern Recognition and Machine Learning Lab Report
Class:        Student ID:        Name:        Advisor:

Programming Exercise 1: Linear Regression

Introduction

In this exercise, you will implement linear regression and get to see it work on data. Before starting on this programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

To get started with the exercise, you will need to download the starter code and unzip its contents to the directory where you wish to complete the exercise. If needed, use the cd command in Octave to change to this directory before starting this exercise. You can also find instructions for installing Octave on the "Octave Installation" page on the course website.

Files included in this exercise

ex1.m - Octave script that will help step you through the exercise
ex1_multi.m - Octave script for the later parts of the exercise
ex1data1.txt - Dataset for linear regression with one variable
ex1data2.txt - Dataset for linear regression with multiple variables
submit.m - Submission script that sends your solutions to our servers
[*] warmUpExercise.m - Simple example function in Octave
[*] plotData.m - Function to display the dataset
[*] computeCost.m - Function to compute the cost of linear regression
[*] gradientDescent.m - Function to run gradient descent
[$] computeCostMulti.m - Cost function for multiple variables
[$] gradientDescentMulti.m - Gradient descent for multiple variables
[$] featureNormalize.m - Function to normalize features
[$] normalEqn.m - Function to compute the normal equations

* indicates files you will need to complete
$ indicates extra credit exercises

Throughout the exercise, you will be using the scripts ex1.m and ex1_multi.m. These scripts set up the dataset for the problems and make calls to functions that you will write. You do not need to modify either of them. You are only required to modify functions in other files, by following the instructions in this assignment.

For this programming exercise, you are only required to complete the first part of the exercise to implement linear regression with one variable. The second part of the exercise, which you may complete for extra credit, covers linear regression with multiple variables.

After completing the code according to the experiment requirements, the files are as follows:

(1) computeCost.m

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

res = X * theta - y;        % residual vector h(x) - y
J = (res' * res) / (2 * m); % squared-error cost

% =========================================================================

end
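A quick way to check computeCost is to evaluate it on a tiny dataset whose cost can be computed by hand. The values below are invented for illustration and are not part of the assignment data:

X = [1 1; 1 2; 1 3];           % three examples, with a leading column of ones
y = [1; 2; 3];
J = computeCost(X, y, [0; 0])  % residuals are -1, -2, -3, so J = 14 / (2*3), about 2.333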
(2) plotData.m

function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
%   PLOTDATA(x, y) plots the data points and gives the figure axes labels of
%   population and profit.

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
%               "figure" and "plot" commands. Set the axes labels using
%               the "xlabel" and "ylabel" commands. Assume the
%               population and revenue data have been passed in
%               as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
%       appear as red crosses. Furthermore, you can make the
%       markers larger by using plot(..., 'rx', 'MarkerSize', 10);

figure; % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10);
xlabel('Population of City in 10,000s');
ylabel('Profit in $10,000s');

% ============================================================

end

(3) warmUpExercise.m

function A = warmUpExercise()
%WARMUPEXERCISE Example function in Octave
%   A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix
%               In Octave, we return values by defining which variables
%               represent the return values (at the top of the file)
%               and then set them accordingly.

A = eye(5);

% ===========================================

end

(4) featureNormalize.m

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.

mu = mean(X_norm);
sigma = std(X_norm);
X_norm = (X_norm - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);

% ============================================================

end

(5) computeCostMulti.m

function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

res = X * theta - y;
J = (res' * res) / (2 * m);

% =========================================================================

end

(6) gradientDescent.m

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.

    theta = theta - alpha / m * ((X * theta - y)' * X)';

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
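The update implemented above is the vectorized rule theta := theta - (alpha/m) * X' * (X*theta - y). A minimal run on a hand-made dataset (invented values, shown only to illustrate convergence; ex1.m performs the real run on ex1data1.txt):

X = [1 1; 1 2; 1 3];
y = [1; 2; 3];   % exactly y = 0 + 1*x, so theta should approach [0; 1]
[theta, J_history] = gradientDescent(X, y, zeros(2, 1), 0.1, 1500);
% J_history should decrease monotonically when alpha is small enough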
(7) gradientDescentMulti.m

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.

    t = ((X * theta - y)' * X)'; % n x 1 gradient scaled by m
    theta = theta - alpha / m * t;

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end

(8) normalEqn.m

function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X, y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.

theta = pinv(X' * X) * X' * y;

% ============================================================

end

The overall run results are shown in the figure.

Programming Exercise 2: Logistic Regression

Introduction

In this exercise, you will implement logistic regression and apply it to two different datasets. Before starting on the programming exercise, we strongly recommend watching the video lectures and completing the review questions for the associated topics.

To get started with the exercise, you will need to download the starter code and unzip its contents to the directory where you wish to complete the exercise. If needed, use the cd command in Octave to change to this directory before starting this exercise. You can also find instructions for installing Octave on the "Octave Installation" page on the course website.

Files included in this exercise

ex2.m - Octave script that will help step you through the exercise
ex2_reg.m - Octave script for the later parts of the exercise
ex2data1.txt - Training set for the first half of the exercise
ex2data2.txt - Training set for the second half of the exercise
submitWeb.m - Alternative submission script
submit.m - Submission script that sends your solutions to our servers
mapFeature.m - Function to generate polynomial features
plotDecisionBoundary.m - Function to plot classifier's decision boundary
[*] plotData.m - Function to plot 2D classification data
[*] sigmoid.m - Sigmoid function
[*] costFunction.m - Logistic regression cost function
[*] predict.m - Logistic regression prediction function
[*] costFunctionReg.m - Regularized logistic regression cost

* indicates files you will need to complete

Throughout the exercise, you will be using the scripts ex2.m and ex2_reg.m. These scripts set up the dataset for the problems and make calls to functions that you will write. You do not need to modify either of them. You are only required to modify functions in other files, by following the instructions in this assignment.

After completing the code according to the experiment requirements, the files are as follows:

(1) plotData.m

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(x, y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be an Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.

positive = find(y == 1);
negative = find(y == 0);
plot(X(positive, 1), X(positive, 2), 'k+', 'MarkerSize', 7, 'LineWidth', 2);
plot(X(negative, 1), X(negative, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

% =========================================================================

hold off;

end

(2) sigmoid.m

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));

% =============================================================

end
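Because sigmoid must work elementwise on scalars, vectors, and matrices, a quick illustrative check (values chosen by hand, not from the assignment):

sigmoid(0)            % exactly 0.5
sigmoid([-10 0 10])   % approximately [0.0000 0.5000 1.0000]; large |z| saturates toward 0 or 1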
(3) costFunction.m

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%       (the regularized variant belongs in costFunctionReg.m)

hypothesis = sigmoid(X * theta);
J = 1 / m * sum(-y .* log(hypothesis) - (1 - y) .* log(1 - hypothesis));

n = size(X, 2);
for i = 1:n
    grad(i) = 1 / m * dot(hypothesis - y, X(:, i));
end

% =============================================================

end

(4) predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%   regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================

p = sigmoid(X * theta) >= 0.5;

% =========================================================================

end
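For completeness, a sketch of how ex2.m typically uses these functions to train and evaluate the classifier (the optimset options follow the course handout; X is assumed to already include the intercept column):

% Train with Octave's built-in unconstrained minimizer
options = optimset('GradObj', 'on', 'MaxIter', 400);
initial_theta = zeros(size(X, 2), 1);
[theta, cost] = fminunc(@(t) costFunction(t, X, y), initial_theta, options);

% Evaluate on the training set
p = predict(theta, X);
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);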
