Research on Transfer Learning Algorithms (遷移學習算法研究)
Fuzhen Zhuang (莊福振), Institute of Computing Technology, Chinese Academy of Sciences
April 18, 2016

Traditional Supervised Machine Learning (1/2) [from Prof. Qiang Yang]
Training data -> classifier -> unseen data: the prediction is good as long as the unseen data resemble the training data. What if they do not?

Traditional Supervised Machine Learning (2/2)
Traditional supervised learning rests on two basic assumptions: the training and test data come from the same source and are independent and identically distributed, and sufficiently many labeled training samples are available. In real applications these assumptions often cannot be satisfied.

Transfer Learning
A practical learning scenario: HP news and Lenovo news come from different sources and follow inconsistent distributions, and manually labeling training samples is time-consuming and labor-intensive. Transfer learning is a machine learning approach that reuses existing knowledge to solve problems in different but related domains, relaxing the two basic assumptions of traditional machine learning.

Transfer Learning Scenarios (1/4)
Transfer learning scenarios are everywhere: transferring knowledge across image classification tasks, across news web-page classification tasks (HP news vs. Lenovo news), and even across heterogeneous feature spaces.

Transfer Learning Scenarios (2/4) [from Prof. Qiang Yang]
Training on text, predicting on images. "The apple is the pomaceous fruit of the apple tree, species Malus domestica in the rose family Rosaceae..."; "Banana is the common name for a type of fruit and also the herbaceous plants of the genus Musa which produce this commonly eaten fruit..." Training data: text descriptions of Apples and Bananas; future data: images of them.
Xin Jin, Fuzhen Zhuang, Sinno Jialin Pan, Changying Du, Ping Luo, Qing He: Heterogeneous Multi-task Semantic Feature Learning for Classification. CIKM 2015: 1847-1850.

Transfer Learning Scenarios (3/4) [from Prof. Qiang Yang]
A sentiment classifier trained and tested on Electronics reviews reaches 84.60% accuracy; the same classifier trained on DVD reviews and tested on Electronics drops to 72.65%. Accuracy drops when the training and test domains differ.

Transfer Learning Scenarios (4/4) [from Prof. Qiang Yang]
DVD, Electronics, Book, Kitchen, Clothes, Video game, Fruit, Hotel, Tea, ... labeling enough training data for every new domain is impractical!

Outline
Concept Learning for Transfer Learning
  Concept Learning based on Non-negative Matrix Tri-factorization for Transfer Learning
  Concept Learning based on Probabilistic Latent Semantic Analysis for Transfer Learning
Transfer Learning using Auto-encoders
  Transfer Learning from Multiple Sources with Autoencoder Regularization
  Supervised Representation Learning: Transfer Learning with Deep Auto-encoders

Concept Learning based on Non-negative Matrix Tri-factorization for Transfer Learning

Introduction
Many traditional learning techniques work well only under the assumption that training and test data follow the same distribution.

Enterprise news classification (documents from different companies), with classes such as "Product announcement", "Business scandal", "Acquisition", ...
Training (labeled) -> classifier -> test (unlabeled):
HP news (product announcement): "HP's just-released LaserJet Pro P1100 printer and the LaserJet Pro M1130 and M1210 multifunction printers, price ... performance ..."
Lenovo news: "... Announcement for Lenovo ThinkPad / ThinkCentre - price $150 off Lenovo K300 desktop using coupon code ..."; "Lenovo ThinkPad / ThinkCentre - price $200 off Lenovo IdeaPad U450p laptop using ... their performance"
The two collections follow different distributions, so a classifier trained on HP news fails on Lenovo news.

Motivation (1/3): Example Analysis

The HP and Lenovo product-announcement news above share some common words (announcement, price, performance, ...), and their domain-specific terms map to the same "Product" word concept:
HP news: LaserJet, printer, price, performance
Lenovo news: ThinkPad, ThinkCentre, price, performance
Both word sets indicate the "Product" word concept, which is related to the "Product announcement" document class.

Motivation (2/3): Example Analysis

HP: LaserJet, printer, price, performance, et al.; Lenovo: ThinkPad, ThinkCentre, price, performance, et al.
The words expressing the same word concept are domain-dependent, while the association between word concepts (e.g., "Product") and document classes (e.g., "Product announcement") is domain-independent.

Motivation (3/3)
Further observations:
Different domains may use the same keywords to express the same concept (denoted as identical concepts).
Different domains may also use different keywords to express the same concept (denoted as alike concepts).
Different domains may also have their own distinct concepts (denoted as distinct concepts).
The identical and alike concepts are used as the shared concepts for knowledge transfer. We try to model these three kinds of concepts simultaneously for transfer-learning text classification.

Preliminary Knowledge
Basic formula of matrix tri-factorization: X ≈ F S G^T, where the input X is the word-document co-occurrence matrix, F denotes the word-concept information (which may vary across domains), G denotes the document classification information, and S is the association between word concepts and document classes, which may stay stable across domains.

Previous method: MTrick (SDM 2010)
Sketch map of MTrick: the source domain Xs is factorized into Fs, S, Gs and the target domain Xt into Ft, S, Gt, and knowledge is transferred through the shared factor S.
Optimization problem of MTrick: G0 is the supervision information, and the association S is shared as the bridge to transfer knowledge. MTrick thus considers the alike concepts.
Dual Transfer Learning (DTL) (Long et al., SDM 2012) considers both identical and alike concepts.
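To make the basic tri-factorization above concrete, the following is a minimal NumPy sketch of the plain, unsupervised, single-domain non-negative tri-factorization X ≈ F S G^T with standard multiplicative updates. It only illustrates the building block: the actual MTrick/TriTL objectives add supervision on G and share factors across domains, and the function name `tri_factorize` is ours, not from the papers.

```python
import numpy as np

def tri_factorize(X, k1, k2, n_iter=200, eps=1e-9, seed=0):
    """Plain non-negative matrix tri-factorization  X ~= F S G^T.

    X : (n_words, n_docs) non-negative word-document co-occurrence matrix
    F : (n_words, k1)     word-concept matrix
    S : (k1, k2)          association between word concepts and document classes
    G : (n_docs, k2)      document-class matrix
    """
    rng = np.random.default_rng(seed)
    n_w, n_d = X.shape
    F = rng.random((n_w, k1))
    S = rng.random((k1, k2))
    G = rng.random((n_d, k2))
    for _ in range(n_iter):
        # Multiplicative updates for min ||X - F S G^T||_F^2 with F, S, G >= 0.
        F *= (X @ G @ S.T) / (F @ S @ (G.T @ G) @ S.T + eps)
        G *= (X.T @ F @ S) / (G @ S.T @ (F.T @ F) @ S + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ (G.T @ G) + eps)
    return F, S, G

# Example usage on random data; on real text, documents could be assigned to
# the concept/class with the largest entry in the corresponding row of G.
X = np.random.default_rng(1).random((500, 80))
F, S, G = tri_factorize(X, k1=10, k2=2)
print(G.argmax(axis=1)[:10])
```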

Triplex Transfer Learning (TriTL) (1/5)
Further divide the word concepts into three kinds: F1, identical concepts; F2, alike concepts; F3, distinct concepts.
Input: s source domains Xr (1 ≤ r ≤ s) with label information, and t target domains Xr (s+1 ≤ r ≤ s+t).
We propose the Triplex Transfer Learning framework based on matrix tri-factorization (TriTL for short).

TriTL (2/5): Optimization Problem
F1, S1 and S2 are shared as the bridge for knowledge transfer across domains, and the supervision information is integrated through Gr (1 ≤ r ≤ s) in the source domains.

TriTL (3/5)
We develop an alternately iterative algorithm to derive the solution and theoretically analyze its convergence.

TriTL (4/5): Classification on target domains
When 1 ≤ r ≤ s, Gr contains the label information, so it is kept unchanged during the iterations: if xi belongs to class j, then Gr(i, j) = 1, otherwise Gr(i, j) = 0.
After the iterations we obtain the output Gr (s+1 ≤ r ≤ s+t), and classification on the target domains is performed according to Gr (a sketch of both steps follows below).

TriTL (5/5): Analysis of Algorithm Convergence
Following the convergence-analysis methodology of [Lee et al., NIPS'01] and [Ding et al., KDD'06], the following theorem holds.
Theorem (Convergence): After each round of the iterative update formulas, the objective function of the optimization problem converges monotonically.
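Referring back to TriTL (4/5), the role of Gr can be summarized with two small helpers; this is a sketch with hypothetical names rather than code from the paper, assuming integer class labels 0..n_classes-1.

```python
import numpy as np

def init_G_from_labels(labels, n_classes):
    """One-hot supervision matrix G_r for a labeled source domain:
    G_r[i, j] = 1 iff document x_i belongs to class j; it is held fixed
    during the alternating iterations."""
    G = np.zeros((len(labels), n_classes))
    G[np.arange(len(labels)), np.asarray(labels)] = 1.0
    return G

def predict_from_G(G_target):
    """After convergence, read the target-domain labels from the learned
    G_r (s+1 <= r <= s+t) by taking the row-wise argmax."""
    return G_target.argmax(axis=1)
```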

Data Preparation (1/3)
20 Newsgroups: four top categories (rec, sci, comp, talk), each containing four sub-categories: rec.autos, rec.motorcycles, rec.sport.baseball, rec.sport.hockey; sci.crypt, sci.electronics, sci.med, sci.space; comp.graphics, comp.sys.ibm.pc.hardware, comp.sys.mac.hardware, comp.windows.x; talk.politics.misc, talk.politics.guns, talk.politics.mideast, talk.religion.misc.
Sentiment classification: four domains (books, dvd, electronics, kitchen). Randomly selecting two domains as sources and the rest as targets yields 6 problems.

Data Preparation (2/3): Constructing classification tasks (traditional transfer learning)
Example for rec + sci: the source domain pairs one rec sub-category with one sci sub-category (e.g., baseball + crypt), and the target domain pairs two different sub-categories (e.g., autos + space). For a classification problem with one source domain and one target domain we can construct 144 problems (4 × 3 choices on the rec side times 4 × 3 choices on the sci side).

Data Preparation (3/3): Constructing new transfer learning problems
The source and target domains are built from sub-categories of different top categories (rec, sci, comp, talk), e.g., source: baseball + crypt, target: autos + graphics. In such problems more distinct concepts may exist.

Compared Algorithms
Traditional learning algorithms. Supervised learning: Logistic Regression (LR) [David et al., 00], Support Vector Machine (SVM) [Joachims, ICML'99]; semi-supervised learning: TSVM [Joachims, ICML'99].
Transfer learning methods: CoCC [Dai et al., KDD'07], DTL [Long et al., SDM'12].
Classification accuracy is used as the evaluation measure.

Experimental Results (1/3)
Sort the problems by the accuracy of LR as a proxy for the degree of transfer difficulty: generally, a lower LR accuracy indicates a harder transfer problem, while a higher one indicates an easier problem.

Experimental Results (2/3)
Comparison among TriTL, DTL, MTrick, CoCC, TSVM, SVM and LR on the dataset rec vs. sci (144 problems): TriTL performs well even when the accuracy of LR is lower than 65%.

Experimental Results (3/3)
Results on the new transfer learning problems: we only select the problems whose LR accuracies fall in (50%, 55%] (only slightly better than random classification, hence likely much more difficult), which gives 65 problems. TriTL also outperforms all the baselines.

Conclusions
Explicitly define three kinds of word concepts, i.e., identical, alike and distinct concepts.
Propose a general transfer learning framework based on non-negative matrix tri-factorization that simultaneously models the three kinds of concepts (TriTL).
Extensive experiments show the effectiveness of the proposed approach, especially when distinct concepts exist.

Concept Learning based on Probabilistic Latent Semantic Analysis for Transfer Learning

Motivation: revisiting the example

The HP and Lenovo product announcements above share common words (announcement, price, performance, ...), and their domain-specific terms (LaserJet, printer vs. ThinkPad, ThinkCentre) all indicate the "Product" word concept, which is related to the "Product announcement" document class.

Preliminary Knowledge (1/3): Some notations
w: word; d: document; r: domain; y: document class; z: word concept.
Some definitions: p(w|z, r), e.g., p(price|Product), p(LaserJet|Product, HP); p(z|y), e.g., p(Product|Product announcement).

Preliminary Knowledge (2/3)
The "Product" concept is expressed by LaserJet, printer, announcement, price in HP news (p(w|z, r1)) and by ThinkPad, ThinkCentre, announcement, price in Lenovo news (p(w|z, r2)); both are associated with the "Product announcement" class through p(z|y).
p(w|z, r1) ≠ p(w|z, r2), e.g., p(LaserJet|Product, HP) ≠ p(LaserJet|Product, Lenovo).
p(z|y, r1) = p(z|y, r2), e.g., p(Product|Product announcement, HP) = p(Product|Product announcement, Lenovo).
This is an alike concept.

Preliminary Knowledge (3/3): Dual PLSA (D-PLSA)
Joint probability over all variables: p(w, d, z, y) = p(w|z) p(z|y) p(d|y) p(y).
Given a data domain X, the maximum log-likelihood problem is log p(X; θ) = log Σ_Z p(Z, X; θ), where θ includes all the parameters p(w|z), p(z|y), p(d|y), p(y), and Z denotes all the latent variables.
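As a sanity check of the D-PLSA factorization above, the data log-likelihood over an observed word-document co-occurrence matrix can be evaluated as below; this is a minimal sketch with a hypothetical helper name, covering only the basic D-PLSA model (not the full HIDC extension).

```python
import numpy as np

def dplsa_loglik(N, p_w_z, p_z_y, p_d_y, p_y, eps=1e-12):
    """Log-likelihood of a word-document co-occurrence matrix under D-PLSA.

    N      : (n_words, n_docs) co-occurrence counts
    p_w_z  : (n_words, n_z)    p(w|z)
    p_z_y  : (n_z, n_y)        p(z|y)
    p_d_y  : (n_docs, n_y)     p(d|y)
    p_y    : (n_y,)            p(y)
    """
    # p(w, d) = sum_{z, y} p(w|z) p(z|y) p(d|y) p(y)
    p_wd = np.einsum('wz,zy,dy,y->wd', p_w_z, p_z_y, p_d_y, p_y)
    return float((N * np.log(p_wd + eps)).sum())
```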

The proposed transfer learning algorithm based on D-PLSA is denoted HIDC.

HIDC (1/3)
Identical concept: p(w|za), p(za|y); both the extension (which words express the concept) and the intension (how the concept is associated with document classes) are domain-independent.
Alike concept: p(w|zb, r), p(zb|y); the extension is domain-dependent, while the intension is domain-independent.

HIDC (2/3)
Distinct concept: p(w|zc, r), p(zc|y, r); both the extension and the intension are domain-dependent.
The joint probabilities of these three graphical models (one per concept type) together define the HIDC model.

HIDC (3/3)
Given s + t data domains X = {X1, ..., Xs, Xs+1, ..., Xs+t}; without loss of generality, the first s domains are source domains and the remaining t domains are target domains. Considering the three kinds of concepts, the log-likelihood function is log p(X; θ) = log Σ_Z p(Z, X; θ), where θ includes all parameters p(w|za), p(w|zb, r), p(w|zc, r), p(za|y), p(zb|y), p(zc|y, r), p(d|y, r), p(y|r), p(r).

Model Solution (1/4)
We use the EM algorithm to derive the solutions. E-Step: compute the posteriors of the latent variables given the current parameters.
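For intuition, the E-step posterior for the basic D-PLSA model introduced above follows directly from Bayes' rule; the sketch below (hypothetical helper name, dense arrays, suitable only for small vocabularies) is not the HIDC E-step, which additionally distinguishes the identical, alike and distinct concept variables.

```python
import numpy as np

def dplsa_e_step(p_w_z, p_z_y, p_d_y, p_y, eps=1e-12):
    """E-step posterior for D-PLSA: p(z, y | w, d) proportional to
    p(w|z) p(z|y) p(d|y) p(y).  Returns shape (n_words, n_docs, n_z, n_y)."""
    joint = np.einsum('wz,zy,dy,y->wdzy', p_w_z, p_z_y, p_d_y, p_y)
    return joint / (joint.sum(axis=(2, 3), keepdims=True) + eps)
```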

Model Solution (2/4)
M-Step: update all the parameters by maximizing the expected complete-data log-likelihood.

Model Solution (3/4)
Semi-supervised EM algorithm: when r is a source domain, the label information is known, so p(d|y, r) is fixed and p(y|r) can be inferred:
p(d|y, r) = 1/n_{y,r} if d belongs to class y in domain r, and p(d|y, r) = 0 otherwise, where n_{y,r} is the number of documents of class y in domain r;
p(y|r) = n_{y,r}/n_r, where n_r is the number of documents in domain r.
For source domains, p(d|y, r) and p(y|r) are kept unchanged during the iterations, which supervises the optimization process.
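The fixed source-domain quantities above can be computed from the labels as in the following sketch (hypothetical helper name, assuming integer class labels 0..n_classes-1).

```python
import numpy as np

def source_supervision(labels, n_classes):
    """Fix p(d|y, r) and p(y|r) for a labeled source domain r:
    p(d|y, r) = 1/n_{y,r} if d belongs to class y in domain r, else 0;
    p(y|r)    = n_{y,r} / n_r."""
    labels = np.asarray(labels)
    n_r = len(labels)
    p_d_y = np.zeros((n_r, n_classes))
    p_y = np.zeros(n_classes)
    for y in range(n_classes):
        idx = np.where(labels == y)[0]
        if len(idx) > 0:
            p_d_y[idx, y] = 1.0 / len(idx)
            p_y[y] = len(idx) / n_r
    return p_d_y, p_y   # kept unchanged during the EM iterations
```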

Model Solution (4/4): Classification for target domains
After obtaining the final solutions of p(w|za), p(w|zb, r), p(w|zc, r), p(za|y), p(zb|y), p(zc|y, r), p(d|y, r), p(y|r), p(r), we can compute the conditional class probabilities of each target-domain document, and the final prediction is the class with the largest probability.
During the iterations, all domains share p(w|za), p(za|y) and p(zb|y), which act as the bridge for knowledge transfer.

Compared Algorithms
Baselines. Supervised learning: Logistic Regression (LR) [David et al., 00], Support Vector Machine (SVM) [Joachims, ICML'99]; semi-supervised learning: TSVM [Joachims, ICML'99]; transfer learning: CoCC [Dai et al., KDD'07], CD-PLSA [Zhuang et al., CIKM'10], DTL [Long et al., SDM'12].
Our method: HIDC. Measure: classification accuracy.

Experimental Results (1/5) and (2/5)
Results on the new transfer learning problems: we select the problems whose LR accuracies are higher than 50%, which gives 334 problems.

Experimental Results (3/5) and (4/5)
Source domain S: (rec.autos, sci.space); target domain T: (rec.sport.hockey, talk.politics.mideast). The learned concepts include alike concepts shared between S and T as well as distinct concepts specific to each domain.

Experimental Results (5/5)
Indeed, the proposed probabilistic method HIDC is also better than TriTL. This may be because HIDC has a clearer probabilistic interpretation: should one impose p1(z, y) = p2(z, y) or p1(z|y) = p2(z|y), given that p(z, y) = p(z|y) p(y)?

References
[1] Fuzhen Zhuang, Ping Luo, Hui Xiong, Qing He, Yuhong Xiong, Zhongzhi Shi: Exploiting Associations between Word Clusters and Document Classes for Cross-Domain Text Categorization. SDM 2010, pp. 13-24.
[2] Fuzhen Zhuang, Ping Luo, Zhiyong Shen, Qing He, Yuhong Xiong, Zhongzhi Shi, Hui Xiong: Collaborative Dual-PLSA: Mining Distinction and Commonality across Multiple Domains for Text Classification. CIKM 2010, pp. 359-368.
[3] Fuzhen Zhuang, Ping Luo, Zhiyong Shen, Qing He, Yuhong Xiong, Zhongzhi Shi, Hui Xiong: Mining Distinction and Commonality across Multiple Domains Using Generative Model for Text Classification. IEEE Trans. Knowl. Data Eng. 24(11): 2025-2039 (2012).
[4] Fuzhen Zhuang, Ping Luo, Changying Du, Qing He, Zhongzhi Shi: Triplex Transfer Learning: Exploiting Both Shared and Distinct Concepts for Text Classification. WSDM 2013, pp. 425-434.
[5] Fuzhen Zhuang, Ping Luo, Peifeng Yin, Qing He, Zhongzhi Shi: Concept Learning for Cross-domain Text Classification: A General Probabilistic Framework. IJCAI 2013, pp. 1960-1966.

Outline (revisited)
Concept Learning for Transfer Learning (above); Transfer Learning using Auto-encoders: Transfer Learning from Multiple Sources with Autoencoder Regularization; Supervised Representation Learning: Transfer Learning with Deep Auto-encoders.

Transfer Learning from Multiple Sources with Autoencoder Regularization

Motivation (1/2)
Transfer learning based on the original feature space may fail to achieve high performance on target-domain data. We consider the autoencoder technique to collaboratively find a new representation of both source- and target-domain data. Example reviews (Electronics vs. Video Games):

Electronics review: "Compact; easy to operate; very good picture, excited about the quality; looks sharp!"
Video Games review: "A very good game! It is action packed and full of excitement. I am very much hooked on this game."

Motivation (2/2)
Previous methods often transfer from one source domain to one target domain. We consider a consensus-regularized framework for learning from multiple source domains (e.g., DVD, Book, Kitchen, Electronics) and propose a transfer learning framework of consensus-regularization autoencoders to learn from multiple sources.

Autoencoder Neural Network
The solution is derived by minimizing the reconstruction error Σ_i || x_i − g(W2 h(W1 x_i + b1) + b2) ||², where h and g are nonlinear activation functions (e.g., the sigmoid function) for encoding and decoding.

Consensus Measure (1/3)
Example: a three-class classification problem in which three classifiers f1, f2, f3 (trained on Source 1: D1, Source 2: D2, Source 3: D3) predict unlabeled instances. The constraint encourages the classifiers to agree, as for x1-x3 below, rather than disagree, as for x4-x6:

        f1  f2  f3            f1  f2  f3
  x1     1   1   1      x4     2   3   1
  x2     3   3   3      x5     3   1   3
  x3     2   2   2      x6     1   2   3

Consensus Measure (2/3)
Entropy-based consensus measure (Luo et al., CIKM'08): for the combined prediction on an instance x, minimal entropy corresponds to maximal consensus and maximal entropy to minimal consensus; θi is the parameter vector of classifier i and C is the class label set.

Consensus Measure (3/3)
For simplicity, the consensus measure for binary classification can be rewritten in a closed form. In this work, we impose the consensus regularization on autoencoders and try to improve learning performance from multiple source domains, since their effects on making the predictions consistent are similar.
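The two ingredients just described can be sketched as follows; this is a minimal illustration, not the paper's exact formulation: the autoencoder loss assumes separate encode/decode weights with sigmoid activations, and the consensus score is one plausible instantiation of the entropy-based measure (negative entropy of the averaged class distributions).

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def reconstruction_error(X, W1, b1, W2, b2):
    """Basic autoencoder loss: sum_i || x_i - g(W2 h(W1 x_i + b1) + b2) ||^2,
    with sigmoid activations h and g for encoding and decoding."""
    H = sigmoid(X @ W1 + b1)          # hidden representation
    X_hat = sigmoid(H @ W2 + b2)      # reconstruction
    return float(((X - X_hat) ** 2).sum())

def consensus(prob_list):
    """Entropy-style consensus over the predictions of several classifiers on
    the same (target) instances: average the predicted class distributions and
    score agreement as negative entropy (low entropy = high consensus)."""
    P = np.mean(np.stack(prob_list, axis=0), axis=0)   # (n_instances, n_classes)
    entropy = -(P * np.log(P + 1e-12)).sum(axis=1)
    return float(-entropy.mean())
```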

Some Notations
Source domains: r labeled source domains, each with its own data matrix.
Target domain: an unlabeled target domain with its corresponding data matrix.
The goal is to train a classifier f that makes precise predictions on the target-domain data.

Framework of CRA
The data from all source and target domains share the same encoding and decoding weights, and the classifiers trained on the new representation are regularized to predict the same results on the target-domain data.

Optimization Problem of CRA
The optimization problem combines four terms: the reconstruction error over all domains, the consensus regularization on the target-domain predictions, the total loss of the source classifiers over the corresponding source-domain data in the hidden representation, and a weight-decay term.

The Solution of CRA
We use the gradient descent method, with a fixed learning rate, to derive the solution of all parameters. The time complexity is O(rnmk). The output consists of the encoding and decoding parameters and the source classifiers on the latent representation.
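The sketch below writes out the forward value of a CRA-style objective with the four terms listed above; it is an assumed instantiation (softmax source classifiers, entropy-based consensus, our own names such as `cra_objective`, `enc`, `clf_list`), not the paper's implementation, and in practice the gradients would be obtained by differentiating this value for gradient descent.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def softmax(a):
    a = a - a.max(axis=1, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=1, keepdims=True)

def cra_objective(Xs_list, Ys_list, Xt, enc, dec, clf_list,
                  lam_cls=1.0, lam_con=1.0, lam_wd=1e-4):
    """Forward value of a CRA-style objective:
    shared reconstruction over all domains + per-source classification loss
    on the hidden representation + consensus regularization on target
    predictions + weight decay.
    enc = (W1, b1), dec = (W2, b2), clf_list[i] = (V_i, c_i); Ys_list holds
    integer class labels for each source domain."""
    W1, b1 = enc
    W2, b2 = dec
    # 1) reconstruction error: all domains share W1, b1, W2, b2
    recon = 0.0
    for X in Xs_list + [Xt]:
        H = sigmoid(X @ W1 + b1)
        recon += ((X - sigmoid(H @ W2 + b2)) ** 2).sum()
    # 2) source-classifier loss on the hidden representation
    cls = 0.0
    for X, y, (V, c) in zip(Xs_list, Ys_list, clf_list):
        P = softmax(sigmoid(X @ W1 + b1) @ V + c)
        cls += -np.log(P[np.arange(len(y)), y] + 1e-12).sum()
    # 3) consensus regularization: entropy of the averaged target predictions
    Ht = sigmoid(Xt @ W1 + b1)
    Pt = np.stack([softmax(Ht @ V + c) for V, c in clf_list]).mean(axis=0)
    consensus = (-(Pt * np.log(Pt + 1e-12)).sum(axis=1)).mean()
    # 4) weight decay on the weight matrices
    wd = sum((p ** 2).sum() for p in [W1, W2] + [V for V, _ in clf_list])
    return recon + lam_cls * cls + lam_con * consensus + lam_wd * wd
```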

Target Classifier Construction
Two schemes: (1) train the source classifiers on the hidden representation and combine their predictions (e.g., by a weighted vote); (2) combine all the source-domain data (in the hidden space) as ZS and train a unified classifier using any supervised learning algorithm, e.g., SVM or Logistic Regression (LR). The two resulting accuracies are denoted CRAv and CRAu, respectively.

Data Sets (1/2)
Image data (from Luo et al., CIKM'08), with sub-domains A1-A4 and B1-B4. Example setting: three sources A1B1, A2B2, A3B3 and target domain A4B4. In total, 96 3-source vs. 1-target domain (3 vs 1) problem instances can be constructed for the experimental evaluation.

Data Sets (2/2)
Sentiment classification (from Blitzer et al., ACL'07): four 3-source vs. 1-target domain classification problems are constructed from DVD, Book, Kitchen and Electronics. The accuracy on the target-domain data is used as the evaluation measure, and both SVM and LR are used to train classifiers on the new representation.

All Compared Algorithms
Baselines: supervised learning on the original features: SVM [Joachims, ICML'99], Logistic Regression (LR) [David et al., 00]; embedding method based on autoencoders (EAER) [Yu et al., ECML'13]; Marginalized Stacked Denoising Autoencoders (mSDA) [Chen et al., ICML'12]; Transfer Component Analysis (TCA) [Pan et al., TNN'11]; transfer learning from multiple sources (CCR3) [Luo et al., CIKM'08].
Our methods: CRAv and CRAu.
For the methods that cannot handle multiple sources, we train classifiers from each source domain and from the merged data of all sources (r + 1 accuracies); the maximal, mean and minimal values are reported.

Experimental Results (1/2)
Results on the 96 image classification problems (Transfer Learning with Multiple Sources via Consensus Regularization Autoencoders; Fuzhen Zhuang, Xiaohu Cheng, Sinno Jialin Pan, Wenchao Yu, Qing He, and Zhongzhi Shi).

Experimental Results (2/2)
Results on the 4 sentiment classification problems.

Conclusions
The well-known representation learning technique, the autoencoder, is adopted, and we formalize autoencoders and consensus regularization into a unified optimization framework. Extensive comparison experiments on image and sentiment data show the effectiveness of the proposed algorithm.

Supervised Representation Learning: Transfer Learning with Deep Auto-encoders

Motivation
Limitation of the basic autoencoder: the autoencoder is an unsupervised feature learning algorithm and cannot effectively make use of label information.
Contribution of this work: we extend the autoencoder to a multi-layer structure and incorporate the labels as one layer.

Framework of TLDA (1/5)
The source and target domains share the encoding and decoding weights; the hidden-layer space is constrained with a KL divergence term; and the label layer is constrained with a multi-class (softmax) regression model.

Framework of TLDA (2/5): Deep Autoencoder
The objective is to minimize the reconstruction error.

KL Divergence
The KL divergence measures the difference between two probability distributions and is computed as KL(P || Q) = Σ_j P_j log(P_j / Q_j). This KL divergence does not satisfy the properties of a conventional distance measure (in particular, it is not symmetric).
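For reference, a minimal implementation of the KL divergence used to constrain the hidden layer in TLDA; the standard definition shown here, with a small epsilon for numerical stability, is an assumption of this sketch rather than the slides' exact formulation.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence KL(p || q) = sum_j p_j * log(p_j / q_j) between two
    discrete distributions; note that it is not symmetric in p and q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float((p * np.log((p + eps) / (q + eps))).sum())
```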
