Hidden Markov Models
Hidden Markov Model (HMM)

Outline
- The problems with the template method
- The HMM as a popular statistical tool
- Discrete-time Markov processes
- Theory of HMMs: the three basic problems

Review: the template method
- Key idea: derive typical sequences of speech frames for a pattern via some averaging procedure.
- Rely on local spectral distance measures to compare patterns.
- Use dynamic programming to align patterns in time.

Problems of the template method
- Speech is a random signal, but the template method is not a statistical method in the strict sense.
- Statistical techniques have been widely used in clustering to create reference patterns.
- The statistical signal characterization inherent in the template representation is only implicit and often inadequate: it neglects second-order statistics.
- The method lacks robustness.

HMM: a popular tool
- The basic theory of the HMM was published in a series of classic papers by Baum and his colleagues in the late 1960s and early 1970s.
- The HMM was implemented for speech-processing applications by Baker at CMU and by Jelinek and his colleagues at IBM in the 1970s.
- The HMM provides a natural and highly reliable way of recognizing speech for a wide range of applications.

The underlying assumptions of the HMM
- The speech signal can be well characterized as a parametric random process.
- The parameters of the stochastic process can be determined in a precise, well-defined manner.

Discrete-time Markov processes
A system with N discrete states indexed by {1, 2, ..., N}; q_t denotes the state at time t. For a first-order Markov chain,

    P(q_t = j | q_{t-1} = i, q_{t-2} = k, ...) = P(q_t = j | q_{t-1} = i),

and for a time-invariant system the transition probabilities

    a_ij = P(q_t = j | q_{t-1} = i),    a_ij >= 0,    sum_j a_ij = 1

do not depend on t.

Observable Markov model
Each state corresponds to an observable event. Example: the weather, observed once a day:
- State 1: rain or snow
- State 2: cloudy
- State 3: sunny
For what cases could such a model be used?
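The observable weather model can be exercised directly in code: since every state is observed, the probability of a state sequence is just a product of transition probabilities. A minimal sketch, with an invented transition matrix (the numbers are illustrative, not from the text):

```python
import numpy as np

# Invented transition matrix for the three-state weather model.
# States (0-indexed): 0 = rain/snow, 1 = cloudy, 2 = sunny.
A = np.array([[0.4, 0.3, 0.3],
              [0.2, 0.6, 0.2],
              [0.1, 0.1, 0.8]])
pi = np.array([1/3, 1/3, 1/3])   # assumed uniform initial distribution

def sequence_probability(states):
    """P(q_1, ..., q_T) for an observable Markov model: every state is
    seen directly, so the probability is a product of a_ij terms."""
    p = pi[states[0]]
    for prev, cur in zip(states, states[1:]):
        p *= A[prev, cur]
    return p

p = sequence_probability([2, 2, 0])   # sunny, sunny, rain/snow
```

Each row of A sums to one, as required of a stochastic matrix; the hidden-Markov extension below drops the assumption that the state itself is observed.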
Extensions to hidden Markov models: the urn-and-ball model
- There are N glass urns, each holding balls of M distinct colors.
- An urn is first selected at random; then a ball is chosen at random from it, and its color is recorded as the observation.
- The ball is replaced in the urn from which it was selected, and the procedure is repeated.

HMM for weather forecasting
- What operations would you design to carry out the ball selection?
- How would you extend the Markov process to an HMM to give a more precise weather forecast?

Theory of HMMs
- Topology
- Elements
- The two stochastic processes
- The three basic problems

HMM topology
- Ergodic: every state can be reached from every other state.
- Left-right: transitions proceed only from lower-numbered to higher-numbered states.
- Parallel-path left-right: several left-right paths in parallel.

Elements of an HMM
- N: the number of states in the model.
- M: the number of distinct observable symbols per state.
- The state-transition probability distribution A = {a_ij}, where a_ij = P(q_{t+1} = j | q_t = i).
- The observation symbol probability distribution B = {b_j(k)}, where b_j(k) = P(o_t = v_k | q_t = j).
- The initial state distribution π = {π_i}, where π_i = P(q_1 = i).

We use the compact notation λ = (A, B, π) to indicate the complete parameter set of the model. This parameter set defines a probability measure for O, P(O | λ), which we discuss later; we use the term HMM to refer to the parameter set and the associated probability measure interchangeably, without ambiguity.

The two stochastic processes
- The state sequence (hidden)
- The observation sequence

The three basic problems
- Evaluation: the forward procedure
- Optimal path: the Viterbi algorithm
- Training: the Baum-Welch algorithm

Problem 1 (evaluation): Given the observation sequence O = (o_1, o_2, ..., o_T) and a model λ = (A, B, π), how do we efficiently compute P(O | λ), the probability of the observation sequence given the model? We can also view this as scoring how well a given model matches a given observation sequence; solving this problem allows us to choose the model that best matches the observations.

Problem 2 (decoding): Given the observation sequence O = (o_1, o_2, ..., o_T) and the model λ, how do we choose a corresponding state sequence q = (q_1, q_2, ..., q_T) that is optimal in some sense? There is no single "correct" state sequence to find, so we usually adopt an optimality criterion and solve the problem as well as possible.

Problem 3 (training): How do we adjust the model parameters λ = (A, B, π) to maximize P(O | λ)?
In this problem we attempt to optimize the model parameters so as to best describe how a given observation sequence comes about. The observation sequence used to adjust the model parameters is called a training sequence, because it is used to train the HMM.

Probability evaluation
We wish to calculate P(O | λ), the probability of the observation sequence O = (o_1, o_2, ..., o_T) given the model. Consider one fixed state sequence

    q = (q_1, q_2, ..., q_T),

where q_1 is the initial state. The probability of the observation sequence O given this state sequence is

    P(O | q, λ) = prod_{t=1}^{T} P(o_t | q_t, λ) = b_{q_1}(o_1) b_{q_2}(o_2) ... b_{q_T}(o_T),

where we have assumed statistical independence of the observations.
The probability of such a state sequence q can be written as

    P(q | λ) = π_{q_1} a_{q_1 q_2} a_{q_2 q_3} ... a_{q_{T-1} q_T}.

The joint probability that O and q occur simultaneously is simply the product of the two terms above,

    P(O, q | λ) = P(O | q, λ) P(q | λ),

and the probability of O is obtained by summing this joint probability over all possible state sequences q:

    P(O | λ) = sum_q P(O | q, λ) P(q | λ)
             = sum_{q_1, ..., q_T} π_{q_1} b_{q_1}(o_1) a_{q_1 q_2} b_{q_2}(o_2) ... a_{q_{T-1} q_T} b_{q_T}(o_T).

Computed directly, this sum involves on the order of 2T · N^T operations and quickly becomes infeasible; the forward procedure obtains the same quantity in O(N^2 T) operations.
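The brute-force sum over all state sequences is worth seeing once in code, since it doubles as a reference value for testing the faster procedures. A sketch with an invented two-state, two-symbol model:

```python
import itertools
import numpy as np

# Invented two-state, two-symbol toy model (not from the text).
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.5, 0.5], [0.1, 0.9]])
pi = np.array([0.6, 0.4])

def brute_force_likelihood(O):
    """P(O | lambda) by summing P(O, q | lambda) over all N**T sequences q."""
    total = 0.0
    for q in itertools.product(range(len(pi)), repeat=len(O)):
        p = pi[q[0]] * B[q[0], O[0]]          # pi_{q1} * b_{q1}(o1)
        for t in range(1, len(O)):
            p *= A[q[t-1], q[t]] * B[q[t], O[t]]
        total += p
    return total

likelihood = brute_force_likelihood([0, 1, 1])
```

The enumeration over N**T sequences makes the exponential cost of the direct computation concrete; the forward procedure returns the identical value.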
A. The forward procedure
Consider the forward variable α_t(i), defined as

    α_t(i) = P(o_1 o_2 ... o_t, q_t = i | λ),

that is, the probability of the partial observation sequence o_1 o_2 ... o_t (until time t) and state i at time t, given the model. We can solve for α_t(i) inductively, as follows:

1. Initialization:
    α_1(i) = π_i b_i(o_1),    1 <= i <= N.
2. Induction:
    α_{t+1}(j) = [ sum_{i=1}^{N} α_t(i) a_ij ] b_j(o_{t+1}),    1 <= t <= T-1,  1 <= j <= N.
3. Termination:
    P(O | λ) = sum_{i=1}^{N} α_T(i).

B. The backward procedure
In a similar manner, we can consider a backward variable β_t(i), defined as

    β_t(i) = P(o_{t+1} o_{t+2} ... o_T | q_t = i, λ),

that is, the probability of the partial observation sequence from t+1 to the end, given state i at time t and the model λ. Again we can solve for β_t(i) inductively, as follows:

1. Initialization:
    β_T(i) = 1,    1 <= i <= N.
2. Induction:
    β_t(i) = sum_{j=1}^{N} a_ij b_j(o_{t+1}) β_{t+1}(j),    t = T-1, T-2, ..., 1,  1 <= i <= N.

The initialization step arbitrarily defines β_T(i) to be 1 for all i. The induction step shows that, in order to have been in state i at time t and to account for the observation sequence from time t+1 on, you have to consider all possible states j at time t+1, accounting for the transition from i to j as well as the observation o_{t+1} in state j, and then account for the remaining partial observation sequence from state j. We will see later how the backward as well as the forward calculation is used to help solve fundamental problems 2 and 3 of HMMs.

(Figure: the induction step of the backward procedure; state s_i at time t connects to states s_1, ..., s_N at time t+1 via transitions a_i1, ..., a_iN.)

There are several possible ways of solving problem 2, finding the "optimal" state sequence associated with the given observation sequence. One approach defines the a posteriori probability variable

    γ_t(i) = P(q_t = i | O, λ),

that is, the probability of being in state i at time t given the observation sequence O and the model. We can express γ_t(i) in several forms, including

    γ_t(i) = P(q_t = i, O | λ) / P(O | λ).

Since P(q_t = i, O | λ) is equal to α_t(i) β_t(i), we can write γ_t(i) as

    γ_t(i) = α_t(i) β_t(i) / P(O | λ) = α_t(i) β_t(i) / sum_{i=1}^{N} α_t(i) β_t(i),

where we see that α_t(i) accounts for the partial observation sequence o_1 ... o_t and state i at time t, while β_t(i) accounts for the remainder of the observation sequence given state i at time t. Using γ_t(i), we can solve for the individually most likely state at time t, as

    q_t* = argmax_{1 <= i <= N} γ_t(i),    1 <= t <= T.
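The forward and backward recursions, together with the posterior γ_t(i), can be sketched as follows (the two-state parameters are invented for illustration):

```python
import numpy as np

# Invented two-state, two-symbol toy model (not from the text).
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.5, 0.5], [0.1, 0.9]])
pi = np.array([0.6, 0.4])
O  = [0, 1, 1]

def forward(O):
    """alpha[t, i] = P(o_1 ... o_t, q_t = i | lambda)."""
    alpha = np.zeros((len(O), len(pi)))
    alpha[0] = pi * B[:, O[0]]                    # 1. initialization
    for t in range(1, len(O)):
        alpha[t] = (alpha[t-1] @ A) * B[:, O[t]]  # 2. induction
    return alpha

def backward(O):
    """beta[t, i] = P(o_{t+1} ... o_T | q_t = i, lambda)."""
    beta = np.ones((len(O), len(pi)))             # 1. beta_T(i) = 1
    for t in range(len(O) - 2, -1, -1):
        beta[t] = A @ (B[:, O[t+1]] * beta[t+1])  # 2. induction
    return beta

alpha, beta = forward(O), backward(O)
likelihood = alpha[-1].sum()          # 3. termination: P(O | lambda)
gamma = alpha * beta / likelihood     # posterior gamma_t(i)
best_states = gamma.argmax(axis=1)    # individually most likely states
```

Each row of gamma sums to one, since sum_i α_t(i) β_t(i) equals P(O | λ) at every t.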
A. The Viterbi algorithm
To find the single best state sequence q = (q_1, q_2, ..., q_T) for the given observation sequence O = (o_1, o_2, ..., o_T), we define the quantity

    δ_t(i) = max_{q_1, ..., q_{t-1}} P(q_1 q_2 ... q_{t-1}, q_t = i, o_1 o_2 ... o_t | λ),

that is, δ_t(i) is the best score along a single path at time t which accounts for the first t observations and ends in state i. By induction we have

    δ_{t+1}(j) = [ max_{1 <= i <= N} δ_t(i) a_ij ] b_j(o_{t+1}).

The complete procedure for finding the best state sequence can now be stated as follows:

1. Initialization:
    δ_1(i) = π_i b_i(o_1),    ψ_1(i) = 0,    1 <= i <= N.
2. Recursion:
    δ_t(j) = max_{1 <= i <= N} [ δ_{t-1}(i) a_ij ] b_j(o_t),
    ψ_t(j) = argmax_{1 <= i <= N} [ δ_{t-1}(i) a_ij ],    2 <= t <= T,  1 <= j <= N.
3. Termination:
    P* = max_{1 <= i <= N} δ_T(i),    q_T* = argmax_{1 <= i <= N} δ_T(i).
4. Path (state sequence) backtracking:
    q_t* = ψ_{t+1}(q_{t+1}*),    t = T-1, T-2, ..., 1.

It should be noted that the Viterbi algorithm is similar in implementation to the forward calculation.
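The four steps of the Viterbi algorithm map directly onto code. A sketch with invented two-state parameters:

```python
import numpy as np

# Invented two-state, two-symbol toy model (not from the text).
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.5, 0.5], [0.1, 0.9]])
pi = np.array([0.6, 0.4])

def viterbi(O):
    T, N = len(O), len(pi)
    delta = np.zeros((T, N))
    psi = np.zeros((T, N), dtype=int)
    delta[0] = pi * B[:, O[0]]                    # 1. initialization
    for t in range(1, T):                         # 2. recursion
        scores = delta[t-1][:, None] * A          # scores[i, j] = delta_{t-1}(i) a_ij
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, O[t]]
    q = [int(delta[-1].argmax())]                 # 3. termination
    for t in range(T - 1, 0, -1):                 # 4. backtracking
        q.append(int(psi[t, q[-1]]))
    return float(delta[-1].max()), q[::-1]

p_star, path = viterbi([0, 1, 1])
```

Note that the recursion is the forward induction with the sum over i replaced by a max, which is why the two procedures look so similar.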
B. Alternative Viterbi implementation
By taking logarithms of the model parameters, the Viterbi algorithm of the preceding section can be implemented without the need for any multiplications:

0. Preprocessing:
    π'_i = log π_i,    b'_i(o_t) = log b_i(o_t),    a'_ij = log a_ij.
1. Initialization:
    δ'_1(i) = π'_i + b'_i(o_1),    ψ_1(i) = 0.
2. Recursion:
    δ'_t(j) = max_{1 <= i <= N} [ δ'_{t-1}(i) + a'_ij ] + b'_j(o_t),
    ψ_t(j) = argmax_{1 <= i <= N} [ δ'_{t-1}(i) + a'_ij ],    2 <= t <= T.
3. Termination:
    log P* = max_{1 <= i <= N} δ'_T(i),    q_T* = argmax_{1 <= i <= N} δ'_T(i).
4. Backtracking:
    q_t* = ψ_{t+1}(q_{t+1}*),    t = T-1, T-2, ..., 1.
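The log-domain variant replaces every multiplication in the recursion with an addition after the one-time preprocessing step, which also avoids numerical underflow on long sequences. A sketch with the same kind of invented toy parameters:

```python
import numpy as np

# Invented toy parameters (not from the text); all entries are nonzero,
# so the logarithms below are finite.
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.5, 0.5], [0.1, 0.9]])
pi = np.array([0.6, 0.4])

# 0. Preprocessing: take logs once, so the recursion uses only additions.
logA, logB, logpi = np.log(A), np.log(B), np.log(pi)

def viterbi_log(O):
    T, N = len(O), len(pi)
    phi = np.zeros((T, N))
    psi = np.zeros((T, N), dtype=int)
    phi[0] = logpi + logB[:, O[0]]                # 1. initialization
    for t in range(1, T):                         # 2. recursion (additions only)
        scores = phi[t-1][:, None] + logA
        psi[t] = scores.argmax(axis=0)
        phi[t] = scores.max(axis=0) + logB[:, O[t]]
    q = [int(phi[-1].argmax())]                   # 3. termination
    for t in range(T - 1, 0, -1):                 # 4. backtracking
        q.append(int(psi[t, q[-1]]))
    return float(phi[-1].max()), q[::-1]

log_p_star, path = viterbi_log([0, 1, 1])
```

Since the logarithm is monotonic, the recovered path is identical to that of the probability-domain algorithm, and exp(log P*) equals P*.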
1. Application areas of HMMs
- Time-series modeling
- Acoustic modeling for speech recognition
- Language modeling
- Communication systems
- Biological signal processing
- Handwritten character recognition
- Face recognition: feature extraction (Ferdinando Samaria et al., Olivetti Research, Ltd.)
- Gesture recognition

1.1 HMMs in biological signal processing
- Protein and nucleic acid sequence analysis (Washington University)
- Recognition of human genes in DNA (University of California)
- Detecting remote protein homologies (UCSC)
- Estimating amino acid distributions

1.2 HMMs in gesture recognition
Hand motion is an effective means of human communication in the real world.

2. Training criteria for HMMs
- ML: maximum likelihood
- MDI: minimum discrimination information
- MMI: maximum mutual information
- MMD: maximum model distance
- CT: corrective training
- MCE: minimum classification error

ML: maximum likelihood
The standard ML design criterion uses a training sequence of observations O to derive the set of model parameters, yielding

    λ_ML = argmax_λ P(O | λ).

Any of the reestimation algorithms discussed previously provides a solution to this optimization problem.

MDI: minimum discrimination information
The minimum discrimination information is a measure of the closeness between two probability measures under a given constraint R; it is defined by

    ν(R, P_λ) = inf_{Q in R} D(Q || P_λ),

where D(Q || P_λ) denotes the discrimination information (Kullback-Leibler divergence) between the measures Q and P_λ.

MMI: maximum mutual information
Whereas the standard ML criterion estimates each word model from its own training data, the MMI criterion considers all models jointly. The mutual information between an observation sequence O^v and the word v, parameterized by the model set Λ = {λ_1, ..., λ_V}, is

    I_Λ(O^v, v) = log P(O^v | λ_v) - log sum_{w=1}^{V} P(O^v | λ_w) P(w),

and the MMI criterion is to find the entire model set Λ such that this mutual information is maximized.

3. Implementation issues for HMMs
1. Scaling
2. Multiple observation sequences
3. Initial estimates of HMM parameters
4. Effects of insufficient training data
5. Choice of model

3.1 Scaling
Writing ^α_t(i) for the scaled forward variable: initially, for t = 1, we set ^α_1(i) = α_1(i). For each subsequent t we first compute the induction term from the previously scaled variables,

    α?_t(i) = sum_{j=1}^{N} ^α_{t-1}(j) a_ji b_i(o_t),

then determine the scaling coefficient as

    c_t = 1 / sum_{i=1}^{N} α?_t(i),

giving

    ^α_t(i) = c_t α?_t(i).

Each ^α_t(i) thus equals α_t(i) multiplied by c_1 c_2 ... c_t, and each scaled backward variable ^β_t(i) equals β_t(i) multiplied by c_t c_{t+1} ... c_T; in terms of the scaled variables the reestimation formulas keep their form, because the scale factors cancel between numerator and denominator.

The only real change to the HMM procedure because of scaling is the computation of P(O | λ). We cannot merely sum up the ^α_T(i) terms, because these are already scaled. However, we can use the property

    prod_{t=1}^{T} c_t * sum_{i=1}^{N} α_T(i) = 1,

and thus we have

    P(O | λ) = 1 / prod_{t=1}^{T} c_t,    or    log P(O | λ) = - sum_{t=1}^{T} log c_t.

3.2 Multiple observation sequences
The major problem with left-right models is that one cannot use a single observation sequence to train the model. This is because the transient nature of the states within the model allows only a small number of observations for any state. Hence, to have sufficient data to make reliable estimates of all model parameters, one has to use multiple observation sequences.

3.3 Initial estimates of HMM parameters
How do we choose initial estimates of the HMM parameters so that the local maximum found by training is equal to, or as close as possible to, the global maximum of the likelihood function? Experience has shown that either random or uniform initial estimates of the π and A parameters are adequate for giving useful reestimates of these parameters in almost all cases. For the B parameters, however, experience has shown that good initial estimates are helpful in the discrete-symbol case and essential in the continuous-distribution case.

4. An HMM system for isolated word recognition
1. Choice of model parameters
2. Segmental k-means segmentation with clustering
3. Incorporation of state duration into the HMM
4. HMM isolated-digit performance

4.1 HMM recognizer of isolated words
To do isolated word speech recognition, we must perform the following:
1. For each word v in the vocabulary, build an HMM λ_v; that is, estimate the model parameters (A, B, π) that optimize the likelihood of the training-set observation vectors for the v-th word.
2. For each unknown word to be recognized, carry out the processing shown in Figure 4.1: measurement of the observation sequence O, via a feature analysis of the speech corresponding to the word; followed by calculation of the model likelihoods P(O | λ_v) for all possible models, 1 <= v <= V; followed by selection of the word whose model likelihood is highest, that is,

    v* = argmax_{1 <= v <= V} P(O | λ_v).

(Figure: block diagram of an isolated word HMM recognizer.)

4.2 Choice of model parameters
One figure shows a plot of average word error rate versus N for the recognition of isolated digits (average word error rate for a digits vocabulary versus the number of states N in the HMM, after Rabiner et al. [18]). The error is somewhat insensitive to N, achieving a local minimum at N = 6; however, the differences in error rate for values of N close to 6 are small.

Another figure compares the marginal distributions of the model density against a histogram of the actual observations within a state. The observation vectors are ninth order, and the model density uses M = 5 mixtures; the covariance matrices are constrained to be diagonal for each individual mixture. The results shown are for the first model state of the word "zero."
A further figure shows a curve of average word error rate versus the parameter ε (on a log scale) for a standard word-recognition experiment. Over a very broad range of ε the average error rate remains at about a constant value; however, when ε is set to 0 (i.e., the observation probabilities are not floored), the error rate increases sharply. Similarly, for continuous densities it is important to constrain the mixture gains as well as the diagonal covariance coefficients to be greater than or equal to some minimum values.

4.3 The segmental k-means training procedure
A figure (next page) shows a log-energy plot, an accumulated log-likelihood plot, and a state segmentation for one occurrence of the word "six"; the states correspond roughly to the sounds in the spoken word. Segmenting each of the training sequences assigns its observations to each of the N states j according to the current model; the reestimation procedure then uses these assignments to reestimate all model parameters, and the resulting model is compared to the previous model by computing a distance score that reflects the statistical similarity of the two HMMs. The segmental k-means training procedure is used to estimate the parameter values for the optimal continuous mixture-density fit to a finite number of observation sequences.

4.4 Incorporation of state duration into the HMM
A typical set of state-duration histograms for a five-state model of the word "six" is shown in the accompanying figure: the first two states account for the initial /s/; the third state accounts for the transition to the vowel /i/; the fourth state accounts for the vowel; and the fifth state accounts for the stop and the final /s/ sound.
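To close, the scaling recursion of Section 3.1 and the decision rule of Section 4.1 combine naturally: score each word model with a scaled forward pass and pick the argmax. A sketch in which the vocabulary ("yes", "no") and all parameter values are invented for illustration:

```python
import numpy as np

def log_likelihood(A, B, pi, O):
    """Scaled forward pass (Section 3.1): normalize alpha at every frame
    and accumulate the log norms. Here norm = 1/c_t in the text's
    notation, so log P(O | lambda) = sum_t log(norm_t)."""
    alpha = pi * B[:, O[0]]
    norm = alpha.sum()
    alpha, log_p = alpha / norm, np.log(norm)
    for t in range(1, len(O)):
        alpha = (alpha @ A) * B[:, O[t]]
        norm = alpha.sum()
        alpha, log_p = alpha / norm, log_p + np.log(norm)
    return log_p

# Two hypothetical word models; the words and numbers are invented.
models = {
    "yes": (np.array([[0.7, 0.3], [0.4, 0.6]]),
            np.array([[0.5, 0.5], [0.1, 0.9]]),
            np.array([0.6, 0.4])),
    "no":  (np.array([[0.9, 0.1], [0.2, 0.8]]),
            np.array([[0.8, 0.2], [0.3, 0.7]]),
            np.array([0.5, 0.5])),
}

def recognize(O):
    """Section 4.1 decision rule: pick the word whose model scores highest."""
    return max(models, key=lambda v: log_likelihood(*models[v], O))
```

Because the per-frame normalization keeps alpha bounded, the score stays finite even for sequences far too long for the unscaled forward pass.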