Hidden Markov Model

Hidden Markov Model (HMM)

Outline
- The problems with the template method
- HMM as a popular statistical tool
- Discrete-time Markov processes
- Theory of HMM: the three basic problems

Review: the template method
Key idea:
- Derive typical sequences of speech frames for a pattern via some averaging procedure
- Rely on the use of local spectral distance measures to compare patterns
- Use dynamic programming to temporally align patterns

Problems of the template method
- Speech is a stochastic signal, but the template method is not a statistical method in the strict sense
- Statistical techniques have been widely used in clustering to create reference patterns
- The statistical signal characterization inherent in the template representation is only implicit and often inadequate: it neglects second-order statistics
- Lack of robustness

HMM: a popular tool
- The basic theory of HMM was published in a series of classic papers by Baum and his colleagues in the late 1960s and early 1970s
- HMM was implemented for speech-processing applications by Baker at CMU, and by Jelinek and his colleagues at IBM in the 1970s
- HMM provides a natural and highly reliable way of recognizing speech for a wide range of applications

The underlying assumptions of the HMM:
- The speech signal can be well characterized as a parametric random process
- The parameters of the stochastic process can be determined in a precise, well-defined manner

Discrete-Time Markov Process
A system with N discrete states indexed by {1, 2, …, N}; q_t denotes the state at time t. Under the first-order Markov assumption,
P(q_t = j | q_{t-1} = i, q_{t-2} = k, …) = P(q_t = j | q_{t-1} = i).
Is the system time-invariant, i.e., is a_ij = P(q_t = j | q_{t-1} = i) independent of t?

Observable Markov Model
- Each state corresponds to an observable event
- Example: weather
  - State 1: rain or snow
  - State 2: cloudy
  - State 3: sunny
- The weather is observed once a day
- For what cases could such a model be used?

Extensions to Hidden Markov Models
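To make the observable weather model concrete, here is a minimal sketch (the transition and initial probabilities are illustrative assumptions, not values from the slides). Because every state is directly observable, the probability of a weather sequence is just the initial probability times the product of transition probabilities along the path:

```python
# Observable Markov weather model: states 0 = rain/snow, 1 = cloudy,
# 2 = sunny (0-indexed). All probabilities are illustrative assumptions.
A = [  # A[i][j] = P(state j tomorrow | state i today); each row sums to 1
    [0.4, 0.3, 0.3],
    [0.2, 0.6, 0.2],
    [0.1, 0.1, 0.8],
]
pi = [0.3, 0.3, 0.4]  # initial state distribution

def sequence_probability(states):
    """P(q1, q2, ..., qT) for a fully observable Markov chain."""
    p = pi[states[0]]
    for prev, cur in zip(states, states[1:]):
        p *= A[prev][cur]
    return p

# Probability of observing sunny, sunny, rain/snow:
p = sequence_probability([2, 2, 0])
print(round(p, 3))  # 0.4 * 0.8 * 0.1 = 0.032
```

In a hidden Markov model, by contrast, the state sequence is not observed, so this direct product is no longer available and P(O|λ) must be summed over all possible state sequences.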

-- The Urn-and-Ball Model
- N glass urns, each with M distinct-color balls
- An urn is randomly selected first, and then a ball is chosen at random; its color is recorded as the observation
- The ball is then replaced in the urn from which it was selected
- The procedure is repeated

HMM for weather forecasting
- What operations would you design to carry out the ball selection?
- How would you extend the Markov process to an HMM to give a more precise weather forecast?

Theory of HMM
- Topology
- Elements
- Bi-hidden processes
- Three basic problems

HMM topology: ergodic
HMM topology: left-right
HMM topology: parallel-path left-right

Elements of an HMM
- N: the number of states per model
- M: the number of observable symbols per state
- A = {a_ij}: the state-transition probability distribution, where a_ij = P(q_{t+1} = j | q_t = i)
- B = {b_j(k)}: the observation symbol probability distribution, where b_j(k) = P(o_t = v_k | q_t = j)
- π = {π_i}: the initial state distribution, where π_i = P(q_1 = i)

We use the compact notation λ = (A, B, π) to indicate the complete parameter set of the model. This parameter set, of course, defines a probability measure for O, P(O|λ), which we discuss later. We use the terminology HMM to indicate the parameter set λ and the associated probability measure interchangeably, without ambiguity.

Bi-hidden processes
- The states
- The observations

The Three Basic Problems
- Evaluation: the forward procedure
- Optimal path: the Viterbi algorithm
- Training: the Baum-Welch algorithm

Problem 1 (Evaluation): Given the observation sequence O = (o_1, o_2, …, o_T) and a model λ = (A, B, π), how do we efficiently compute P(O|λ), the probability of the observation sequence given the model? We can also view this as the problem of scoring how well a given model matches a given observation sequence. Solving it allows us to choose the model that best matches the observations.

Problem 2 (Decoding): Given the observation sequence O = (o_1, o_2, …, o_T) and the model λ, how do we choose a corresponding state sequence q = (q_1, q_2, …, q_T) that is optimal in some sense? There is no single "correct" state sequence to find, so we usually use an optimality criterion to solve this problem as well as possible.

Problem 3 (Training): How do we adjust the model parameters λ = (A, B, π) to maximize P(O|λ)?
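Before turning to the three problems in detail, the urn-and-ball generative process described above can be sketched in a few lines; the urn count, transition matrix, and color distributions below are illustrative assumptions. Only the ball colors are emitted: the urn (state) sequence stays hidden.

```python
import random

# Urn-and-ball HMM sketch: N = 2 urns (hidden states), M = 3 ball colors
# (observation symbols). All probabilities are illustrative assumptions.
A  = [[0.7, 0.3], [0.4, 0.6]]            # urn-to-urn transition probabilities
B  = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]  # B[j][k] = P(color k | urn j)
pi = [0.6, 0.4]                          # which urn is selected first

def sample(dist, rng):
    """Draw an index from a discrete probability distribution."""
    r, acc = rng.random(), 0.0
    for i, p in enumerate(dist):
        acc += p
        if r < acc:
            return i
    return len(dist) - 1

def generate(T, seed=0):
    """Generate T observations: pick an urn, draw a ball, replace it, repeat."""
    rng = random.Random(seed)
    states, obs = [], []
    q = sample(pi, rng)
    for _ in range(T):
        states.append(q)
        obs.append(sample(B[q], rng))  # ball color drawn from the current urn
        q = sample(A[q], rng)          # move to the next urn
    return states, obs

states, obs = generate(10)
print(states, obs)
```

Running it prints the hidden urn sequence alongside the observed color sequence; a recognizer only ever sees the latter.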

In this problem we attempt to optimize the model parameters to best describe how a given observation sequence comes about. The observation sequence used to adjust the model parameters is called a training sequence, because it is used to train the HMM.

Probability Evaluation
We wish to calculate P(O|λ), the probability of the observation sequence given the model. Consider one such fixed state sequence
q = (q_1, q_2, …, q_T),
where q_1 is the initial state. The probability of the observation sequence O given the state sequence q is
P(O|q, λ) = ∏_{t=1}^{T} b_{q_t}(o_t),
where we have assumed statistical independence of observations. Thus we get
P(O|q, λ) = b_{q_1}(o_1) b_{q_2}(o_2) ⋯ b_{q_T}(o_T).

Probability Evaluation
The probability of such a state sequence q can be written as
P(q|λ) = π_{q_1} a_{q_1 q_2} a_{q_2 q_3} ⋯ a_{q_{T-1} q_T}.
The joint probability of O and q, i.e., the probability that O and q occur simultaneously, is simply the product of the above two terms:
P(O, q|λ) = P(O|q, λ) P(q|λ).

Probability Evaluation
The probability of O given the model is obtained by summing this joint probability over all possible state sequences q, giving
P(O|λ) = Σ_q P(O|q, λ) P(q|λ) = Σ_{q_1, …, q_T} π_{q_1} b_{q_1}(o_1) a_{q_1 q_2} b_{q_2}(o_2) ⋯ a_{q_{T-1} q_T} b_{q_T}(o_T).
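The summation over all possible state sequences can be written down directly as code. This brute-force sketch (model numbers are illustrative assumptions) enumerates all N^T state sequences, which is exactly the exponential cost that the forward procedure avoids:

```python
from itertools import product

# Brute-force evaluation of P(O | lambda): enumerate every state sequence q,
# compute pi_{q1} b_{q1}(o1) a_{q1 q2} b_{q2}(o2) ... and sum.
# Model numbers are illustrative assumptions; feasible only for tiny T.
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]
pi = [0.6, 0.4]

def evaluate_brute_force(obs):
    N, T = len(pi), len(obs)
    total = 0.0
    for q in product(range(N), repeat=T):  # every possible state sequence
        p = pi[q[0]] * B[q[0]][obs[0]]     # pi_{q1} * b_{q1}(o1)
        for t in range(1, T):
            p *= A[q[t-1]][q[t]] * B[q[t]][obs[t]]  # a_{q(t-1) q(t)} * b_{qt}(ot)
        total += p
    return total

print(evaluate_brute_force([0, 1, 2]))
```

With N = 2 and T = 3 this enumerates only 8 sequences, but the cost grows as N^T with the sequence length.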

A. The Forward Procedure
Consider the forward variable α_t(i), defined as
α_t(i) = P(o_1 o_2 … o_t, q_t = i | λ),
that is, the probability of the partial observation sequence o_1 o_2 … o_t (until time t) and state i at time t, given the model λ. We can solve for α_t(i) inductively, as follows:
1. Initialization: α_1(i) = π_i b_i(o_1), 1 ≤ i ≤ N.
2. Induction: α_{t+1}(j) = [Σ_{i=1}^{N} α_t(i) a_ij] b_j(o_{t+1}), 1 ≤ t ≤ T-1, 1 ≤ j ≤ N.
3. Termination: P(O|λ) = Σ_{i=1}^{N} α_T(i).

B. The Backward Procedure
In a similar manner, we can consider a backward variable β_t(i), defined as
β_t(i) = P(o_{t+1} o_{t+2} … o_T | q_t = i, λ),
that is, the probability of the partial observation sequence from t+1 to the end, given state i at time t and the model λ. Again we can solve for β_t(i) inductively, as follows:
1. Initialization: β_T(i) = 1, 1 ≤ i ≤ N.
2. Induction: β_t(i) = Σ_{j=1}^{N} a_ij b_j(o_{t+1}) β_{t+1}(j), t = T-1, T-2, …, 1, 1 ≤ i ≤ N.

The initialization step arbitrarily defines β_T(i) to be 1 for all i. The induction step, illustrated in the next figure, shows that in order to have been in state i at time t and to account for the observation sequence from time t+1 on, you have to consider every possible state j at time t+1, accounting for the transition from i to j as well as the observation o_{t+1} in state j, and then account for the remaining partial observation sequence from state j. We will see later how the backward as well as the forward calculation is used to help solve fundamental Problems 2 and 3 of HMMs.

[Figure: the backward induction step. State s_i at time t connects to states s_1, s_2, s_3, …, s_N at time t+1 via transitions a_i1, a_i2, a_i3, …, a_iN.]

There are several possible ways of solving Problem 2, finding the "optimal" state sequence associated with the given observation sequence. For one such criterion, we can define the a posteriori probability variable
γ_t(i) = P(q_t = i | O, λ),
that is, the probability of being in state i at time t, given the observation sequence O and the model λ. We can express γ_t(i) in several forms, including
γ_t(i) = P(q_t = i, O | λ) / P(O|λ).
Since P(q_t = i, O | λ) is equal to α_t(i) β_t(i), we can write γ_t(i) as
γ_t(i) = α_t(i) β_t(i) / Σ_{j=1}^{N} α_t(j) β_t(j),
where we see that α_t(i) accounts for the partial observation sequence o_1 … o_t and state i at time t, while β_t(i) accounts for the remainder of the observation sequence, given state q_t = i. Using γ_t(i), we can solve for the individually most likely state q_t* at time t, as
q_t* = argmax_{1 ≤ i ≤ N} γ_t(i), 1 ≤ t ≤ T.
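The forward and backward recursions and the posterior γ_t(i) can be sketched as follows (model numbers are illustrative assumptions). A useful consistency check is that Σ_i α_t(i) β_t(i) equals P(O|λ) for every t:

```python
# Forward variable alpha, backward variable beta, and posterior gamma,
# following the recursions above. Model numbers are illustrative assumptions.
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]
pi = [0.6, 0.4]
N  = len(pi)

def forward(obs):
    T = len(obs)
    alpha = [[0.0] * N for _ in range(T)]
    for i in range(N):                        # 1. initialization
        alpha[0][i] = pi[i] * B[i][obs[0]]
    for t in range(1, T):                     # 2. induction
        for j in range(N):
            s = sum(alpha[t-1][i] * A[i][j] for i in range(N))
            alpha[t][j] = s * B[j][obs[t]]
    return alpha                              # 3. termination: sum(alpha[-1])

def backward(obs):
    T = len(obs)
    beta = [[1.0] * N for _ in range(T)]      # 1. beta_T(i) = 1
    for t in range(T - 2, -1, -1):            # 2. induction, t = T-1, ..., 1
        for i in range(N):
            beta[t][i] = sum(A[i][j] * B[j][obs[t+1]] * beta[t+1][j]
                             for j in range(N))
    return beta

obs = [0, 1, 2]
alpha, beta = forward(obs), backward(obs)
p_obs = sum(alpha[-1])                        # P(O | lambda)
# gamma_t(i) = alpha_t(i) * beta_t(i) / P(O | lambda)
gamma = [[alpha[t][i] * beta[t][i] / p_obs for i in range(N)]
         for t in range(len(obs))]
print(p_obs, [max(range(N), key=lambda i: g[i]) for g in gamma])
```

The printed list is the individually most likely state at each time, i.e., argmax_i γ_t(i).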

Viterbi Algorithm
To find the single best state sequence q = (q_1, q_2, …, q_T) for the given observation sequence O = (o_1, o_2, …, o_T), we need to define the quantity
δ_t(i) = max_{q_1, …, q_{t-1}} P(q_1 q_2 … q_{t-1}, q_t = i, o_1 o_2 … o_t | λ).
That is, δ_t(i) is the best score along a single path, at time t, which accounts for the first t observations and ends in state i. By induction we have
δ_{t+1}(j) = [max_i δ_t(i) a_ij] b_j(o_{t+1}).
The complete procedure for finding the best state sequence can now be stated as follows:
1. Initialization: δ_1(i) = π_i b_i(o_1), ψ_1(i) = 0, 1 ≤ i ≤ N.
2. Recursion: δ_t(j) = max_{1 ≤ i ≤ N} [δ_{t-1}(i) a_ij] b_j(o_t); ψ_t(j) = argmax_{1 ≤ i ≤ N} [δ_{t-1}(i) a_ij], 2 ≤ t ≤ T, 1 ≤ j ≤ N.
3. Termination: P* = max_{1 ≤ i ≤ N} δ_T(i); q_T* = argmax_{1 ≤ i ≤ N} δ_T(i).
4. Path (state sequence) backtracking: q_t* = ψ_{t+1}(q_{t+1}*), t = T-1, T-2, …, 1.
It should be noted that the Viterbi algorithm is similar in implementation to the forward calculation.
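A minimal sketch of the four Viterbi steps, with illustrative model numbers: delta holds the best-path scores and psi the backpointers. The sum over predecessors in the forward procedure is replaced by a max, and the maximizing predecessor is remembered for backtracking.

```python
# Viterbi algorithm sketch. Model numbers are illustrative assumptions.
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]
pi = [0.6, 0.4]

def viterbi(obs):
    N, T = len(pi), len(obs)
    delta = [[0.0] * N for _ in range(T)]
    psi   = [[0] * N for _ in range(T)]
    for i in range(N):                                 # 1. initialization
        delta[0][i] = pi[i] * B[i][obs[0]]
    for t in range(1, T):                              # 2. recursion
        for j in range(N):
            best_i = max(range(N), key=lambda i: delta[t-1][i] * A[i][j])
            psi[t][j] = best_i
            delta[t][j] = delta[t-1][best_i] * A[best_i][j] * B[j][obs[t]]
    p_star = max(delta[T-1])                           # 3. termination
    q = [max(range(N), key=lambda i: delta[T-1][i])]
    for t in range(T - 1, 0, -1):                      # 4. backtracking
        q.append(psi[t][q[-1]])
    q.reverse()
    return p_star, q

p_star, path = viterbi([0, 0, 2, 2])
print(p_star, path)  # best path switches from state 0 to state 1
```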

Alternative Viterbi Implementation
By taking logarithms of the model parameters, the Viterbi algorithm of the preceding section can be implemented without the need for any multiplications, thus:
0. Preprocessing: replace π_i, a_ij, and b_j(o_t) by their logarithms.
1. Initialization: δ_1(i) = log π_i + log b_i(o_1), ψ_1(i) = 0, 1 ≤ i ≤ N.
2. Recursion: δ_t(j) = max_{1 ≤ i ≤ N} [δ_{t-1}(i) + log a_ij] + log b_j(o_t); ψ_t(j) = argmax_{1 ≤ i ≤ N} [δ_{t-1}(i) + log a_ij], 2 ≤ t ≤ T, 1 ≤ j ≤ N.

3. Termination: log P* = max_{1 ≤ i ≤ N} δ_T(i); q_T* = argmax_{1 ≤ i ≤ N} δ_T(i).
4. Backtracking: q_t* = ψ_{t+1}(q_{t+1}*), t = T-1, T-2, …, 1.

Applications of HMMs

I. HMM application areas
- Time-series modeling
- Statistical acoustic models (speech recognition)
- Language models
- Communication systems
- Biological signal processing
- Handwritten character recognition
- Face recognition: feature extraction (Ferdinando Samaria et al. at Olivetti Research, Ltd.)
- Gesture recognition

1.1 HMM applications in biological signal processing
- Protein and nucleic acid sequence analysis (Washington University)
- The recognition of human genes in DNA (University of California)
- Detecting remote protein homologies (UCSC)
- Estimating amino acid distributions

1.2 HMM applications in gesture recognition
- Hand motion is an effective means of human communication in the real world

II. HMM training criteria
- ML: Maximum Likelihood
- MMI: Maximum Mutual Information
- MDI: Minimum Discrimination Information
- MMD: Maximum Model Distance
- CT: Corrective Training
- MCE: Minimum Classification Error

ML (Maximum Likelihood)
The standard ML design criterion is to use a training sequence of observations O to derive the set of model parameters λ, yielding
λ_ML = argmax_λ P(O|λ).
Any of the reestimation algorithms discussed previously provides a solution to this optimization problem.

MDI (Minimum Discrimination Information)
The minimum discrimination information is a measure of closeness between two probability measures under a given constraint R.

MMI (Maximum Mutual Information)
The mutual information between an observation sequence O^v and the word v, parameterized by the model set Λ, is
I_Λ(O^v, v) = log P(O^v|λ_v) − log Σ_w P(O^v|λ_w) P(w).
The MMI criterion is to find the entire model set Λ such that this mutual information is maximized.

III. Implementation issues for HMMs
1. Scaling
2. Multiple observation sequences
3. Initial estimates of HMM parameters
4. Effects of insufficient training data
5. Choice of model

3.1 Scaling
Initially, for t = 1, we set the scaled variable ᾱ_1(i) = α_1(i). For each t, we compute the unscaled update α̂_t(i) in terms of the previously scaled ᾱ_{t-1}(i); that is,
α̂_t(j) = [Σ_{i=1}^{N} ᾱ_{t-1}(i) a_ij] b_j(o_t).
We determine the scaling coefficient as
c_t = 1 / Σ_{i=1}^{N} α̂_t(i),
giving ᾱ_t(i) = c_t α̂_t(i).

Each scaled forward variable ᾱ_t(i) is effectively scaled by the product c_1 c_2 ⋯ c_t, and each scaled backward variable by c_t c_{t+1} ⋯ c_T. So, in terms of the scaled variables, the reestimation formulas are unchanged, because the scale factors cancel between numerator and denominator.

The only real change to the HMM procedure because of scaling is the procedure for computing P(O|λ). We cannot merely sum up the ᾱ_T(i) terms, because these are scaled already. However, we can use the property that
∏_{t=1}^{T} c_t · Σ_{i=1}^{N} α_T(i) = 1.
Thus we have
log P(O|λ) = − Σ_{t=1}^{T} log c_t.

3.2 Multiple observation sequences
The major problem with left-right models is that one cannot use a single observation sequence to train the model. This is because the transient nature of the states within the model allows only a small number of observations for any state. Hence, to have sufficient data to make reliable estimates of all model parameters, one has to use multiple observation sequences.

3.3 Initial estimates of HMM parameters
How do we choose initial estimates of the HMM parameters so that the local maximum is equal to, or as close as possible to, the global maximum of the likelihood function? Experience has shown that either random or uniform initial estimates of the π and A parameters are adequate for giving useful reestimates of these parameters in almost all cases. For the B parameters, however, experience has shown that good initial estimates are helpful in the discrete-symbol case and essential in the continuous-distribution case.

4. An HMM System for Isolated Word Recognition
1. Choice of model parameters
2. Segmental k-means segmentation with clustering
3. Incorporation of state duration into the HMM
4. HMM isolated-digit performance

4.1 HMM recognizer of isolated words
To do isolated-word speech recognition, we must perform the following:
1. For each word v in the vocabulary, we must build an HMM λ_v; that is, we must estimate the model parameters (A, B, π) that optimize the likelihood of the training-set observation vectors for the vth word.
2. For each unknown word to be recognized, the processing shown in Figure 4.1 must be carried out: measurement of the observation sequence O, via a feature analysis of the speech corresponding to the word; followed by calculation of the model likelihoods P(O|λ_v) for all possible models, 1 ≤ v ≤ V; followed by selection of the word whose model likelihood is highest, that is,
v* = argmax_{1 ≤ v ≤ V} P(O|λ_v).

[Figure: block diagram of an isolated-word HMM recognizer.]

4.2 Choice of model parameters
The figure shows a plot of average word error rate versus N, for the case of recognition of isolated digits. It can be seen that the error is somewhat insensitive to N, achieving a local minimum at N = 6; however, differences in error rate for values of N close to 6 are small. [Figure: average word error rate (for a digits vocabulary) versus the number of states N in the HMM (after Rabiner et al. [18]).]

The next figure shows a comparison of marginal distributions against a histogram of the actual observations within a state. The observation vectors are ninth order, and the model density uses M = 5 mixtures. The covariance matrices are constrained to be diagonal for each individual mixture. The results of the figure are for the first model state of the word "zero."
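The scaling recursion of Section 3.1 and the isolated-word decision rule of Section 4.1 can be sketched together: score the observation sequence against each word model in the log domain and pick the argmax. The two toy word models below are illustrative assumptions, not trained models.

```python
import math

def scaled_forward_loglik(obs, pi, A, B):
    """log P(O | lambda) via the scaling recursion: log P = -sum_t log c_t."""
    N = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(N)]
    log_p = 0.0
    for t in range(len(obs)):
        if t > 0:  # induction on the previously scaled alphas
            alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][obs[t]]
                     for j in range(N)]
        c = 1.0 / sum(alpha)            # scaling coefficient c_t
        alpha = [c * a for a in alpha]  # scaled alphas now sum to 1
        log_p -= math.log(c)
    return log_p

# Two toy word models ("yes" favors symbol 0, "no" favors symbol 2);
# each entry is (pi, A, B). Purely illustrative, not trained parameters.
models = {
    "yes": ([0.6, 0.4], [[0.7, 0.3], [0.3, 0.7]],
            [[0.6, 0.3, 0.1], [0.5, 0.4, 0.1]]),
    "no":  ([0.6, 0.4], [[0.7, 0.3], [0.3, 0.7]],
            [[0.1, 0.3, 0.6], [0.1, 0.4, 0.5]]),
}

def recognize(obs):
    """v* = argmax_v P(O | lambda_v), computed in the log domain."""
    return max(models, key=lambda v: scaled_forward_loglik(obs, *models[v]))

print(recognize([0, 0, 1, 0]))  # → yes
print(recognize([2, 2, 1, 2]))  # → no
```

Working with −Σ_t log c_t keeps the scores in a numerically safe range even for long observation sequences, which is why practical recognizers compare log-likelihoods rather than raw probabilities.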

The figure shows a curve of average word error rate versus the floor parameter ε (on a log scale) for a standard word-recognition experiment. It can be seen that over a very broad range of ε the average error rate remains at about a constant value; however, when ε is set to 0, the error rate increases sharply. Similarly, for continuous densities it is important to constrain the mixture gains as well as the diagonal covariance coefficients to be greater than or equal to some minimum values.

4.3 The segmental k-means training procedure
The figure (next page) shows a log-energy plot, an accumulated log-likelihood plot, and a state segmentation for one occurrence of the word "six." The states correspond roughly to the sounds in the spoken word "six." Segmenting each of the training sequences yields, for each of the N states, the set of observations that occur within state j according to the current model. The model reestimation procedure is then used to reestimate all model parameters, and the resulting model is compared to the previous model (by computing a distance score that reflects the statistical similarity of the HMMs).

The segmental k-means training procedure is used to estimate parameter values for the optimal continuous mixture-density fit to a finite number of observation sequences. [Figure.]

4.4 Incorporation of state duration into the HMM
A typical set of histograms of state duration for a five-state model of the word "six" is shown in the figure. The first two states account for the initial /s/ in "six"; the third state accounts for the transition to the vowel /i/; the fourth state accounts for the vowel; and the fifth state accounts for the stop and the final /s/ sound.
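The segmentation step used by the segmental k-means procedure and by the duration histograms above can be sketched as follows: align an utterance to states with a best-path (Viterbi-style) pass, then run-length encode the alignment to get per-state durations. This is only a sketch of the alignment step, with an illustrative two-state left-right model; the clustering and reestimation steps are not shown.

```python
# Illustrative two-state left-right model: no transition back to state 0,
# and the path must start in state 0. All numbers are assumptions.
A  = [[0.6, 0.4], [0.0, 1.0]]
B  = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]
pi = [1.0, 0.0]

def align(obs):
    """Return the best state sequence via a Viterbi pass with backtracking."""
    N, T = len(pi), len(obs)
    delta = [[pi[i] * B[i][obs[0]] for i in range(N)]]
    psi = [[0] * N]
    for t in range(1, T):
        row, back = [], []
        for j in range(N):
            i = max(range(N), key=lambda k: delta[-1][k] * A[k][j])
            back.append(i)
            row.append(delta[-1][i] * A[i][j] * B[j][obs[t]])
        delta.append(row)
        psi.append(back)
    q = [max(range(N), key=lambda i: delta[-1][i])]
    for t in range(T - 1, 0, -1):
        q.append(psi[t][q[-1]])
    return q[::-1]

def durations(path):
    """Run-length encode the alignment as [state, number of frames] pairs."""
    runs = []
    for s in path:
        if runs and runs[-1][0] == s:
            runs[-1][1] += 1
        else:
            runs.append([s, 1])
    return runs

path = align([0, 0, 1, 2, 2, 2])
print(path, durations(path))
```

Collecting these run lengths across a training set is one way the state-duration histograms described above can be built.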
