Hidden Markov Model (HMM)
Outline
- The problems with the template method
- HMM as a popular statistical tool
- Discrete-time Markov processes
- Theory of HMM: the three basic problems

Review: the template method
Key idea:
- Derive typical sequences of speech frames for a pattern via some averaging procedure
- Rely on local spectral distance measures to compare patterns
- Use dynamic programming to temporally align patterns

Problems of the template method
- Speech is a random signal, but the template method is not a statistical method in the strict sense
- Statistical techniques have been widely used in clustering to create reference patterns
- The statistical signal characterization inherent in the template representation is only implicit and often inadequate: it neglects second-order statistics
- The approach lacks robustness

HMM: a popular tool
- The basic theory of the HMM was published in a series of classic papers by Baum and his colleagues in the late 1960s and early 1970s
- The HMM was implemented for speech-processing applications by Baker at CMU, and by Jelinek and his colleagues at IBM, in the 1970s
- The HMM provides a natural and highly reliable way of recognizing speech for a wide range of applications

The underlying assumptions of the HMM
- The speech signal can be well characterized as a parametric random process
- The parameters of the stochastic process can be determined in a precise, well-defined manner

Discrete-time Markov process
- A system with N discrete states indexed by {1, 2, …, N}; q_t denotes the state at time t
- Assuming a time-invariant system, the transition probabilities a_ij = P(q_t = j | q_{t-1} = i) do not depend on t

Observable Markov model
- Each state corresponds to an observable event
- Example: weather, observed once a day — state 1: rain or snow; state 2: cloudy; state 3: sunny
- For what cases could such a model be used?

Extensions to Hidden Markov Models
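As a minimal sketch of the observable weather model above, a three-state Markov chain can be simulated and the probability of an observed state sequence computed. The transition matrix values here are illustrative assumptions, not taken from the slides:

```python
import random

# States of the observable weather Markov model.
STATES = ["rain/snow", "cloudy", "sunny"]

# Hypothetical transition matrix A, with a_ij = P(next = j | current = i).
A = [
    [0.4, 0.3, 0.3],
    [0.2, 0.6, 0.2],
    [0.1, 0.1, 0.8],
]

def simulate(start: int, days: int, rng: random.Random) -> list[int]:
    """Sample a state sequence of the given length from the chain."""
    seq = [start]
    for _ in range(days - 1):
        seq.append(rng.choices(range(3), weights=A[seq[-1]])[0])
    return seq

def sequence_prob(seq: list[int]) -> float:
    """P(q2..qT | q1): product of transition probabilities along the path."""
    p = 1.0
    for i, j in zip(seq, seq[1:]):
        p *= A[i][j]
    return p

rng = random.Random(0)
path = simulate(start=2, days=5, rng=rng)
print([STATES[s] for s in path], sequence_prob(path))
```

Because every event is an observable state, the sequence probability is just a product of transition probabilities — the contrast with the hidden-state case below is the point of the extension.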
The urn-and-ball model
- N glass urns, each containing balls of M distinct colors
- An urn is randomly selected first, and then a ball is chosen at random from it; the ball's color is recorded as the observation
- The ball is then replaced in the urn from which it was selected, and the procedure is repeated

HMM for weather forecasting (discussion)
- What operations would you design to carry out the ball selection?
- How would you extend the Markov process to an HMM to give a more precise weather forecast?

Theory of HMM
- Topology
- Elements
- The doubly stochastic process
- The three basic problems

HMM topology
- Ergodic (fully connected)
- Left-right
- Parallel-path left-right

Elements of an HMM
- N: the number of states in the model
- M: the number of distinct observation symbols per state
- A = {a_ij}: the state-transition probability distribution, a_ij = P(q_{t+1} = j | q_t = i)
- B = {b_j(k)}: the observation symbol probability distribution in state j, b_j(k) = P(o_t = v_k | q_t = j)
- π = {π_i}: the initial state distribution, π_i = P(q_1 = i)

We use the compact notation λ = (A, B, π) to indicate the complete parameter set of the model. This parameter set defines a probability measure for O, P(O|λ), which we discuss later; we use the terminology "HMM" to indicate the parameter set and the associated probability measure interchangeably, without ambiguity.

The doubly stochastic process
- A hidden process: the states
- An observable process: the observations

The three basic problems
- Evaluation: the forward procedure
- Optimal path: the Viterbi algorithm
- Training: the Baum-Welch algorithm

Problem 1 (evaluation): given the observation sequence O = (o1, o2, …, oT) and a model λ = (A, B, π), how do we efficiently compute P(O|λ), the probability of the observation sequence given the model? We can also view this as scoring how well a given model matches a given observation sequence; solving it allows us to choose the model that best matches the observations.

Problem 2 (decoding): given the observation sequence O = (o1, o2, …, oT) and the model λ, how do we choose a corresponding state sequence q = (q1, q2, …, qT)? This problem attempts to uncover the hidden state sequence; since there is generally no single "correct" sequence, we use an optimality criterion to solve it as well as possible.

Problem 3 (training): how do we adjust the model parameters λ = (A, B, π) to maximize P(O|λ)?
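The urn-and-ball generation procedure above can be sketched directly: pick an urn (state), draw a ball (observation), replace it, move to the next urn. The parameter values here are hypothetical, chosen only to make the sketch runnable:

```python
import random

# Hypothetical HMM lambda = (A, B, PI) with N = 2 urns (states) and
# M = 3 ball colors (observation symbols); values are illustrative.
A = [[0.7, 0.3],
     [0.4, 0.6]]          # a_ij = P(q_{t+1} = j | q_t = i)
B = [[0.5, 0.4, 0.1],
     [0.1, 0.3, 0.6]]     # b_j(k) = P(o_t = k | q_t = j)
PI = [0.6, 0.4]           # pi_i = P(q_1 = i)

def generate(T: int, rng: random.Random) -> tuple[list[int], list[int]]:
    """Urn-and-ball generation: pick an urn, draw a ball, replace, repeat."""
    states, obs = [], []
    q = rng.choices(range(2), weights=PI)[0]
    for _ in range(T):
        states.append(q)
        obs.append(rng.choices(range(3), weights=B[q])[0])  # record the color
        q = rng.choices(range(2), weights=A[q])[0]          # move to next urn
    return states, obs

states, obs = generate(T=10, rng=random.Random(1))
print("hidden states:", states)
print("observations: ", obs)
```

Only the color sequence is observed; the urn sequence stays hidden — exactly the doubly stochastic structure the three basic problems are posed over.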
In this problem we attempt to optimize the model parameters to best describe how a given observation sequence comes about. The observation sequence used to adjust the model parameters is called a training sequence, because it is used to train the HMM.

Probability evaluation
We wish to calculate P(O|λ), the probability of the observation sequence. Consider one fixed state sequence q = (q1, q2, …, qT), where q1 is the initial state. The probability of the observation sequence O given the state sequence q is

    P(O|q, λ) = b_q1(o1) · b_q2(o2) · … · b_qT(oT),

where we have assumed statistical independence of the observations. The probability of such a state sequence q can be written as

    P(q|λ) = π_q1 · a_q1q2 · a_q2q3 · … · a_q(T-1)qT.

The joint probability that O and q occur simultaneously is simply the product of the above two terms:

    P(O, q|λ) = P(O|q, λ) · P(q|λ).

The probability of O is then obtained by summing this joint probability over all possible state sequences q:

    P(O|λ) = Σ_q π_q1 b_q1(o1) a_q1q2 b_q2(o2) … a_q(T-1)qT b_qT(oT).

Computed directly, this sum runs over N^T state sequences, which quickly becomes infeasible and motivates the forward procedure.

A. The forward procedure
Consider the forward variable
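For a tiny model, the direct summation over all N^T state sequences can be written down literally; a sketch, reusing hypothetical parameters rather than values from the slides:

```python
from itertools import product

# Tiny illustrative HMM (N = 2 states, M = 2 symbols); values are assumptions.
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.9, 0.1], [0.2, 0.8]]
PI = [0.6, 0.4]

def evaluate_direct(obs: list[int]) -> float:
    """P(O|lambda) by summing P(O, q|lambda) over all N^T state sequences."""
    T, N = len(obs), len(PI)
    total = 0.0
    for q in product(range(N), repeat=T):          # every state sequence
        p = PI[q[0]] * B[q[0]][obs[0]]             # pi_q1 * b_q1(o1)
        for t in range(1, T):
            p *= A[q[t - 1]][q[t]] * B[q[t]][obs[t]]
        total += p
    return total

print(evaluate_direct([0, 1, 0]))
```

The exponential loop is only tolerable for toy T; the forward recursion below computes the same quantity in O(N²T).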
defined as

    α_t(i) = P(o1 o2 … ot, q_t = i | λ),

that is, the probability of the partial observation sequence o1 o2 … ot (until time t) and state i at time t, given the model λ. We can solve for α_t(i) inductively, as follows:

1. Initialization:  α_1(i) = π_i b_i(o1),  1 ≤ i ≤ N
2. Induction:       α_{t+1}(j) = [ Σ_{i=1..N} α_t(i) a_ij ] b_j(o_{t+1}),  1 ≤ t ≤ T-1, 1 ≤ j ≤ N
3. Termination:     P(O|λ) = Σ_{i=1..N} α_T(i)

B. The backward procedure
In a similar manner, we can consider a backward variable β_t(i), defined as

    β_t(i) = P(o_{t+1} o_{t+2} … oT | q_t = i, λ),

that is, the probability of the partial observation sequence from t+1 to the end, given state i at time t and the model λ. Again we can solve for β_t(i) inductively, as follows:

1. Initialization:  β_T(i) = 1,  1 ≤ i ≤ N
2. Induction:       β_t(i) = Σ_{j=1..N} a_ij b_j(o_{t+1}) β_{t+1}(j),  t = T-1, …, 1, 1 ≤ i ≤ N

The initialization step arbitrarily defines β_T(i) to be 1 for all i. Step 2, illustrated in the figure below, shows that in order to have been in state i at time t, and to account for the observation sequence from time t+1 on, you have to consider every possible state j at time t+1, accounting for the transition from i to j as well as the observation o_{t+1} in state j, and then account for the remaining partial observation sequence from state j. We will see later how the backward as well as the forward calculation is used to help solve fundamental problems 2 and 3 of HMMs.

[Figure: state s_i at time t fans out to states s_1, s_2, s_3, …, s_N at time t+1 via transitions a_i1, a_i2, a_i3, …, a_iN]

There are several possible ways of solving problem 2, finding the "optimal" state sequence associated with the given observation sequence. One approach is to define the a posteriori probability variable

    γ_t(i) = P(q_t = i | O, λ),

that is, the probability of being in state i at time t, given the observation sequence O and the model λ. We can express γ_t(i) in several forms; since P(O, q_t = i | λ) is equal to α_t(i) β_t(i), we can write

    γ_t(i) = α_t(i) β_t(i) / P(O|λ) = α_t(i) β_t(i) / Σ_{j=1..N} α_t(j) β_t(j),

where we see that α_t(i) accounts for the partial observation sequence o1 … ot and state i at time t, while β_t(i) accounts for the remainder of the observation sequence, given state q_t = i. Using γ_t(i), we can solve for the individually most likely state at time t as

    q_t* = argmax_{1 ≤ i ≤ N} γ_t(i),  1 ≤ t ≤ T.
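The two recursions above can be sketched side by side; as a consistency check, the forward termination Σ_i α_T(i) and the backward expression Σ_i π_i b_i(o1) β_1(i) should give the same P(O|λ). The model parameters are the same illustrative assumptions as before:

```python
# A sketch of the forward and backward recursions for a discrete HMM;
# the model parameters here are illustrative assumptions.
A  = [[0.7, 0.3], [0.4, 0.6]]     # transition probabilities a_ij
B  = [[0.9, 0.1], [0.2, 0.8]]     # emission probabilities b_j(k)
PI = [0.6, 0.4]                   # initial state distribution

def forward(obs):
    """Return the alpha trellis; P(O|lambda) = sum of the last row."""
    N = len(PI)
    alpha = [[PI[i] * B[i][obs[0]] for i in range(N)]]      # initialization
    for o in obs[1:]:                                        # induction
        prev = alpha[-1]
        alpha.append([sum(prev[i] * A[i][j] for i in range(N)) * B[j][o]
                      for j in range(N)])
    return alpha

def backward(obs):
    """Return the beta trellis; beta_T(i) = 1 by definition."""
    N, T = len(PI), len(obs)
    beta = [[1.0] * N]
    for t in range(T - 2, -1, -1):
        nxt = beta[0]                                        # beta_{t+1}
        beta.insert(0, [sum(A[i][j] * B[j][obs[t + 1]] * nxt[j]
                            for j in range(N)) for i in range(N)])
    return beta

obs = [0, 1, 1, 0]
alpha, beta = forward(obs), backward(obs)
p_fwd = sum(alpha[-1])
p_bwd = sum(PI[i] * B[i][obs[0]] * beta[0][i] for i in range(2))
print(p_fwd, p_bwd)
```

Both directions cost O(N²T), versus the O(N^T) direct sum.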
A. The Viterbi algorithm
To find the single best state sequence q = (q1, q2, …, qT) for the given observation sequence O = (o1, o2, …, oT), we define the quantity

    δ_t(i) = max_{q1,…,q_{t-1}} P(q1 q2 … q_{t-1}, q_t = i, o1 o2 … ot | λ).

That is, δ_t(i) is the best score along a single path, at time t, which accounts for the first t observations and ends in state i. By induction we have

    δ_{t+1}(j) = [ max_i δ_t(i) a_ij ] · b_j(o_{t+1}).

To retrieve the state sequence, we keep track of the maximizing argument for each t and j in an array ψ_t(j). The complete procedure for finding the best state sequence can now be stated as follows:

1. Initialization:  δ_1(i) = π_i b_i(o1);  ψ_1(i) = 0,  1 ≤ i ≤ N
2. Recursion:       δ_t(j) = max_{1≤i≤N} [δ_{t-1}(i) a_ij] · b_j(o_t);  ψ_t(j) = argmax_{1≤i≤N} [δ_{t-1}(i) a_ij],  2 ≤ t ≤ T
3. Termination:     P* = max_{1≤i≤N} δ_T(i);  q_T* = argmax_{1≤i≤N} δ_T(i)
4. Path (state sequence) backtracking:  q_t* = ψ_{t+1}(q_{t+1}*),  t = T-1, T-2, …, 1

It should be noted that the Viterbi algorithm is similar in implementation to the forward calculation, with the sum over states replaced by a maximization.

B. Alternative Viterbi implementation
By taking logarithms of the model parameters, the Viterbi algorithm of the preceding section can be implemented without the need for any multiplications:

0. Preprocessing:   π~_i = log π_i;  a~_ij = log a_ij;  b~_j(o_t) = log b_j(o_t)
1. Initialization:  δ~_1(i) = π~_i + b~_i(o1)
2. Recursion:       δ~_t(j) = max_{1≤i≤N} [δ~_{t-1}(i) + a~_ij] + b~_j(o_t)
3. Termination:     P~* = max_{1≤i≤N} δ~_T(i)
4. Backtracking:    as before, via the stored argmax array ψ
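A sketch of the log-domain (alternative) Viterbi implementation, again under the same hypothetical model parameters — additions replace multiplications throughout:

```python
import math

# Log-domain Viterbi sketch for a discrete HMM; parameters are illustrative.
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.9, 0.1], [0.2, 0.8]]
PI = [0.6, 0.4]

def viterbi(obs):
    """Return (best log-score, best state path) using additions only."""
    N = len(PI)
    la = [[math.log(x) for x in row] for row in A]   # step 0: preprocessing
    lb = [[math.log(x) for x in row] for row in B]
    delta = [math.log(PI[i]) + lb[i][obs[0]] for i in range(N)]
    psi = []                                          # backpointer arrays
    for o in obs[1:]:                                 # step 2: recursion
        step, back = [], []
        for j in range(N):
            best_i = max(range(N), key=lambda i: delta[i] + la[i][j])
            step.append(delta[best_i] + la[best_i][j] + lb[j][o])
            back.append(best_i)
        delta, psi = step, psi + [back]
    # steps 3-4: termination and path backtracking
    q = [max(range(N), key=lambda i: delta[i])]
    for back in reversed(psi):
        q.insert(0, back[q[0]])
    return max(delta), q

score, path = viterbi([0, 0, 1, 1])
print(score, path)
```

The returned score is the log joint probability of the best single path, so re-scoring the returned path directly must reproduce it.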
1. HMM application areas
- Time-series modeling
- Acoustic statistical models (speech recognition)
- Language models
- Communication systems
- Biological signal processing
- Handwritten character recognition
- Face recognition — feature extraction (Ferdinando Samaria et al. at Olivetti Research, Ltd.)
- Gesture recognition

1.1 HMM applications in biological signal processing
- Protein and nucleic acid sequence analysis (Washington University)
- Recognition of human genes in DNA (University of California)
- Detecting remote protein homologies (UCSC)
- Estimating amino acid distributions

1.2 HMM applications in gesture recognition
- Hand motion is an effective means of human communication in the real world

2. HMM training criteria
- ML — Maximum Likelihood
- MMI — Maximum Mutual Information
- MDI — Minimum Discrimination Information
- MMD — Maximum Model Distance
- CT — Corrective Training
- MCE — Minimum Classification Error

ML — Maximum Likelihood
The standard ML design criterion is to use a training sequence of observations O to derive the set of model parameters λ, yielding

    λ* = argmax_λ P(O|λ).

Any of the reestimation algorithms discussed previously provides a solution to this optimization problem.

MDI — Minimum Discrimination Information
The minimum discrimination information is a measure of closeness (based on relative entropy) between two probability measures under a given constraint R.

MMI — Maximum Mutual Information
The mutual information between an observation sequence O and the word v, parameterized by λ, measures how much observing O tells us about v. The MMI criterion is to find the entire model set such that this mutual information is maximized, rather than maximizing each word's likelihood in isolation.

3. HMM implementation issues
1. Scaling
2. Multiple observation sequences
3. Initial estimates of HMM parameters
4. Effects of insufficient training data
5. Choice of model
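The slides cite the Baum-Welch algorithm by name as the ML reestimation procedure but do not give its formulas; the sketch below follows the standard textbook EM updates (forward-backward, then γ/ξ accumulation), with illustrative starting parameters:

```python
# A standard Baum-Welch (EM) reestimation sketch for a discrete HMM.
def baum_welch(obs, A, B, PI, iters=10):
    N, M, T = len(PI), len(B[0]), len(obs)
    for _ in range(iters):
        # Forward pass.
        alpha = [[PI[i] * B[i][obs[0]] for i in range(N)]]
        for o in obs[1:]:
            alpha.append([sum(alpha[-1][i] * A[i][j] for i in range(N)) * B[j][o]
                          for j in range(N)])
        # Backward pass.
        beta = [[1.0] * N]
        for t in range(T - 2, -1, -1):
            beta.insert(0, [sum(A[i][j] * B[j][obs[t + 1]] * beta[0][j]
                                for j in range(N)) for i in range(N)])
        pO = sum(alpha[-1])
        # State and transition posteriors gamma_t(i), xi_t(i, j).
        gamma = [[alpha[t][i] * beta[t][i] / pO for i in range(N)]
                 for t in range(T)]
        xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] / pO
                for j in range(N)] for i in range(N)] for t in range(T - 1)]
        # Reestimation of pi, A, and B from expected counts.
        PI = gamma[0][:]
        A = [[sum(xi[t][i][j] for t in range(T - 1)) /
              sum(gamma[t][i] for t in range(T - 1))
              for j in range(N)] for i in range(N)]
        B = [[sum(gamma[t][j] for t in range(T) if obs[t] == k) /
              sum(gamma[t][j] for t in range(T))
              for k in range(M)] for j in range(N)]
    return A, B, PI

A, B, PI = baum_welch([0, 0, 1, 1, 0, 1],
                      A=[[0.6, 0.4], [0.5, 0.5]],
                      B=[[0.7, 0.3], [0.4, 0.6]],
                      PI=[0.5, 0.5])
print(A, B, PI)
```

Each iteration is guaranteed not to decrease P(O|λ), converging to a local maximum — which is exactly why the initial-estimates issue in the list above matters.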
3.1 Scaling
The forward variables α_t(i) head exponentially to zero as t grows, so they are rescaled at every frame. Initially, for t = 1, we set α^_1(i) = c_1 α_1(i) with scaling coefficient

    c_1 = 1 / Σ_{i=1..N} α_1(i).

For each t, we first compute the induction in terms of the previously scaled variables,

    α¯_t(j) = [ Σ_{i=1..N} α^_{t-1}(i) a_ij ] b_j(o_t),

determine the scaling coefficient as

    c_t = 1 / Σ_{j=1..N} α¯_t(j),

and set α^_t(j) = c_t α¯_t(j), so the scaled variables sum to 1 at every t. The backward variables are scaled with the same coefficients c_t, so the scale factors cancel in the reestimation formulas.

The only real change to the HMM procedure because of scaling is the procedure for computing P(O|λ). We cannot merely sum up the α^_T(i) terms, because these are scaled already. However, we can use the property that

    ∏_{t=1..T} c_t · Σ_{i=1..N} α_T(i) = 1.

Thus we have

    log P(O|λ) = - Σ_{t=1..T} log c_t.

P(O|λ) itself would underflow for long sequences, so only its logarithm is kept.

3.2 Multiple observation sequences
The major problem with left-right models is that one cannot use a single observation sequence to train the model. This is because the transient nature of the states within the model allows only a small number of observations for any state. Hence, to have sufficient data to make reliable estimates of all model parameters, one has to use multiple observation sequences.

3.3 Initial estimates of HMM parameters
How do we choose initial estimates of the HMM parameters so that the local maximum found by reestimation is equal to, or as close as possible to, the global maximum of the likelihood function? Experience has shown that either random or uniform initial estimates of the π and A parameters are adequate for giving useful reestimates of these parameters in almost all cases. For the B parameters, however, experience has shown that good initial estimates are helpful in the discrete-symbol case and essential in the continuous-distribution case.

4. An HMM system for isolated word recognition
1. Choice of model parameters
2. Segmental k-means segmentation with clustering
3. Incorporation of state duration into the HMM
4. HMM isolated-digit performance

4.1 HMM recognizer of isolated words
To do isolated word speech recognition, we must perform the following:
1. For each word v in the vocabulary, build an HMM λ_v — that is, estimate the model parameters (A, B, π) that optimize the likelihood of the training-set observation vectors for the vth word.
2. For each unknown word to be recognized, measure the observation sequence O via a feature analysis of the speech corresponding to the word; calculate the model likelihoods P(O|λ_v) for all possible models; and select the word whose model likelihood is highest — that is, v* = argmax_{1≤v≤V} P(O|λ_v).

[Figure: block diagram of an isolated word HMM recognizer]

4.2 Choice of model parameters
A plot of average word error rate versus the number of states N in the HMM, for recognition of isolated digits, shows that the error is somewhat insensitive to N, achieving a local minimum at N = 6; however, differences in error rate for values of N close to 6 are small (after Rabiner et al.).

A comparison of the model's marginal distributions against a histogram of the actual observations within a state: the observation vectors are ninth order, and the model density uses M = 5 mixtures; the covariance matrices are constrained to be diagonal for each individual mixture. The results shown are for the first model state of the word "zero."
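The scaled forward pass of section 3.1 can be sketched as follows, accumulating log P(O|λ) = -Σ_t log c_t so that even a long sequence, whose unscaled probability would underflow, yields a finite score. Parameters remain illustrative:

```python
import math

# Sketch of the scaled forward pass from section 3.1; parameters illustrative.
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.9, 0.1], [0.2, 0.8]]
PI = [0.6, 0.4]

def scaled_forward_logprob(obs):
    """log P(O|lambda) = -sum_t log c_t, with alpha rescaled to sum to 1."""
    N = len(PI)
    alpha = [PI[i] * B[i][obs[0]] for i in range(N)]
    log_p = 0.0
    for t, o in enumerate(obs):
        if t > 0:                                 # induction from scaled alphas
            alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][o]
                     for j in range(N)]
        c = 1.0 / sum(alpha)                      # scaling coefficient c_t
        alpha = [c * a for a in alpha]            # scaled alphas sum to 1
        log_p -= math.log(c)                      # accumulate -log c_t
    return log_p

# A 300-frame sequence would underflow the unscaled forward pass, but the
# scaled version returns a finite log-probability.
obs = [0, 1] * 150
print(scaled_forward_logprob(obs))
```

For short sequences the result matches the log of the unscaled forward probability exactly, which is a convenient unit check.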
A curve of average word error rate versus the floor parameter ε (on a log scale), for a standard word-recognition experiment, shows that over a very broad range the average error rate remains at about a constant value; however, when ε is set to 0 (i.e., no floor on the observation probabilities), the error rate increases sharply. Similarly, for continuous densities it is important to constrain the mixture gains as well as the diagonal covariance coefficients to be greater than or equal to some minimum values.

4.3 The segmental k-means training procedure
A log-energy plot, an accumulated log-likelihood plot, and a state segmentation for one occurrence of the word "six" show that the states correspond roughly to the sounds in the spoken word. Segmenting each of the training sequences assigns the observations to each of the N states according to the current model; the model reestimation procedure then reestimates all model parameters from these segments. The resulting model is compared to the previous model by computing a distance score that reflects the statistical similarity of the two HMMs, and the procedure is iterated. The segmental k-means training procedure is used to estimate parameter values for the optimal continuous-mixture-density fit to a finite number of observation sequences.

4.4 Incorporation of state duration into the HMM
A typical set of state-duration histograms for a five-state model of the word "six" shows that the first two states account for the initial /s/; the third state accounts for the transition to the vowel /i/; the fourth state accounts for the vowel; and the fifth state accounts for the stop and the final /s/ sound. At recognition time, the normal Viterbi algorithm is first used to give the best segmentation of the observation sequence of the unknown word into states via the backtracking procedure. The duration of each state is then measured from the state segmentation, and a postprocessor increments the log-likelihood score of the Viterbi algorithm by a state-duration term.

4.5 HMM isolated-digit performance
The results of the recognition tests are given in the table below. The recognizers are the following:
- LPC/DTW: conventional template-based recognizer using dynamic time warping (DTW) alignment
- LPC/DTW/VQ: conventional recognizer with vector quantization of the feature vectors (M = 64)
- HMM/VQ: HMM recognizer with an M = 64 codebook
- HMM/CD: HMM recognizer using a continuous-density model with M = 5 mixtures per state
- HMM/AR: HMM recognizer using an autoregressive observation density

Average digit error rates (%) for several recognizers and evaluation sets:

    Recognizer Type | Original Training | TS2 | TS3 | TS4
    LPC/DTW         | 0.1               | 0.2 | 2.0 | 1.1
    LPC/DTW/VQ      | -                 | 3.5 | -   | -
    HMM/VQ          | -                 | 3.7 | -   | -
    HMM/CD          | 0                 | 0.2 | 1.3 | 1.8
    HMM/AR          | 0.3               | 1.8 | 3.4 | 4.1

5. Shortcomings of the HMM
- The state-conditional independence assumption on observations
- Frame-based feature extraction
- A crude state-duration model: implicitly exponential (geometric)
- Imperfect training methods: non-discriminative, and not globally optimal

Industrial research and applications of the HMM
- The HMM was implemented for speech-processing applications at IBM in the 1970s and is the key technique of the IBM ViaVoice system; at the end of 1999, IBM released ViaVoice telephone speech-recognition technology
- Intel announced an investment of US$30 million in the speech-recognition software company Lernout & Hauspie, to promote speech-recognition application software for servers and desktops

Directions for HMM research
- Combining neural networks and HMMs for speech recognition
- Extensions of the HMM
- New training algorithms for HMMs
- New application fields for HMMs
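The isolated-word decision rule of section 4.1 can be sketched end to end: score the unknown observation sequence against each word's HMM with a scaled forward pass and pick the argmax. The two per-word models and the vocabulary here are hypothetical toys, not trained models:

```python
import math

# Hypothetical per-word models lambda_v = (A, B, PI); values are illustrative.
MODELS = {
    "yes": ([[0.8, 0.2], [0.3, 0.7]], [[0.9, 0.1], [0.2, 0.8]], [0.7, 0.3]),
    "no":  ([[0.5, 0.5], [0.5, 0.5]], [[0.3, 0.7], [0.6, 0.4]], [0.4, 0.6]),
}

def log_likelihood(obs, model):
    """Scaled forward pass: returns log P(O | lambda_v)."""
    A, B, PI = model
    N = len(PI)
    alpha = [PI[i] * B[i][obs[0]] for i in range(N)]
    log_p = 0.0
    for t, o in enumerate(obs):
        if t > 0:
            alpha = [sum(alpha[i] * A[i][j] for i in range(N)) * B[j][o]
                     for j in range(N)]
        c = sum(alpha)
        alpha = [a / c for a in alpha]
        log_p += math.log(c)
    return log_p

def recognize(obs):
    """v* = argmax_v log P(O | lambda_v)."""
    return max(MODELS, key=lambda v: log_likelihood(obs, MODELS[v]))

print(recognize([0, 0, 1, 0]))
```

With the toy parameters above, sequences dominated by symbol 0 score higher under the "yes" model and sequences dominated by symbol 1 under the "no" model, which is the whole decision rule in miniature.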