Unsupervised Learning: Clustering
Jiafeng Guo

Outline
- Introduction
- Applications of Clustering
- Distance Functions
- Evaluation Metrics
- Clustering Algorithms: K-means; Gaussian Mixture Models and the EM Algorithm; K-medoids; Hierarchical Clustering; Density-based Clustering

Supervised vs. Unsupervised Learning
Why do Unsupervised Learning?
- Raw data is cheap; labeled data is expensive.
- Save memory/computation.
- Reduce noise in high-dimensional data.
- Useful in exploratory data analysis.
- Often a pre-processing step for supervised learning.

Cluster Analysis
- Discover groups such that samples within a group are more similar to each other than samples across groups.

Unobserved Variables
A variable can be unobserved (latent) because:
- It is an imaginary quantity meant to provide some simplified and abstractive view of the data-generation process. E.g., speech recognition models, mixture models.
- It is a real-world object and/or phenomenon that is difficult or impossible to measure. E.g., the temperature of a star, the causes of a disease, evolutionary ancestors.
- It is a real-world object and/or phenomenon that sometimes was not measured because of faulty sensors, or was measured over a noisy channel, etc. E.g., traffic radio, an aircraft signal on a radar screen.
Discrete latent variables can be used to partition/cluster data into sub-groups. Continuous latent variables can be used for dimensionality reduction.

Applications of Clustering
- Image segmentation (/pff/segment)
- Clustering human populations (Eran Elhaik et al., Nature)
- Clustering graphs (Newman, 2008)
- Vector quantization to compress images (Bishop, PRML)

Ingredients of Cluster Analysis
- A dissimilarity/distance function between samples.
- A loss function to evaluate clusters.
- An algorithm that optimizes this loss function.

Dissimilarity/Distance Function
- Choice of the dissimilarity/distance function is application dependent.
- Need to consider the type of features: categorical, ordinal, or quantitative.
- Possible to learn dissimilarity from data.
Distance Function
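The formulas on this slide did not survive extraction. As a minimal sketch (assuming the usual Minkowski-family definitions, which the deck does not spell out here), common distance functions for quantitative features can be computed with SciPy:

```python
import numpy as np
from scipy.spatial import distance

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 0.0, 4.0])

# Minkowski family: p=2 gives Euclidean, p=1 gives Manhattan (city block).
print(distance.euclidean(x, y))       # sqrt(1 + 4 + 1) ~= 2.449
print(distance.cityblock(x, y))       # 1 + 2 + 1 = 4
print(distance.minkowski(x, y, p=3))  # general p-norm distance

# Cosine distance = 1 - cosine similarity; compares direction, not magnitude.
print(distance.cosine(x, y))
```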
Standardization (figure: clustering result without vs. with standardization)
Standardization is not always helpful (figure: without vs. with standardization)
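A minimal sketch of what standardization means here (the data and function below are illustrative assumptions, not from the slides): z-score each feature to zero mean and unit variance so that features on large scales do not dominate the distance.

```python
import numpy as np

def standardize(X):
    """Z-score each column: zero mean, unit variance."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0] = 1.0  # avoid division by zero for constant features
    return (X - mu) / sigma

# Feature 0 (e.g., income) dwarfs feature 1 (e.g., age) before scaling,
# so Euclidean distances would be driven almost entirely by feature 0.
X = np.array([[50000.0, 25.0], [52000.0, 60.0], [90000.0, 30.0]])
Xs = standardize(X)
print(Xs.mean(axis=0), Xs.std(axis=0))  # ~[0, 0] and [1, 1]
```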
Evaluation of Clustering
- Performance evaluation of clustering uses a validity index.
- Evaluation metrics:
  - Reference model (external index): compare the clustering against a reference partition.
  - Non-reference model (internal index): measure intra-cluster and inter-cluster distances.
External Index: Reference Model
For m samples there are m(m-1)/2 sample pairs. Each pair falls into one of four categories, depending on whether the two samples are placed in the same cluster by the clustering result and by the reference model:

                          reference: same    reference: different
  clustering: same               a                   b
  clustering: different          c                   d

with a + b + c + d = m(m-1)/2.
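The slide's own index formulas were lost, but standard external indices are built directly from these pair counts; as a hedged sketch (the example labels are made up), the Jaccard coefficient, Rand index, and Fowlkes-Mallows index can be computed like this:

```python
import itertools

def pair_counts(pred, ref):
    """Count the four pair categories (a, b, c, d) over all m(m-1)/2 pairs."""
    a = b = c = d = 0
    for i, j in itertools.combinations(range(len(pred)), 2):
        same_pred = pred[i] == pred[j]   # same cluster in the clustering result?
        same_ref = ref[i] == ref[j]      # same cluster in the reference?
        if same_pred and same_ref:       a += 1
        elif same_pred and not same_ref: b += 1
        elif not same_pred and same_ref: c += 1
        else:                            d += 1
    return a, b, c, d

pred = [0, 0, 1, 1, 1]
ref  = [0, 0, 0, 1, 1]
a, b, c, d = pair_counts(pred, ref)
jaccard = a / (a + b + c)                    # in [0, 1], larger is better
rand = (a + d) / (a + b + c + d)             # in [0, 1], larger is better
fm = (a / (a + b) * a / (a + c)) ** 0.5      # geometric mean of two precisions
print(jaccard, rand, fm)
```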
Non-reference Model
- Having only the result of clustering, how can we evaluate it?
- Intra-cluster similarity: larger is better.
- Inter-cluster similarity: smaller is better.
Internal Index
- Internal indices quantify this trade-off from the clustering result alone, combining within-cluster scatter and between-cluster separation.
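The specific internal-index formulas on this slide were lost. As one standard stand-in (the choice of scikit-learn, the silhouette coefficient, and the Davies-Bouldin index here is my assumption, not the deck's prescription):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, davies_bouldin_score

rng = np.random.default_rng(0)
# Two well-separated blobs: intra-cluster distances small, inter-cluster large.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(8, 1, (50, 2))])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

print(silhouette_score(X, labels))      # close to 1: compact and well separated
print(davies_bouldin_score(X, labels))  # lower is better
```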
K-means

K-means: Idea
- Represent each cluster k by a prototype mu_k; the responsibility r_ik = 1 if sample x_i is assigned to cluster k, and 0 otherwise.
- Loss function (the slide's formula was lost; this is the standard objective matching the (r_ik, mu_k) notation): J = sum_i sum_k r_ik ||x_i - mu_k||^2.

K-means: Minimizing the Loss Function
- How do we minimize J w.r.t. (r_ik, mu_k)? Chicken-and-egg problem:
  - If prototypes are known, we can assign responsibilities.
  - If responsibilities are known, we can compute prototypes.
- We use an iterative procedure; see the sketch below.
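A minimal NumPy sketch of this iterative procedure (variable names r and mu follow the slide's r_ik and mu_k; everything else, including random initialization, is an assumption):

```python
import numpy as np

def kmeans(X, K, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize prototypes with K randomly chosen data points.
    mu = X[rng.choice(len(X), size=K, replace=False)].copy()
    for _ in range(n_iters):
        # Assignment step: give each point to its nearest prototype.
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)  # (N, K)
        r = d2.argmin(axis=1)
        # Update step: each prototype becomes the mean of its assigned points.
        new_mu = np.array([X[r == k].mean(axis=0) if np.any(r == k) else mu[k]
                           for k in range(K)])
        if np.allclose(new_mu, mu):  # assignments/prototypes are stable
            break
        mu = new_mu
    # Final assignments and loss J = sum_i ||x_i - mu_{r_i}||^2.
    d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
    r = d2.argmin(axis=1)
    J = d2[np.arange(len(X)), r].sum()
    return r, mu, J

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(6, 1, (100, 2))])
r, mu, J = kmeans(X, K=2)
```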
K-means Algorithm

How do we initialize K-means? Some heuristics:
- Randomly pick K data points as prototypes.
- Pick prototype i+1 to be the farthest from prototypes {1, 2, ..., i}.

Evolution of K-means (figure: (a) original dataset; (b) random initialization; (c-f) two iterations of K-means; images from Michael Jordan. A companion plot shows the loss function J after each iteration.)

Convergence of K-means
- K-means is exactly coordinate descent on the reconstruction error E.
- E decreases monotonically, and the value of E converges; so do the clustering results.
- It is possible for K-means to oscillate between a few different clusterings, but this almost never happens in practice.
- E is non-convex, so coordinate descent on E cannot be guaranteed to converge to the global minimum. One common remedy is to run K-means many times and pick the best result.

How to choose K?
- Like choosing k in kNN: the loss function J generally decreases with K, so it cannot be used directly.
- Gap statistic.
- Cross-validation: partition the data into two sets; estimate prototypes on one and use them to compute the loss function on the other.
- Stability of clusters: measure the change in the clusters obtained by resampling or splitting the data.
- Non-parametric approach: place a prior on K (more details in the Bayesian non-parametrics lecture).

Limitations of K-means
- Hard assignments: a small perturbation can flip a data point to another cluster. Solution: GMM.
- Assumes spherical clusters and equal probabilities for each cluster. Solution: GMM.
- Clusters change arbitrarily for different K. Solution: hierarchical clustering.
- Sensitive to outliers. Solution: use a robust loss function.
- Works poorly on non-convex clusters. Solution: spectral clustering.

Multivariate Normal Distribution
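The density formula on this slide did not survive extraction; the standard d-dimensional form, which the deck's later GMM discussion relies on, is:

```latex
\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma})
  = \frac{1}{(2\pi)^{d/2} \, |\boldsymbol{\Sigma}|^{1/2}}
    \exp\!\Big( -\tfrac{1}{2} (\mathbf{x}-\boldsymbol{\mu})^{\top}
    \boldsymbol{\Sigma}^{-1} (\mathbf{x}-\boldsymbol{\mu}) \Big)
```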
Gaussian Mixture Model
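The mixture-model formulas here were also lost in extraction; the standard form (consistent with the K-means/GMM comparison later in the deck) is a convex combination of K Gaussian components, with log-likelihood over N samples:

```latex
p(\mathbf{x}) = \sum_{k=1}^{K} \pi_k \,
  \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k),
\qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1

\ln p(\mathbf{X} \mid \boldsymbol{\pi}, \boldsymbol{\mu}, \boldsymbol{\Sigma})
  = \sum_{n=1}^{N} \ln \Big\{ \sum_{k=1}^{K} \pi_k \,
    \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k) \Big\}
```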
The Learning is Hard
- With the component assignments unobserved, the log-likelihood has a sum over components inside the logarithm, so there is no closed-form maximum-likelihood solution.
How to Solve it?
The Expectation-Maximization (EM) Algorithm

The EM Algorithm in General
- Give a very general treatment of the EM algorithm.
- In the process, provide a proof that the EM algorithm derived heuristically before for Gaussian mixtures does indeed maximize the likelihood function.
- This discussion will also form the basis for the derivation of the variational inference framework.
The EM Algorithm in General
The decomposition on this slide was lost in extraction; the standard form (Bishop, PRML, Ch. 9), for observed data X, latent variables Z, parameters theta, and any distribution q(Z), is:

  ln p(X | theta) = L(q, theta) + KL(q || p),
  L(q, theta) = sum_Z q(Z) ln [ p(X, Z | theta) / q(Z) ],
  KL(q || p) = - sum_Z q(Z) ln [ p(Z | X, theta) / q(Z) ].

Since KL(q || p) >= 0, L(q, theta) is a lower bound on ln p(X | theta).
EM: Variational Viewpoint
- E step: maximize the bound L(q, theta_old) w.r.t. q(Z) with theta fixed. Maximizing over q(Z) would give the true posterior, q(Z) = p(Z | X, theta_old), which makes the KL term vanish.
- M step: maximize L(q, theta) w.r.t. theta with q(Z) fixed, giving theta_new; this also increases ln p(X | theta).
The EM Algorithm
The EM Algorithm (figure: initial configuration, then alternating E-steps and M-steps)
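A compact NumPy sketch of one EM iteration for a GMM with exactly these E and M steps (PRML-style updates; the code itself is my sketch, not the deck's):

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_step(X, pi, mu, sigma):
    """One EM iteration for a Gaussian mixture model."""
    N, K = len(X), len(pi)
    # E step: responsibilities gamma[n, k] = p(z_n = k | x_n), the true posterior.
    gamma = np.column_stack([
        pi[k] * multivariate_normal.pdf(X, mean=mu[k], cov=sigma[k])
        for k in range(K)
    ])
    gamma /= gamma.sum(axis=1, keepdims=True)
    # M step: re-estimate parameters from the soft assignments.
    Nk = gamma.sum(axis=0)               # effective number of points per cluster
    pi = Nk / N
    mu = (gamma.T @ X) / Nk[:, None]
    sigma = []
    for k in range(K):
        diff = X - mu[k]
        sigma.append((gamma[:, k, None] * diff).T @ diff / Nk[k])
    return pi, mu, np.array(sigma)
```

Iterating em_step until the log-likelihood stops improving reproduces the alternating picture in the figure above.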
Convergence
GMM: Relation to K-means (illustration)

K-means vs. GMM
- Loss function: K-means minimizes the sum of squared distances; GMM minimizes the negative log-likelihood.
- Assignment: K-means makes hard assignments of points to clusters; GMM makes soft assignments.
- Shape: K-means assumes spherical clusters with equal probability for each cluster; GMM can be used for non-spherical clusters with different probabilities.

K-medoids
Motivation:
- The squared Euclidean distance loss function of K-means is not robust.
- Sometimes only the dissimilarity matrix is given.
- Attributes may not be quantitative.
A sketch of the idea follows below.
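A minimal sketch of the K-medoids idea (a simple alternating variant; the deck's exact algorithm slide was lost, so treat this as illustrative): prototypes are restricted to actual data points, and only a precomputed dissimilarity matrix D is needed.

```python
import numpy as np

def k_medoids(D, K, n_iters=100, seed=0):
    """Cluster from a precomputed m x m dissimilarity matrix D."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), size=K, replace=False)
    for _ in range(n_iters):
        # Assign each point to its nearest medoid.
        labels = D[:, medoids].argmin(axis=1)
        # Update: each medoid becomes the in-cluster point that minimizes
        # the total dissimilarity to the other members of its cluster.
        new_medoids = medoids.copy()
        for k in range(K):
            members = np.flatnonzero(labels == k)
            if len(members):
                within = D[np.ix_(members, members)].sum(axis=1)
                new_medoids[k] = members[within.argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return labels, medoids
```

Because only D is used, any dissimilarity (including ones over non-quantitative attributes) works, and medoids are less sensitive to outliers than means.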
Hierarchical Clustering
- Organize the clusters in a hierarchical way.
- Produces a rooted binary tree (dendrogram).
- Bottom-up (agglomerative): recursively merge the two groups with the smallest between-cluster dissimilarity.
- Top-down (divisive): recursively split a least-coherent cluster (e.g., the one with the largest diameter).
- Users can then choose a cut through the hierarchy to represent the most natural division into clusters (e.g., where inter-group dissimilarity exceeds some threshold).
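A short SciPy sketch of agglomerative clustering and the threshold cut described above (the library, linkage method, and threshold value are my assumptions, not the deck's):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(5, 0.5, (20, 2))])

# Bottom-up: repeatedly merge the two closest groups (average linkage here).
Z = linkage(X, method="average")

# Cut the dendrogram where inter-group dissimilarity exceeds a threshold.
labels = fcluster(Z, t=2.5, criterion="distance")
print(np.unique(labels))  # expect two clusters for well-separated blobs
# scipy.cluster.hierarchy.dendrogram(Z) would draw the rooted binary tree.
```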
Density-based Clustering
DBSCAN [1]
- Two points p and q are density-connected if there is a point o such that both p and q are density-reachable from o.
- A cluster satisfies two properties:
  - All points within the cluster are mutually density-connected.
  - If a point is density-reachable from any point of the cluster, it is part of the cluster as well.

[1] Ester et al. A density-based algorithm for discovering clusters in large spatial databases with noise. Proceedings of the Second International Conference on Knowledge Discovery and Data Mining (KDD), 1996.
Analysis of DBSCAN
Advantages:
- No need to specify the number of clusters.
- Can find clusters of arbitrary shape.
- Robust to outliers.
Disadvantages:
- Parameter selection is difficult.
- Not appropriate for datasets with large differences in density.
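A brief scikit-learn sketch of DBSCAN on a non-convex dataset, making the two parameters whose selection the slide calls difficult explicit (the dataset and parameter values are illustrative assumptions):

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_moons

# Two interleaving half-circles: non-convex clusters where K-means fails.
X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

# eps: neighborhood radius; min_samples: points needed to form a dense region.
db = DBSCAN(eps=0.2, min_samples=5).fit(X)
labels = db.labels_       # -1 marks noise/outlier points
print(np.unique(labels))  # expect clusters 0 and 1 (plus possibly -1)
```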
Mean-Shift Clustering [2]

[2] Fukunaga, Keinosuke; Hostetler, Larry D. The Estimation of the Gradient of a Density Function, with Applications in Pattern Recognition. IEEE Transactions on Information Theory 21(1):32-40, Jan. 1975. Cheng, Yizong. Mean Shift, Mode Seeking, and Clustering. IEEE Transactions on Pattern Analysis and Machine Intelligence 17(8):790-799, 1995.
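The section cuts off here. As a hedged sketch of the idea from the cited papers (each point is shifted toward the local kernel-weighted mean until it converges to a density mode; points that reach the same mode form a cluster), scikit-learn provides an implementation:

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.6, (60, 2)), rng.normal(5, 0.6, (60, 2))])

# The bandwidth is the kernel scale; no number of clusters K is specified.
bw = estimate_bandwidth(X, quantile=0.2)
ms = MeanShift(bandwidth=bw).fit(X)
print(len(np.unique(ms.labels_)), "modes found")  # expect 2
```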