




Jiafeng Guo
Unsupervised Learning — Clustering

Outline
- Introduction
- Applications of Clustering
- Distance Functions
- Evaluation Metrics
- Clustering Algorithms
  - K-means
  - Gaussian Mixture Models and EM Algorithm
  - K-medoids
  - Hierarchical Clustering
  - Density-based Clustering

Supervised vs. Unsupervised Learning
Why do Unsupervised Learning?
- Raw data is cheap; labeled data is expensive.
- Save memory/computation.
- Reduce noise in high-dimensional data.
- Useful in exploratory data analysis.
- Often a pre-processing step for supervised learning.

Cluster Analysis
- Discover groups such that samples within a group are more similar to each other than samples across groups.

Unobserved Variables
- A variable can be unobserved (latent):
  - It is an imaginary quantity meant to provide some simplified and abstractive view of the data generation process. E.g., speech recognition models, mixture models.
  - It is a real-world object and/or phenomenon, but difficult or impossible to measure. E.g., the temperature of a star, causes of a disease, evolutionary ancestors.
  - It is a real-world object and/or phenomenon, but sometimes was not measured because of faulty sensors, or was measured with a noisy channel, etc. E.g., traffic radio, aircraft signal on a radar screen.
- Discrete latent variables can be used to partition/cluster data into sub-groups.
- Continuous latent variables can be used for dimensionality reduction.

Outline (section divider)

Applications of Clustering
- Image segmentation (/pff/segment)
- Human population structure (Eran Elhaik et al., Nature)
- Clustering graphs (Newman, 2008)
- Vector quantization to compress images (Bishop, PRML)

Ingredients of Cluster Analysis
- A dissimilarity/distance function between samples.
- A loss function to evaluate clusters.
- An algorithm that optimizes this loss function.

Outline (section divider)

Dissimilarity/Distance Function
- Choice of dissimilarity/distance function is application dependent.
- Need to consider the type of features: categorical, ordinal or quantitative.
- Possible to learn dissimilarity from data.
Distance Function
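The formulas on the distance-function slide are not preserved in this extraction; as an illustration only, here is a minimal sketch of three common choices for quantitative feature vectors (Euclidean, Manhattan, and cosine dissimilarity), assuming NumPy arrays; the function names and sample vectors are made up for these notes.

```python
import numpy as np

def euclidean(x, y):
    # L2 distance: sqrt of the sum of squared coordinate differences
    return np.sqrt(np.sum((x - y) ** 2))

def manhattan(x, y):
    # L1 distance: sum of absolute coordinate differences
    return np.sum(np.abs(x - y))

def cosine_dissimilarity(x, y):
    # 1 - cosine similarity; small when the vectors point in similar directions
    return 1.0 - np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

x, y = np.array([1.0, 2.0, 3.0]), np.array([2.0, 0.0, 3.0])
print(euclidean(x, y), manhattan(x, y), cosine_dissimilarity(x, y))
```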
Standardization
[Figure: clustering result without standardization vs. with standardization]

Standardization not always helpful
[Figure: clustering result without standardization vs. with standardization]
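As a brief illustration of the standardization being compared in the figures above, a minimal z-score scaling sketch; the data matrix and feature interpretations are made up for these notes.

```python
import numpy as np

X = np.array([[170.0, 60000.0],   # illustrative features on very different scales
              [160.0, 90000.0],
              [180.0, 30000.0]])

# z-score standardization: each feature gets zero mean and unit variance,
# so no single feature dominates Euclidean distances purely by its scale
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
print(X_std)
```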
Outline (section divider)

Evaluation of Clustering
- Performance evaluation of clustering: validity index.
- Evaluation metrics:
  - Reference model (external index): compare with a reference partition.
  - Non-reference model (internal index): measure intra-cluster and inter-cluster distances.
Reference Model
For the m(m-1)/2 pairs of samples, compare whether each pair is placed in the same group by the clustering and by the reference partition:

                         reference: same    reference: not same
  clustering: same              a                   b
  clustering: not same          c                   d
External Index
- Computed from the pair counts a, b, c, d above.

Non-reference Model
- Only having the result of clustering, how can we evaluate it?
- Intra-cluster similarity: larger is better.
- Inter-cluster similarity: smaller is better.
Non-reference Model
Internal Index
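To make the two evaluation styles concrete, here is a short sketch written for these notes (not taken from the slides): one external index computed from the pair counts a, b, c, d (the Rand and Jaccard formulas are the standard ones) and one simple internal measure based on intra- vs. inter-cluster distances.

```python
import numpy as np
from itertools import combinations

def pair_counts(pred, ref):
    # Count sample pairs by agreement between clustering `pred` and reference `ref`
    a = b = c = d = 0
    for i, j in combinations(range(len(pred)), 2):
        same_pred = pred[i] == pred[j]
        same_ref = ref[i] == ref[j]
        if same_pred and same_ref:
            a += 1
        elif same_pred:
            b += 1
        elif same_ref:
            c += 1
        else:
            d += 1
    return a, b, c, d

def rand_index(pred, ref):
    a, b, c, d = pair_counts(pred, ref)
    # a + b + c + d = m(m-1)/2, so this equals 2(a+d) / (m(m-1))
    return (a + d) / (a + b + c + d)

def jaccard_index(pred, ref):
    a, b, c, d = pair_counts(pred, ref)
    return a / (a + b + c)

def intra_inter(X, labels):
    # Internal measure: mean within-cluster distance (want small, i.e. high
    # intra-cluster similarity) vs. mean between-cluster distance (want large)
    intra, inter = [], []
    for i, j in combinations(range(len(X)), 2):
        dist = np.linalg.norm(X[i] - X[j])
        (intra if labels[i] == labels[j] else inter).append(dist)
    return np.mean(intra), np.mean(inter)
```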
Outline (section divider)

K-means: Idea
K-means: minimizing the loss function
- How do we minimize J w.r.t. (r_ik, μ_k)?
- Chicken-and-egg problem:
  - If prototypes are known, we can assign responsibilities.
  - If responsibilities are known, we can compute prototypes.
- We use an iterative procedure.
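A minimal sketch of this iterative procedure (written for these notes, not the lecture's code), assuming the usual K-means objective J = Σ_i Σ_k r_ik ‖x_i − μ_k‖² with hard responsibilities r_ik; the function name and defaults are assumptions.

```python
import numpy as np

def kmeans(X, K, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # Initialization heuristic: randomly pick K data points as prototypes
    mu = X[rng.choice(len(X), size=K, replace=False)]
    for _ in range(n_iters):
        # Assignment step: give each point to its nearest prototype (responsibilities)
        dists = np.linalg.norm(X[:, None, :] - mu[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # Update step: recompute each prototype as the mean of its assigned points
        new_mu = np.array([X[assign == k].mean(axis=0) if np.any(assign == k) else mu[k]
                           for k in range(K)])
        if np.allclose(new_mu, mu):
            break
        mu = new_mu
    J = np.sum((X - mu[assign]) ** 2)   # reconstruction error / loss
    return assign, mu, J
```

Because the loss is non-convex, a common practice (noted on the following slides) is to run this from several random initializations and keep the run with the lowest J.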
K-means Algorithm

How do we initialize K-means? Some heuristics:
- Randomly pick K data points as prototypes.
- Pick prototype i+1 to be the farthest from prototypes {1, 2, …, i}.

Evolution of k-means
- (a) original dataset; (b) random initialization; (c–f) illustration of running two iterations of k-means. (Images from Michael Jordan)
- Loss function J after each iteration.

Convergence of K-means
- k-means is exactly coordinate descent on the reconstruction error E.
- E monotonically decreases; the value of E converges, and so do the clustering results.
- It is possible for k-means to oscillate between a few different clusterings, but this almost never happens in practice.
- E is non-convex, so coordinate descent on E is not guaranteed to converge to the global minimum. One common remedy is to run k-means many times and pick the best result.

How to choose K?
- Like choosing K in kNN: the loss function J generally decreases with K.
- Gap statistic.
- Cross-validation: partition the data into two sets; estimate prototypes on one and use these to compute the loss function on the other.
- Stability of clusters: measure the change in the clusters obtained by resampling or splitting the data.
- Non-parametric approach: place a prior on K (more details in the Bayesian non-parametrics lecture).

Limitations of K-means
- Hard assignments of data points to clusters mean that a small perturbation to a data point can flip it to another cluster. Solution: GMM.
- Assumes spherical clusters and equal probabilities for each cluster. Solution: GMM.
- Clusters change arbitrarily for different K. Solution: hierarchical clustering.
- Sensitive to outliers. Solution: use a robust loss function.
- Works poorly on non-convex clusters. Solution: spectral clustering.

Outline (section divider)

Multivariate Normal Distribution
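The density formula on the multivariate normal slide is not preserved in this copy; as a standard reconstruction (with x ∈ R^D, mean μ, and covariance Σ, symbols assumed here rather than taken from the slide) it reads:

```latex
\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma})
  = \frac{1}{(2\pi)^{D/2} \, |\boldsymbol{\Sigma}|^{1/2}}
    \exp\!\left( -\tfrac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^{\top}
                 \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right)
```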
Gaussian Mixture Model
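Likewise, the mixture density and its log-likelihood are reconstructed here in their standard form (cf. Bishop, PRML) rather than copied from the slide; π_k denotes the mixing weights:

```latex
p(\mathbf{x}) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k),
\qquad \sum_{k=1}^{K} \pi_k = 1,\ \pi_k \ge 0,
\\[4pt]
\ln p(\mathbf{X} \mid \boldsymbol{\pi}, \boldsymbol{\mu}, \boldsymbol{\Sigma})
  = \sum_{n=1}^{N} \ln \left\{ \sum_{k=1}^{K} \pi_k \,
    \mathcal{N}(\mathbf{x}_n \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k) \right\}
```

The logarithm of a sum over components is what makes direct maximum-likelihood learning awkward here, which is the point of the next slide.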
The Learning is Hard
How to Solve it?
The Expectation-Maximization (EM) Algorithm

The EM Algorithm in General
- A very general treatment of the EM algorithm.
- In the process, we provide a proof that the EM algorithm derived heuristically before for Gaussian mixtures does indeed maximize the likelihood function.
- This discussion will also form the basis for the derivation of the variational inference framework.
EM: Variational Viewpoint
- Maximizing over q(Z) would give the true posterior.
- E Step; M Step (illustrations).
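The decomposition behind this variational viewpoint is not preserved in the extraction; the standard form (as in Bishop, PRML, Ch. 9, reconstructed here) that the two steps operate on is:

```latex
\ln p(\mathbf{X} \mid \boldsymbol{\theta})
  = \mathcal{L}(q, \boldsymbol{\theta})
    + \mathrm{KL}\!\left(q \,\|\, p(\mathbf{Z} \mid \mathbf{X}, \boldsymbol{\theta})\right),
\qquad
\mathcal{L}(q, \boldsymbol{\theta})
  = \sum_{\mathbf{Z}} q(\mathbf{Z}) \ln \frac{p(\mathbf{X}, \mathbf{Z} \mid \boldsymbol{\theta})}{q(\mathbf{Z})}
```

The E step maximizes L(q, θ) over q(Z) with θ fixed, which sets q to the true posterior and drives the KL term to zero; the M step maximizes L(q, θ) over θ with q fixed.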
The EM Algorithm
- [Figure panels: initial configuration, E-step, M-step]
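A compact sketch of EM for a Gaussian mixture (an illustration written for these notes, not the lecture's code), with the E step computing responsibilities and the M step re-estimating π_k, μ_k, Σ_k; it uses scipy.stats.multivariate_normal for the component densities.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # Initialization: random data points as means, identity covariances, uniform weights
    mu = X[rng.choice(N, size=K, replace=False)]
    Sigma = np.array([np.eye(D) for _ in range(K)])
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iters):
        # E step: responsibilities gamma[n, k] = p(z_n = k | x_n)
        dens = np.column_stack([pi[k] * multivariate_normal.pdf(X, mu[k], Sigma[k])
                                for k in range(K)])
        gamma = dens / dens.sum(axis=1, keepdims=True)
        # M step: re-estimate parameters from the weighted sufficient statistics
        Nk = gamma.sum(axis=0)
        pi = Nk / N
        mu = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(D)
    return pi, mu, Sigma, gamma
```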
Convergence
GMM: Relation to K-means

Illustration: K-means vs. GMM

K-means vs. GMM
- K-means: loss function minimizes the sum of squared distances; hard assignment of points to clusters; assumes spherical clusters with equal probability for each cluster.
- GMM: minimizes the negative log-likelihood; soft assignment of points to clusters; can be used for non-spherical clusters with different probabilities.

Outline (section divider)

K-medoids
- The squared Euclidean distance loss function of K-means is not robust.
- Only the dissimilarity matrix may be given.
- Attributes may not be quantitative.
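As a hedged illustration of the K-medoids idea motivated above, here is a minimal PAM-style alternating update written for these notes: it works directly from a dissimilarity matrix D, so prototypes are actual data points (medoids) rather than means; the function name and stopping rule are assumptions.

```python
import numpy as np

def k_medoids(D, K, n_iters=100, seed=0):
    # D: (N, N) pairwise dissimilarity matrix; returns assignments and medoid indices
    rng = np.random.default_rng(seed)
    N = D.shape[0]
    medoids = rng.choice(N, size=K, replace=False)
    for _ in range(n_iters):
        # Assignment step: each point joins its closest medoid
        assign = D[:, medoids].argmin(axis=1)
        # Update step: within each cluster, pick the point with smallest total dissimilarity
        new_medoids = medoids.copy()
        for k in range(K):
            members = np.where(assign == k)[0]
            if len(members) > 0:
                costs = D[np.ix_(members, members)].sum(axis=1)
                new_medoids[k] = members[costs.argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return assign, medoids
```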
K-medoids

Outline (section divider)

Hierarchical Clustering
- Organize the clusters in a hierarchical way.
- Produces a rooted binary tree (dendrogram).
- Bottom-up (agglomerative): recursively merge the two groups with the smallest between-cluster dissimilarity (i.e., the most similar groups).
- Top-down (divisive): recursively split a least-coherent (e.g., largest-diameter) cluster.
- Users can then choose a cut through the hierarchy to represent the most natural division into clusters (e.g., where the inter-group dissimilarity exceeds some threshold).
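A short sketch of the agglomerative (bottom-up) variant described above, using SciPy to build the dendrogram from pairwise distances and then cut it; the choice of average linkage, the toy data, and the cut threshold are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

X = np.random.default_rng(0).normal(size=(20, 2))   # toy data

# Bottom-up merging: repeatedly join the two closest clusters (average linkage)
Z = linkage(pdist(X), method="average")

# Cut the dendrogram where the merge distance exceeds a threshold
labels = fcluster(Z, t=1.5, criterion="distance")
print(labels)
```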
Hierarchical Clustering

Outline (section divider)
DBSCAN
[1] Ester et al. A density-based algorithm for discovering clusters in large spatial databases with noise. Proceedings of the Second International Conference on Knowledge Discovery and Data Mining (KDD), 1996.

- Two points p and q are density-connected if there is a point o such that both p and q are reachable from o.
- A cluster satisfies two properties:
  - All points within the cluster are mutually density-connected.
  - If a point is density-reachable from any point of the cluster, it is part of the cluster as well.
Analysis of DBSCAN
- Advantages:
  - No need to specify the number of clusters.
  - Finds clusters of arbitrary shape.
  - Robust to outliers.
- Disadvantages:
  - Difficult parameter selection.
  - Not suitable for datasets with large differences in densities.
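The two parameters behind the "difficult parameter selection" point are the neighborhood radius (eps) and the minimum number of neighbors for a core point (min_samples); a brief usage sketch with scikit-learn, written for these notes with made-up data and parameter values:

```python
import numpy as np
from sklearn.cluster import DBSCAN

X = np.random.default_rng(0).normal(size=(100, 2))   # toy data

# eps: neighborhood radius; min_samples: neighbors required for a core point
labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(X)
print(np.unique(labels))   # label -1 marks points treated as noise/outliers
```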
Mean-Shift Clustering
[2] Fukunaga, Keinosuke; Larry D. Hostetler. The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Transactions on Information Theory 21(1): 32–40, Jan. 1975.
Cheng, Yizong. Mean shift, mode seeking, and clustering. IEEE Transactions on Pattern Analysis and Machine Intelligence 17(8): 790–799, 1995.
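The slide content beyond these references is cut off in this copy; for completeness, the standard mean-shift update from this line of work, which repeatedly moves a point to the kernel-weighted mean of its neighborhood, is reconstructed below (the kernel K and bandwidth h are symbols introduced here, not taken from the slide):

```latex
\mathbf{x}^{(t+1)}
  \;=\;
  \frac{\sum_{i} K\!\left(\frac{\mathbf{x}_i - \mathbf{x}^{(t)}}{h}\right) \mathbf{x}_i}
       {\sum_{i} K\!\left(\frac{\mathbf{x}_i - \mathbf{x}^{(t)}}{h}\right)}
```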