Segmentation using eigenvectors (lecture slides)
Image Segmentation

How do you pick the right segmentation?
- Bottom-up segmentation: tokens belong together because they are locally coherent.
- Top-down segmentation: tokens are grouped together because they lie on the same object.

"Correct" segmentation
- There may not be a single correct answer; partitioning is inherently hierarchical.
- One approach, used throughout this presentation: use the low-level coherence of brightness, color, texture or motion attributes to come up with partitions.

Outline
- Introduction
- Graph terminology and representation
- "Min cuts" and "Normalized cuts"
- Other segmentation methods using eigenvectors
- Conclusions

Graph-based Image Segmentation
- Image (I) → graph affinities (W), computed from intensity, color, edges and texture.
- Graph affinities (W) → eigenvectors X(W).
- Eigenvectors X(W) → discretization into segments.
(Slides from Timothee Cour)

Graph-based Image Segmentation
- G = (V, E): V are the graph nodes (pixels), E are the edges connecting them, weighted by pixel similarity.
(Slides from Jianbo Shi)

Graph terminology
- Similarity (affinity) matrix W: w(i, j) is the similarity between nodes i and j.
- Viewed as an image, a row of W shows the similarity of every pixel to a selected pixel (brighter means more similar). Reshaping an N×M image gives an (N·M)×(N·M) matrix; warning: the size of W is quadratic in the number of pixels!
- Degree of a node: d(i) = Σ_j w(i, j).
- Volume of a set: vol(A) = Σ_{i∈A} d(i), for A ⊆ V.
- Cut in a graph: cut(A, B) = Σ_{i∈A, j∈B} w(i, j).

Representation
- Partition matrix X, pair-wise similarity matrix W, degree matrix D (diagonal, with D_ii = d(i)), and Laplacian matrix L = D − W.
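These quantities are easy to compute directly from W. The sketch below is ours, not from the slides, and assumes a small, dense, symmetric NumPy affinity matrix.

```python
# Minimal helpers for the graph quantities defined above, assuming W is a
# dense symmetric NumPy array of non-negative pairwise affinities.
import numpy as np

def degree(W):
    """d(i) = sum_j w(i, j) for every node i."""
    return W.sum(axis=1)

def volume(W, A):
    """vol(A) = sum of the degrees of the nodes indexed by A."""
    return degree(W)[A].sum()

def cut(W, A, B):
    """cut(A, B) = total weight of the edges crossing between A and B."""
    return W[np.ix_(A, B)].sum()

# Degree matrix D and Laplacian L = D - W for a tiny 4-node graph.
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(degree(W))
L = D - W

A, B = [0, 1], [2, 3]
print(cut(W, A, B), volume(W, A), volume(W, B))   # 2.0 4.0 4.0
```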

Pixel similarity functions
- Intensity: w(i, j) = exp(−|I(i) − I(j)|² / (2σ_I²)).
- Distance: w(i, j) = exp(−‖x(i) − x(j)‖² / (2σ_X²)).
- Texture: w(i, j) = exp(−‖c(i) − c(j)‖² / (2σ_c²)), where c(x) is a vector of filter outputs. A natural thing to do is to square the outputs of a range of different filters at different scales and orientations, smooth the result, and stack these into a vector.

Definitions
- Methods that use the spectrum of the affinity matrix to cluster are known as spectral clustering.
- Normalized cuts, average cuts and average association all make use of the eigenvectors of the affinity matrix.
- Why do these methods work?
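The slides leave the affinity as formulas; the sketch below is a rough Shi-Malik-style construction that combines intensity similarity with spatial proximity for a small grayscale image. The parameter names sigma_i, sigma_x and radius are illustrative choices, not values from the slides.

```python
# A rough Shi-Malik-style affinity (ours): intensity similarity times spatial
# proximity, zeroed outside a small radius.
import numpy as np

def pixel_affinity(img, sigma_i=0.1, sigma_x=4.0, radius=5):
    """Return the (N*M) x (N*M) affinity matrix W for a small grayscale image."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
    vals = img.ravel().astype(float)

    # Squared spatial and intensity distances between every pair of pixels.
    d_x = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    d_i = (vals[:, None] - vals[None, :]) ** 2

    W = np.exp(-d_i / (2 * sigma_i ** 2)) * np.exp(-d_x / (2 * sigma_x ** 2))
    W[d_x > radius ** 2] = 0.0        # keep only nearby pixels, so W stays sparse
    np.fill_diagonal(W, 0.0)
    return W

# As warned above, W has (N*M)^2 entries, so this dense version only makes
# sense for small images.
W = pixel_affinity(np.random.rand(20, 20))
print(W.shape)   # (400, 400)
```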

Spectral Clustering
- From data to similarities: build a similarity matrix and cluster using its spectrum.
(* Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group, Stanford University)

Eigenvectors and blocks
- Block matrices have block eigenvectors: for a two-block affinity matrix the eigensolver returns λ1 = 2, λ2 = 2, λ3 = 0, λ4 = 0, with eigenvectors that are constant within each block.
- Near-block matrices have near-block eigenvectors: the perturbed spectrum stays close to the original (λ1 = 2.02, λ2 = 2.02, λ3 = −0.02, λ4 = −0.02) and the leading eigenvectors are still nearly constant within each block.

Spectral Space
- Items can be put into blocks using their eigenvector coordinates (e1, e2).
- The clusters are clear regardless of row ordering.
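A small numerical check of this claim (our own illustration; the perturbation differs from the one on the slides, so the eigenvalues differ slightly too):

```python
# A numerical check (illustrative, not from the slides): a block affinity
# matrix has block eigenvectors, and a nearly-block matrix has nearly-block
# eigenvectors with a nearby spectrum.
import numpy as np

block = np.array([[1., 1., 0., 0.],
                  [1., 1., 0., 0.],
                  [0., 0., 1., 1.],
                  [0., 0., 1., 1.]])

# Perturb the zero entries slightly so the matrix is only "nearly" block.
near_block = block.copy()
near_block[block == 0] = 0.01

for name, M in [("block", block), ("near-block", near_block)]:
    vals, vecs = np.linalg.eigh(M)            # symmetric eigensolver
    order = np.argsort(vals)[::-1]            # sort eigenvalues descending
    print(name, np.round(vals[order], 3))     # close to [2, 2, 0, 0] in both cases
    print(np.round(vecs[:, order[:2]], 3))    # top-2 eigenvectors: ~constant per block
# Thresholding the top eigenvectors recovers the two blocks regardless of how
# the rows happen to be ordered.
```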

How do we extract a good cluster?
- Simplest idea: we want a vector x giving the association between each element and a cluster.
- We want elements within this cluster to, on the whole, have strong affinity with one another.
- We could maximize xᵀWx, but we need the constraint xᵀx = 1.
- This is an eigenvalue problem: choose the eigenvector of W with the largest eigenvalue.
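A minimal sketch of that relaxation (ours), assuming a symmetric non-negative affinity matrix W:

```python
# Maximize x^T W x subject to x^T x = 1 by taking the eigenvector of W with
# the largest eigenvalue.
import numpy as np

def strongest_cluster_indicator(W):
    vals, vecs = np.linalg.eigh(W)        # W is symmetric, so eigh applies
    x = vecs[:, np.argmax(vals)]          # eigenvector with the largest eigenvalue
    return -x if x.sum() < 0 else x       # fix the arbitrary overall sign

# Nodes 0, 1, 2 form a tight group; node 3 is only weakly attached.
W = np.array([[0.0, 0.9, 0.8, 0.1],
              [0.9, 0.0, 0.7, 0.0],
              [0.8, 0.7, 0.0, 0.1],
              [0.1, 0.0, 0.1, 0.0]])
x = strongest_cluster_indicator(W)
print(np.round(x, 2))   # the large entries mark the tightly associated nodes 0, 1, 2
```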

Criterion for partition: minimum cut
- First proposed by Wu and Leahy: choose the partition (A, B) that minimizes cut(A, B).
- Problem: the weight of the cut is directly proportional to the number of edges in the cut, so minimum cut favours cutting off small sets of isolated nodes.

Normalized Cut
- The normalized cut (or balanced cut) finds a better cut by normalizing each term by the volume of its segment.
- Volume of a set (or association): assoc(A, V) = Σ_{i∈A, j∈V} w(i, j) = vol(A).
- Normalized cut, "a fraction of the total edge connections to all the nodes in the graph":
  Ncut(A, B) = cut(A, B) / assoc(A, V) + cut(A, B) / assoc(B, V).
- Normalized association, "how tightly on average nodes within the cluster are connected to each other":
  Nassoc(A, B) = assoc(A, A) / assoc(A, V) + assoc(B, B) / assoc(B, V).
- The criterion is optimized subject to A ∪ B = V and A ∩ B = ∅.
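A small helper (ours, not from the slides) that evaluates these definitions for a concrete bipartition; it also illustrates numerically that Ncut(A, B) + Nassoc(A, B) = 2, which the observations below rely on.

```python
# Evaluate Ncut and Nassoc for a given bipartition of a dense symmetric W.
import numpy as np

def ncut_and_nassoc(W, A, B):
    assoc = lambda X, Y: W[np.ix_(X, Y)].sum()   # assoc(X, Y) = sum of w(i, j)
    V = list(range(W.shape[0]))
    cut_ab = assoc(A, B)
    ncut = cut_ab / assoc(A, V) + cut_ab / assoc(B, V)
    nassoc = assoc(A, A) / assoc(A, V) + assoc(B, B) / assoc(B, V)
    return ncut, nassoc

W = np.array([[0.0, 1.0, 1.0, 0.1],
              [1.0, 0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0, 0.1],
              [0.1, 0.0, 0.1, 0.0]])
ncut, nassoc = ncut_and_nassoc(W, A=[0, 1, 2], B=[3])
print(round(ncut, 3), round(nassoc, 3), round(ncut + nassoc, 3))   # ncut + nassoc == 2
```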

Observations (I)
- Maximizing Nassoc is the same as minimizing Ncut, since they are related by Ncut(A, B) = 2 − Nassoc(A, B).
- How to minimize Ncut? Transform the Ncut equation into matrix form; after simplifying it becomes a Rayleigh quotient:
  min_y yᵀ(D − W)y / (yᵀDy), subject to yᵀD1 = 0, with y_i ∈ {1, −b} for some constant b.
- This discrete problem is NP-hard, because the values of y are quantized.
- Instead, relax y into the continuous domain and solve a generalized eigenvalue system.

Observations (II)
- The relaxed problem is (D − W)y = λDy.
- Note that (D − W)1 = 0, so the first eigenvector is y0 = 1 with eigenvalue 0.
- The eigenvector with the second smallest eigenvalue is the real-valued solution to the relaxed problem.
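A dense sketch of this relaxation using SciPy's generalized symmetric eigensolver (an implementation choice, not the authors' code):

```python
# Solve (D - W) y = lambda D y and keep the eigenvector with the second
# smallest eigenvalue.
import numpy as np
from scipy.linalg import eigh

def second_smallest_generalized_eigvec(W):
    d = W.sum(axis=1)
    D = np.diag(d)
    vals, vecs = eigh(D - W, D)   # generalized symmetric problem, ascending order
    # vals[0] is 0 with the constant eigenvector y0 = 1 (up to scale); take the next one.
    return vals[1], vecs[:, 1]

W = np.array([[0.0, 1.0, 1.0, 0.1],
              [1.0, 0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0, 0.1],
              [0.1, 0.0, 0.1, 0.0]])
lam, y = second_smallest_generalized_eigvec(W)
print(round(lam, 3), y > 0)   # the sign of y splits node 3 off from the tight triangle
                              # (the overall sign of y is arbitrary)
```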

Algorithm
- Define a similarity function between two nodes (e.g. the pixel similarity functions above).
- Compute the affinity matrix W and the degree matrix D.
- Solve (D − W)y = λDy.
- Use the eigenvector with the second smallest eigenvalue to bipartition the graph.
- Decide whether to re-partition the current partitions.
- Note: since the precision requirements are low, W is very sparse, and only a few eigenvectors are required, the eigenvectors can be extracted very quickly using the Lanczos algorithm.

Discretization
- There is sometimes no clear threshold to binarize on, since the eigenvectors take on continuous values. How do we choose the splitting point? (See the sketch below.)
  - Pick a constant value (0, or 0.5).
  - Pick the median value as the splitting point.
  - Look for the splitting point with the minimum Ncut value: choose n possible splitting points, compute the Ncut value for each, and pick the minimum.
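A sparse sketch of this recipe plus the splitting-point search (the helper names are ours). Rather than shift-inverting the singular matrix D − W, it solves the equivalent symmetric problem D^(−1/2) W D^(−1/2) z = (1 − λ) z, whose extremal eigenvalues Lanczos (SciPy's eigsh) finds quickly, and then recovers y = D^(−1/2) z.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def second_eigvec_sparse(W):
    """Second smallest generalized eigenvector of (D - W) y = lambda D y."""
    d = np.asarray(W.sum(axis=1)).ravel()
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(d))
    N = d_inv_sqrt @ W @ d_inv_sqrt
    vals, vecs = eigsh(N, k=2, which="LA")   # two largest eigenvalues of N
    z = vecs[:, np.argsort(vals)[0]]         # second largest of N <-> second smallest lambda
    return d_inv_sqrt @ z                    # map back: y = D^(-1/2) z

def ncut_value(W, mask):
    """Ncut of the bipartition given by a boolean mask."""
    Wd = W.toarray() if sp.issparse(W) else W
    cut = Wd[np.ix_(mask, ~mask)].sum()
    return cut / Wd[mask].sum() + cut / Wd[~mask].sum()

def best_split(W, y, n_candidates=20):
    """Try n candidate splitting points on y and keep the one with minimum Ncut."""
    thresholds = np.linspace(y.min(), y.max(), n_candidates + 2)[1:-1]
    masks = [y > t for t in thresholds]
    return min(masks, key=lambda m: ncut_value(W, m))

# Toy usage: two cliques joined by weak edges.
A = np.ones((4, 4)) - np.eye(4)
W = sp.csr_matrix(np.block([[A, 0.05 * np.ones((4, 4))],
                            [0.05 * np.ones((4, 4)), A]]))
y = second_eigvec_sparse(W)
print(best_split(W, y).astype(int))   # e.g. [1 1 1 1 0 0 0 0] (sign may be flipped)
```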

Use k eigenvectors
- Recursive 2-way Ncut is slow. We can use more eigenvectors to re-partition the graph; however, not all eigenvectors are useful for partitioning (degree of smoothness).
- Procedure: compute k-means with a high k, then follow one of these procedures (a sketch of this variant follows below):
  - Merge segments that minimize the k-way Ncut criterion.
  - Use the k segments and find the partitions there using exhaustive search.
  - Compute Q (next slides).
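A hedged sketch of this k-eigenvector variant, using scikit-learn's KMeans for the final grouping (an implementation choice, not something prescribed by the slides):

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def spectral_segments(W, k):
    """Embed nodes with k generalized eigenvectors, then cluster with k-means."""
    D = np.diag(W.sum(axis=1))
    _, vecs = eigh(D - W, D)          # (D - W) y = lambda D y, ascending eigenvalues
    embedding = vecs[:, 1:k + 1]      # skip the constant eigenvector y0
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)

# Toy usage: three well-separated groups of points on a line, Gaussian affinity.
rng = np.random.default_rng(0)
pts = np.concatenate([rng.normal(m, 0.02, 10) for m in (0.0, 0.5, 1.0)])
W = np.exp(-(pts[:, None] - pts[None, :]) ** 2 / (2 * 0.1 ** 2))
np.fill_diagonal(W, 0.0)
print(spectral_segments(W, k=3))   # indices 0-9, 10-19 and 20-29 get three distinct labels
```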

Toy examples
(Images from Matthew Brand, TR-2002-42)

Example (I)
- [Figure: eigenvectors and the resulting segments.]

Example (II)
- [Figure: original image and the resulting segments.]
(* Slide from Khurram Hassan-Shafique, CAP5415 Computer Vision, 2003)

Other methods: average association
- Use the eigenvector of W associated with the biggest eigenvalue for partitioning.
- Tries to maximize assoc(A, A)/|A| + assoc(B, B)/|B|.
- Has a bias to find tight clusters; useful for Gaussian distributions.

Other methods: average cut
- Tries to minimize cut(A, B)/|A| + cut(A, B)/|B|.
- Very similar to normalized cuts, but we cannot ensure that the partitions will have a tight within-group similarity, since this equation does not have the nice properties of the normalized-cut equation.

Comparison on a toy example (a reconstruction is sketched below)
- 20 points are randomly distributed from 0.0 to 0.5 and 12 points are randomly distributed from 0.65 to 1.0.
- [Figure: partitions found by normalized cut, average cut and average association.]
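An illustrative reconstruction of this toy comparison (ours, with an assumed Gaussian affinity, sigma = 0.1), contrasting the normalized-cut eigenvector with the leading eigenvector of W used by average association:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
pts = np.sort(np.concatenate([rng.uniform(0.0, 0.5, 20),
                              rng.uniform(0.65, 1.0, 12)]))
W = np.exp(-(pts[:, None] - pts[None, :]) ** 2 / (2 * 0.1 ** 2))
np.fill_diagonal(W, 0.0)
D = np.diag(W.sum(axis=1))

# Normalized cut: second smallest generalized eigenvector of (D - W) y = lambda D y.
_, vecs = eigh(D - W, D)
print((vecs[:, 1] > 0).astype(int))   # should flip at the gap between the two groups

# Average association: eigenvector of W with the largest eigenvalue. It is
# positive everywhere, so it grades points by how strongly connected they are
# rather than giving a clean sign split, consistent with the bias noted above.
_, vecs_w = np.linalg.eigh(W)
print(np.round(np.abs(vecs_w[:, -1]), 2))
```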

Other methods: Scott and Longuet-Higgins (1990)
- Let V contain the first eigenvectors of W (as columns). Normalize V by rows, then compute Q = VVᵀ (a sketch follows below).
- Values of Q close to 1 belong to the same cluster.
- [Figure: data, affinity matrix W, first and second eigenvectors, and the resulting Q.]

Other applications: Costeira and Kanade (1995)
- Used to segment points in motion. Compute M = (X Y); the affinity matrix is computed as W = MᵀM. This trick computes the affinity of every pair of points as an inner product.
- Compute Q as above; values close to 1 belong to the same cluster.
- [Figure: data, measurement matrix M, and the resulting Q.]
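A hedged sketch of the Q-matrix construction, following the recipe above; the number of eigenvectors k is an assumed parameter.

```python
import numpy as np

def q_matrix(W, k):
    """Top-k eigenvectors of W as columns, rows normalized to unit length,
    then Q = V V^T (entries near 1 indicate points in the same cluster)."""
    vals, vecs = np.linalg.eigh(W)
    V = vecs[:, np.argsort(vals)[::-1][:k]]           # top-k eigenvectors as columns
    V = V / np.linalg.norm(V, axis=1, keepdims=True)  # normalize each row
    return V @ V.T

# Toy usage: two loosely coupled groups -> Q is close to a 0/1 block matrix.
A = np.ones((3, 3)) - np.eye(3)
W = np.block([[A, 0.05 * np.ones((3, 3))],
              [0.05 * np.ones((3, 3)), A]])
print(np.round(q_matrix(W, k=2), 2))
```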

Other applications: face clustering in meetings
- Grab faces from video in real time (using a face detector and a face tracker).
- Compare all faces using a distance metric (e.g. the projection error onto a representative basis).
- Use normalized cuts to find the best clustering.

Conclusions
- Good news: simple and powerful methods to segment images; flexible and easy to apply to other clustering problems.
- Bad news: high memory requirements (use sparse matrices); very dependent on the scale factor σ for a specific problem.

Thank you! The End!

Examples: spectral clustering
(Images from Matthew Brand, TR-2002-42)
- Spectral clustering makes use of the spectrum of the similarity matrix of the data to cluster the points: clustering is solved on the affinity matrix W, where w(i, j) encodes the similarity (distance) between nodes i and j.
