Network Embedding and Graph Convolutional Neural Networks in Practice

Network Embedding and Graph Convolutional Neural Networks in Practice
Technical innovation, transforming the future.

Network / graph data
- A graph is a general, comprehensive, and complex representation of data.
- Networks are everywhere: social networks, biological networks, financial networks, the Internet of Things, information networks, logistics networks.

Why do networks matter?
- We rarely care only about the data items themselves; the relations among them matter just as much.
- The slides illustrate this with two targets: image characterization (reflected by relational subjects) and social capital (decided by relational subjects).

Network data is unfriendly to machine-learning models
- G = (V, E): what we observe are links and topology.
- Inapplicability of classical ML methods to raw topology.
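The mismatch between raw topology and feature-based ML can be made concrete. Below is a small sketch (the toy graph is ours, not from the slides) of G = (V, E) stored as an adjacency matrix, the representation classical methods cannot consume directly:

```python
import numpy as np

# A toy undirected graph G = (V, E) with 4 nodes and 4 edges,
# stored as an adjacency matrix (hypothetical example).
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0  # undirected: symmetric entries

# The matrix grows with |V| and its row order is arbitrary, so a row is
# not a fixed-length, permutation-stable feature vector per node --
# exactly why embedding nodes into a vector space is attractive.
print(A.sum())  # 8.0: each of the 4 edges fills two symmetric entries
```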

Pipeline for network analysis
- Network data → feature extraction → pattern discovery → network applications.
- Learnability: learning from networks, via network embedding and GNNs.
- Network embedding maps G = (V, E) into a vector space, producing one vector per node in V.
- The vectors are easy to parallelize over, and classical ML methods become applicable.

The goal of network embedding
- Support network inference in vector space.
- Reflect the network structure, e.g., transitivity (illustrated in the slides with three nodes A, B, C).
- Maintain network properties.

Graph neural networks (GNN)
- F. Scarselli, et al. The graph neural network model. IEEE TNN, 2009.
- A learning framework defined on graph topology.
- Basic idea: a recursive definition of node states; PageRank is a simple example.
- Main idea: pass messages between pairs of nodes and aggregate them.

Graph convolutional networks (GCN)
- T. N. Kipf and M. Welling. Semi-supervised classification with graph convolutional networks. ICLR, 2017.
- Stacking multiple layers like standard CNNs yields state-of-the-art results on node classification.

A brief history of GNNs (shown as a timeline figure in the original slides).

Network embedding vs. graph neural networks
- Network embedding: input is the graph; the model outputs an embedding, from which task results are derived. Topology to vector; unsupervised.
- GCN: input is the graph together with node features; the model outputs task results directly. Fusion of topology and features; (semi-)supervised.

GCN vs. network embedding: graphs vs. networks
- In some sense, they are different.
- Graphs exist in mathematics (a data structure): mathematical structures used to model pairwise relations between objects.
- Networks exist in the real world (data): social networks, logistics networks, biology networks, transaction networks, etc.
- A network can be represented by a graph; a dataset that is not a network can also be represented by a graph.

GCNs in natural language processing
- Many papers combine BERT + GNN.
- BERT is for retrieval: it creates an initial graph of relevant entities and the initial evidence.
- GNN is for reasoning: it collects evidence (i.e., old messages on the entities) and arrives at new conclusions (i.e., new messages on the entities) by passing the messages around and aggregating them.
- Cognitive Graph for Multi-Hop Reading Comprehension at Scale. Ding et al., ACL 2019.
- Dynamically Fused Graph Network for Multi-hop Reasoning. Xiao et al., ACL 2019.

GCNs in computer vision
- A popular trend in CV is to construct a graph during the learning process, in order to handle multiple objects or parts in a scene and to infer their relationships. Example: scene graphs.
- Scene Graph Generation by Iterative Message Passing. Xu et al., CVPR 2017.
- Image Generation from Scene Graphs. Johnson et al., CVPR 2018.

GCNs in symbolic reasoning
- The process of symbolic reasoning can be viewed as a directed acyclic graph, and many recent efforts use GNNs to perform it.
- Learning by Abstraction: The Neural State Machine. Hudson & Manning, 2019.
- Can Graph Neural Networks Help Logic Reasoning? Zhang et al., 2019.
- Symbolic Graph Reasoning Meets Convolutions. Liang et al., NeurIPS 2018.

GCNs in structural equation modeling
- Structural equation modeling, a form of causal modeling, describes the relationships between variables as a directed acyclic graph (DAG).
- A GNN can represent a nonlinear structural equation and help find the DAG, after treating the adjacency matrix as parameters.
- DAG-GNN: DAG Structure Learning with Graph Neural Networks. Yu et al., ICML 2019.

The pipeline of (most) GCN methods (shown as a diagram in the original slides).

Network embedding: vectorizing topology
- At several levels: co-occurrence (neighborhoods), high-order proximities, communities, and heterogeneous networks.
- The pipeline of (most) network embedding methods.
- Learning for networks (network embedding) vs. learning via graphs (GCN).

Core problems addressed by network embedding
- Reducing representation dimensionality while preserving necessary topological structures and properties: node neighborhoods, communities, pair-wise proximities, hyper-edges, global structure.
- Further challenges for nodes and links: non-transitivity, asymmetric transitivity, dynamics, uncertainty, heterogeneity, interpretability.
- In short: topology-driven.

Core problem addressed by GCNs
- Fusing topology and features, by smoothing the features with the assistance of the topology.
- In short: feature-driven.
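The GCN idea of smoothing features with the assistance of topology fits in a few lines. Below is a minimal NumPy sketch of one propagation layer in the spirit of Kipf & Welling; the toy graph, variable names, and identity weights are our own assumptions, not the authors' code:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: smooth node features H over the graph
    with the symmetrically normalized adjacency, transform, apply ReLU.
    Minimal NumPy sketch; not a reference implementation."""
    A_tilde = A + np.eye(A.shape[0])        # add self-loops: A + I
    d = A_tilde.sum(axis=1)                 # degrees of A + I
    D_inv_sqrt = np.diag(d ** -0.5)
    S = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # D^{-1/2} (A + I) D^{-1/2}
    return np.maximum(S @ H @ W, 0.0)       # ReLU(S H W)

# Toy path graph 0 - 1 - 2 with one-hot features and identity weights,
# so the output equals the smoothing operator S itself.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
out = gcn_layer(A, np.eye(3), np.eye(3))
```

With one-hot features and identity weights the layer reduces to the smoothing operator, which makes the "feature-driven" nature of the method easy to see: all the learning capacity sits in W, while S only mixes neighboring rows.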

What if the problem is topology-driven?
- Since GCN filters features, it is inevitably feature-driven; the structure only provides auxiliary information (e.g., for filtering/smoothing).
- When the features play the key role, GNNs perform well. What about the contrary?
- Synthetic data: a stochastic block model with random features. Node-classification results:

  Method     Result
  Random     10.0
  GCN        18.3 ± 1.1
  DeepWalk   99.0 ± 0.1

Network embedding vs. graph neural networks
- There is no better one, but there is a more proper one for a given problem.

Rethinking: are GNNs really deep learning methods?
- Recall the GCN formulation: $H^{(l+1)} = \sigma(\hat{S} H^{(l)} W^{(l)})$ with $\hat{S} = \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2}$.
- Remove the non-linear component: $H^{(l+1)} = \hat{S} H^{(l)} W^{(l)}$.
- Stack multiple layers and add a softmax classifier: $\hat{Y} = \mathrm{softmax}(\hat{S} H^{(K-1)} W^{(K-1)}) = \mathrm{softmax}(\hat{S}^{K} X W^{(0)} \cdots W^{(K-1)}) = \mathrm{softmax}(\hat{S}^{K} X W)$, i.e., a high-order proximity.
- Wu, Felix, et al. Simplifying graph convolutional networks. ICML, 2019.
- This simplified GCN (SGC) shows remarkable results on node classification and text classification.
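With the non-linearities removed, the K-layer model collapses to a single precomputed transform followed by a linear softmax classifier. A minimal sketch of that SGC-style precomputation (the toy two-node graph is our own example; the actual experiments in Wu et al. use standard benchmarks):

```python
import numpy as np

def sgc_features(A, X, K=2):
    """Precompute S^K X as in SGC (Wu et al., ICML 2019): with the
    non-linearities dropped, K propagation rounds reduce to one fixed
    feature transform, after which a plain softmax (logistic
    regression) classifier is trained. Sketch only."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(d ** -0.5)
    S = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # D^{-1/2} (A + I) D^{-1/2}
    H = X
    for _ in range(K):                      # H <- S H, applied K times
        H = S @ H
    return H                                # equals S^K X

# Two mutually linked nodes with one-hot features: after smoothing,
# both rows converge to the same mixed representation.
A = np.array([[0., 1.],
              [1., 0.]])
Z = sgc_features(A, np.eye(2), K=2)
```

Because S^K X involves no trainable parameters, it can be computed once up front, which is why SGC trains as fast as logistic regression while matching GCN accuracy on many tasks.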

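The DeepWalk result in the table above rests on truncated random walks over the topology, later fed to a skip-gram model. A minimal walk generator might look like the following (illustrative only: the walk counts, lengths, and toy graph are our assumptions, and the skip-gram training step is omitted):

```python
import random

def random_walks(adj, num_walks=10, walk_len=5, seed=0):
    """Generate truncated random walks over an adjacency-list graph --
    the corpus a DeepWalk-style method feeds to skip-gram. Illustrative
    sketch; parameter choices are arbitrary."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in adj:            # one walk per node per round
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:         # dead end: stop early
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

# Toy star graph: node 0 connected to nodes 1 and 2.
adj = {0: [1, 2], 1: [0], 2: [0]}
walks = random_walks(adj)
```

Since the walks depend only on the edge structure, the resulting embeddings are purely topology-driven, which is consistent with DeepWalk's strong showing on the stochastic-block-model data above.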