Huawei Shen (shenhuawei@)
Institute of Computing Technology, Chinese Academy of Sciences
2021.1.14

# Graph Neural Networks

## Outline

- Brief review of graph neural networks
- Several research directions: expressive power, the over-smoothing problem
- Applications of graph neural networks

## A brief review of graph neural networks

### Convolutional Neural Networks

- Convolutional neural networks (CNNs) gained great success on Euclidean data, e.g., image, text, audio, and video: image classification, object detection, machine translation.
- The power of CNNs lies in their ability to learn local stationary structures, via localized convolution filters, and to compose them into multi-scale hierarchical patterns.

M. M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst. Geometric deep learning: going beyond Euclidean data. IEEE Signal Processing Magazine, 18-42, 2017.

(Figures: temporal convolutional network; convolutional neural networks on images.)

- Localized convolutional filters are translation- (shift-) invariant: they recognize identical features independently of their spatial locations.
- An interesting problem is how to generalize convolution to non-Euclidean domains, e.g., graphs. The irregular structure of graphs poses challenges for defining convolution.

LeCun, Y., Bengio, Y., and Hinton, G. Deep learning. Nature, 521(7553):436, 2015.
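The localized, shift-invariant filtering described above can be sketched with a tiny NumPy example (the helper name `conv2d_valid` and the toy image are ours, not from the slides):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide a small localized filter over the image ('valid' mode, no padding)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # correlation: elementwise product of the local patch with the filter
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# The same local pattern produces the same responses wherever it appears:
image = np.zeros((8, 8))
image[2, 2] = 1.0                                  # a bright spot
shifted = np.roll(image, (3, 3), axis=(0, 1))      # the same spot, shifted

k = np.ones((3, 3))
r1 = conv2d_valid(image, k)
r2 = conv2d_valid(shifted, k)
# The response map moves with the input, but its values are identical:
assert sorted(r1.ravel()) == sorted(r2.ravel())
```

This shift invariance is exactly the property that has no direct analogue on an irregular graph, which motivates the rest of the talk.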
(Figure: template matching with an X-shape filter.)

### From CNN to graph CNN

- Convolution is well defined on Euclidean, grid-like data.
- It is not straightforward to define convolution on irregular networks, which are widely observed in the real world.

(Figures: grid-like network vs. irregular networks.)

## Preliminaries for GNNs
### Existing methods to define graph convolution

- Spectral methods: define convolution in the spectral domain, via the graph Fourier transform and the convolution theorem. The main challenge is that a convolution filter defined in the spectral domain is not localized in the vertex domain.
- Spatial methods: define convolution in the vertex domain, as a weighted average over the vertices in the neighborhood of the target vertex. The main challenge is that the size of the neighborhood varies remarkably across nodes, e.g., under a power-law degree distribution.

## Spectral methods for graph convolutional neural networks
### Graph Fourier transform

### Defining convolution in the spectral domain

- Step 1: graph Fourier transform
- Step 2: convolution in the spectral domain
- Step 3: inverse graph Fourier transform

### Spectral Graph CNN

J. Bruna, W. Zaremba, A. Szlam, and Y. LeCun. Spectral networks and locally connected networks on graphs. ICLR, 2014.
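The three-step spectral convolution (transform, filter, inverse transform) can be sketched in NumPy. The example graph, the filter `g`, and all variable names below are illustrative assumptions, not the paper's code:

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # adjacency of a 4-node graph
D = np.diag(A.sum(axis=1))
L = D - A                                   # combinatorial graph Laplacian

# Graph Fourier basis: eigenvectors U of L (L = U diag(lam) U^T)
lam, U = np.linalg.eigh(L)

x = np.array([1.0, 2.0, 3.0, 4.0])          # a signal on the 4 nodes

# Step 1: graph Fourier transform
x_hat = U.T @ x
# Step 2: filtering in the spectral domain -- in Spectral Graph CNN the filter
# is a free parameter per eigenvalue; here we use a fixed low-pass example
g = np.exp(-lam)
y_hat = g * x_hat
# Step 3: inverse graph Fourier transform
y = U @ y_hat

# Sanity check: an all-pass filter (g = 1) recovers the input exactly
assert np.allclose(U @ (np.ones(4) * (U.T @ x)), x)
```

Note that learning one free parameter per eigenvalue is exactly what makes the filter global (not localized) and tied to the eigendecomposition, the shortcomings addressed next.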
### Shortcomings of Spectral Graph CNN
### ChebyNet: parameterizing the filter

Parameterize the convolution filter via a truncated (Chebyshev) polynomial approximation.
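The polynomial parameterization makes the filter K-localized and avoids the eigendecomposition at filtering time; only matrix-vector products with the rescaled Laplacian are needed. A minimal sketch (example graph and names are ours):

```python
import numpy as np

def cheby_conv(L, x, theta):
    """Apply the K-term Chebyshev filter sum_k theta[k] * T_k(L_tilde) @ x,
    with T_0 = I, T_1 = L_tilde, T_k = 2 L_tilde T_{k-1} - T_{k-2}."""
    n = L.shape[0]
    lam_max = np.linalg.eigvalsh(L).max()      # in practice approximated cheaply
    L_tilde = 2.0 * L / lam_max - np.eye(n)    # rescale spectrum into [-1, 1]
    Tx = [x, L_tilde @ x]                      # T_0 x and T_1 x
    for _ in range(2, len(theta)):
        Tx.append(2.0 * L_tilde @ Tx[-1] - Tx[-2])
    return sum(t * txk for t, txk in zip(theta, Tx))

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(1)) - A
x = np.array([1.0, -1.0, 2.0, 0.5])
y = cheby_conv(L, x, theta=[0.5, 0.3, 0.2])    # K = 3: a 2-hop localized filter
```

Each extra polynomial term extends the filter's support by one hop, so the number of parameters is K, independent of the graph size.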
M. Defferrard, X. Bresson, P. Vandergheynst. Convolutional neural networks on graphs with fast localized spectral filtering. NeurIPS, 2016.

### ChebyNet vs. Spectral Graph CNN

Is this method good enough? What more could we do?

### GWNN: graph wavelet neural network
### Fourier vs. wavelet bases

- Fourier basis: dense, not localized, high computational cost.
- Wavelet basis: sparse, localized, low computational cost.

B. Xu, H. Shen, Q. Cao, Y. Qiu, X. Cheng. Graph wavelet neural network. ICLR 2019.

### GWNN: graph wavelet neural network

Graph convolution via the wavelet transform: replace the Fourier basis with the wavelet basis.
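One way to realize the wavelet basis is via a heat kernel on the graph. The following is a simplified sketch in the spirit of GWNN, not the paper's exact construction (GWNN approximates the wavelets efficiently rather than computing the full eigendecomposition); all names are ours:

```python
import numpy as np

A = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)   # a 5-node path graph
L = np.diag(A.sum(1)) - A
lam, U = np.linalg.eigh(L)

s = 1.0                                        # scale of the heat kernel
psi = U @ np.diag(np.exp(-s * lam)) @ U.T      # wavelet basis psi_s
psi_inv = U @ np.diag(np.exp(s * lam)) @ U.T   # its inverse, psi_{-s}

x = np.array([0.0, 0.0, 1.0, 0.0, 0.0])        # impulse at the middle node
w = np.array([1.0, 0.5, 0.5, 0.2, 0.2])        # diagonal filter

# GWNN-style convolution: transform with psi_inv, filter, transform back
y = psi @ (w * (psi_inv @ x))

# Unlike Fourier eigenvectors, wavelet atoms decay away from their center node:
atom = np.abs(psi[:, 2])
assert atom[2] > atom[0] and atom[2] > atom[4]
```

The locality of each wavelet atom is what makes the basis sparse and the resulting convolution interpretable in the vertex domain.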
### GWNN vs. ChebyNet

Benchmark datasets; results on the task of node classification.

## Spatial methods for graph convolutional neural networks

### Spatial methods: by analogy

What can we learn from the architecture of a standard convolutional neural network?

1. Determine the neighborhood.
2. Impose an order within the neighborhood.
3. Share parameters.

For each node, select a fixed number of nodes as its neighbors according to some proximity metric, impose an order according to that metric, and share parameters across neighborhoods.

M. Niepert, M. Ahmed, K. Kutzkov. Learning Convolutional Neural Networks for Graphs. ICML, 2016.

### GraphSAGE: inductive learning

- Sample neighbors, then aggregate neighbors.
- General framework of graph neural networks: aggregate the information of neighboring nodes to update the representation of the center node.

W. L. Hamilton, R. Ying, J. Leskovec. Inductive Representation Learning on Large Graphs. NeurIPS 2017.

### GCN: graph convolutional network

- Aggregate information from the neighborhood via a normalized Laplacian matrix.
- The shared parameters come from the feature transformation.

T. N. Kipf and M. Welling. Semi-supervised classification with graph convolutional networks. ICLR 2017.

### GAT: graph attention network

- Learn the aggregation matrix (the Laplacian matrix in GCN) via an attention mechanism, i.e., a weighted mean pooling.
- The shared parameters contain two parts: parameters for feature transformation and parameters for attention.

P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, Y. Bengio. Graph Attention Networks. ICLR 2018.

### MoNet: a general framework for spatial methods

- Define multiple kernel functions, parameterized or not, to measure the similarity between the target node and other nodes.
- The convolution kernels are the weights of these kernel functions.

F. Monti, D. Boscaini, J. Masci, E. Rodola, J. Svoboda, M. M. Bronstein. Geometric deep learning on graphs and manifolds using mixture model CNNs. CVPR 2017.

### Spectral methods vs. spatial methods

- Connection: spectral methods are special cases of spatial methods.
- Difference: spectral methods define kernel functions via an explicit space transformation, i.e., projection into spectral space; spatial methods define kernel functions directly.
- Kernel function: characterizes the similarity or distance among nodes.

### Spectral methods: recap

Spectral CNN, ChebyNet, GCN.
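The aggregate-and-transform pattern shared by these spatial methods can be illustrated with a single GCN-style layer (a minimal NumPy sketch under our own naming; real implementations use sparse matrices and trainable weights):

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN propagation step: relu(D^{-1/2} (A + I) D^{-1/2} X W)."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                     # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt       # normalized aggregation matrix
    return np.maximum(S @ X @ W, 0.0)         # aggregate, transform, ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)        # a 3-node star graph
X = rng.normal(size=(3, 4))                   # node features
W = rng.normal(size=(4, 2))                   # shared feature transformation
H = gcn_layer(A, X, W)                        # new node representations, 3 x 2
```

GAT replaces the fixed matrix `S` with attention weights computed from the node features, while keeping the same aggregate-then-transform structure.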
## Framework of graph neural networks

### Graph pooling

- Graph pooling via graph coarsening: merge nodes into clusters and take each cluster as a super node. Node merging can be done a priori or during the training of the graph convolutional network, e.g., DiffPool.
  Ying, R., You, J., Morris, C., Ren, X., Hamilton, W. L., and Leskovec, J. Hierarchical graph representation learning with differentiable pooling. NeurIPS 2018.
- Graph pooling via node selection: learn a metric to quantify the importance of nodes and select nodes according to the learned metric.
  J. Lee, I. Lee, J. Kang. Self-Attention Graph Pooling. ICML 2019.

## Expressive power of GNNs

- Graph neural networks (GNNs) gained remarkable success, achieving state-of-the-art empirical performance in node classification, link prediction, and graph classification.
- The design of new GNNs is mostly based on empirical intuition, heuristics, and experimental trial-and-error.
- We lack theoretical understanding of the properties and limitations of GNNs. One fundamental problem is the expressive power of GNNs.

### About expressive power
K. Hornik, M. Stinchcombe, H. White. Multilayer feedforward networks are universal approximators. Neural Networks, 2(5):359-366, 1989.
M. Raghu, B. Poole, J. Kleinberg, S. Ganguli, J. S. Dickstein. On the Expressive Power of Deep Neural Networks. ICML, 2017.

### GNNs for node classification

- Example: a 1-layer GCN has limited expressive power; it cannot fully distinguish all nodes.
- Example: a 2-layer GCN can fully distinguish all nodes in this toy example. The depth of GNNs matters.
- Can the expressive power of GNNs be improved indefinitely by increasing their depth? Is there a bound?

R. Sato. A Survey on the Expressive Power of Graph Neural Networks. arXiv:2003.04078, 2020.

### Weisfeiler-Lehman isomorphism test

The WL test is widely used to judge whether two graphs, labeled or unlabeled, are topologically identical, or how topologically similar they are (example: similarity measurement).

### WL test: subtree-based graph kernel
For the WL test, graph features are essentially counts of the different rooted subtrees in the graph.

N. Shervashidze, P. Schweitzer, E. J. van Leeuwen, K. Mehlhorn, K. M. Borgwardt. Weisfeiler-Lehman Graph Kernels. JMLR, 2011.

### Connection between GNNs and the WL test

- The WL test provides an upper bound for the expressive power of aggregation-based GNNs.
- A maximally powerful GNN never maps two different neighborhoods to the same representation, i.e., its aggregation function over multisets is injective.

K. Xu, W. Hu, J. Leskovec, S. Jegelka. How powerful are graph neural networks? ICLR, 2019.
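The WL color-refinement loop can be sketched in a few lines of Python. This is a toy version under our own naming: a full test shares the relabeling table across the two graphs being compared, whereas here each graph is refined independently and the color histograms are compared:

```python
def wl_colors(adj, rounds=3):
    """1-WL color refinement: repeatedly relabel each node by the pair
    (own color, sorted multiset of neighbor colors). Returns the sorted
    multiset of final colors."""
    n = len(adj)
    colors = [0] * n                       # unlabeled graph: uniform start
    for _ in range(rounds):
        signatures = [(colors[v], tuple(sorted(colors[u] for u in adj[v])))
                      for v in range(n)]
        # compress: identical signatures get identical new colors
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures)))}
        colors = [palette[sig] for sig in signatures]
    return sorted(colors)

# A 4-node path and a 4-node star are distinguished after one round:
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
assert wl_colors(path) != wl_colors(star)
```

The number of times each final color occurs corresponds to the rooted-subtree counts used by the WL subtree kernel.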
For popular GNNs, like GCN and GraphSAGE, the aggregation functions are inherently not injective.

### Graph isomorphism network (GIN)

Basic idea: compose a universal, injective aggregation function over a node and the multiset of its neighbors.

K. Xu, W. Hu, J. Leskovec, S. Jegelka. How powerful are graph neural networks? ICLR, 2019.
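GIN's injective sum aggregation followed by an MLP can be sketched as follows (a minimal NumPy version; the graph, weight shapes, and names are illustrative assumptions):

```python
import numpy as np

def gin_layer(A, H, W1, W2, eps=0.0):
    """One GIN layer: h_v <- MLP((1 + eps) * h_v + sum_{u in N(v)} h_u).

    Sum pooling keeps different multisets of neighbor features distinguishable
    (injectivity), while the MLP provides universal approximation.
    """
    agg = (1.0 + eps) * H + A @ H            # injective sum aggregation
    hidden = np.maximum(agg @ W1, 0.0)       # 2-layer MLP with ReLU
    return hidden @ W2

rng = np.random.default_rng(1)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
H = rng.normal(size=(3, 4))
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 4))
H_next = gin_layer(A, H, W1, W2, eps=0.1)    # updated representations, 3 x 4
```

By contrast, mean aggregation maps the multisets {a} and {a, a} to the same value, which is one concrete way GCN/GraphSAGE lose injectivity.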
- A multi-layer perceptron offers a universal approximator.
- Sum pooling offers the injectivity condition.

### Principal neighborhood aggregation (PNA)

- Define multiple aggregation functions to improve the expressive power of GNNs.
- An injective aggregation function in each layer is not always necessary for high expressive power.

G. Corso, L. Cavalleri, D. Beaini, P. Lio, P. Velickovic. Principal neighbourhood aggregation for graph nets. NeurIPS 2020.

### Experimental validation

Validating expressive power (representation capability). Metric: training accuracy. Task: graph classification. Datasets: bioinformatics and social networks.

### Performance on graph classification

- Does the high expressive power of a GNN imply good performance on downstream tasks, e.g., graph classification? Metric: test accuracy.
- High expressive power does not always bring good performance. Note, however, that low expressive power always implies bad performance.

K. Xu, W. Hu, J. Leskovec, S. Jegelka. How powerful are graph neural networks? ICLR, 2019.

### Is expressive power necessary?

- Expressive power offers a theoretical guide for understanding the capability of GNNs.
- Are GNNs universal approximators of functions mapping graphs to real-valued vectors, and to what degree? For graph classification, no; for node classification, almost.
- For specific tasks, it is not practically necessary to seek high expressive power for performance improvement. This partly explains why GCN and GraphSAGE work well although their expressive power is lower than GIN's.
- What we really need is a function that maps similar objects (nodes or graphs) to close representations, facilitating downstream tasks.

## The over-smoothing issue

- Over-smoothing phenomenon in GCN: why do deeper GNNs perform worse than their shallow counterparts?
- The over-smoothing issue suffered by GCN is rooted in its cross-layer shared, hard-coded aggregation matrix.
- GAT, which uses self-attention to determine the aggregation matrix, has no intrinsic over-smoothing issue, especially multi-head GAT.
- The over-smoothing issue is confounded with the over-fitting issue.

Y. Rong, W. Huang, T. Xu, J. Huang. DropEdge: Towards Deep Graph Convolutional Networks on Node Classification. ICLR 2020.

## Applications

Three major scenarios:

- Node level. Node classification: predict the labels of nodes from a few labeled nodes and the graph structure. Link prediction: predict the existence or occurrence of links between pairs of nodes.
- Graph level. Graph classification: predict the label of a graph by learning a graph representation with a graph CNN.
- Signal level. Signal classification, analogous to image classification, which is the signal-level scenario on a grid-like network.

### Application examples

- Recommendation:
  R. Ying, R. He, K. Chen, P. Eksombatchai, W. L. Hamilton, J. Leskovec. Graph Convolutional Neural Networks for Web-Scale Recommender Systems. KDD 2018.
  X. He, K. Deng, X. Wang, Y. Li, Y. Zhang, M. Wang. LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation. SIGIR 2020.
- Text classification:
  L. Yao, C. Mao, Y. Luo. Graph Convolutional Networks for Text Classification. AAAI 2019.
- Knowledge graphs.

### Applications in other fields

- GNNs for quantum chemistry: predict the quantum properties of molecules using GNNs. Traditional methods, e.g., DFT (Density Functional Theory), are computationally expensive.