Towards Efficient Temporal Graph Learning: Algorithms, Frameworks, and Tools
Ruijie Wang, Wanyu Zhao, Dachun Sun, Charith Mendis, Tarek Abdelzaher
University of Illinois Urbana-Champaign
{ruijiew2,wanyu2,dsun18,charithm,zaher}@I
Time: 1:45 PM - 5:30 PM, October 21, 2024
Location: Room 120C, Boise Centre, Boise, ID
Webpage:
https://wjerry5.github.io/cikm2024-tutorial/
Contents
• Part I - Introduction
• Part II - Data-Efficient Temporal Graph Neural Network
• 30-min Coffee Break
• Part III - Resource-Efficient Temporal Graph Neural Network
• Part IV - Discussion and Future Directions
Broad Application Domains of Graph Data
Social Network Analysis | Knowledge Graph Reasoning | Web Mining
Recommendation | Scientific Discovery | LLM Prompting & Reasoning
Universal language for describing interconnected data!
Real-World Graphs are Evolving – Temporal Graphs
Temporal Facts in KGs | Molecular Dynamics | User Online Behaviors | Dynamical Systems
Real-World Graphs are Evolving – Temporal Graphs
o Graphs have time-evolving components, e.g.,
  o Topology structures
    o Add/delete nodes
    o Add/delete edges
  o Input features
    o Node-level features
    o Edge-level features
  o ...
Dynamic edges [1] | Dynamic node set [2] | Dynamic node & edge features [3]
[1] https://towardsdatascience.com/temporal-graph-networks-ab8f327f2efe
[2] Wang et al., Learning to Sample and Aggregate: Few-shot Reasoning over Temporal Knowledge Graphs.
[3] Thomas Kipf, Ethan Fetaya, Kuan-Chieh Wang, Max Welling, and Richard Zemel. Neural Relational Inference for Interacting Systems.
Temporal Graphs – Definition
o Discrete-time vs. continuous-time temporal graphs
  o Discrete-time temporal graphs
    o G = {G_1, ..., G_{T-1}, G_T},
    o where G_t = (E_t, V_t, X_t) denotes one snapshot. (Discrete-time example [1])
  o Continuous-time temporal graphs
    o G = {(e_i, e_j, t, +/-)},
    o where e_i, e_j ∈ V, 0 ≤ t < T. (Continuous-time example [2])
How to enable deep learning on temporal graphs?
[1] Fu et al., Natural and Artificial Dynamics in GNNs.
[2] Cong et al., On the Generalization Capability of Temporal Graph Learning Algorithms: Theoretical Insights and a Simpler Method.
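As a minimal, purely illustrative sketch (not part of the tutorial; all names are ours), the two definitions above can be stored in Python as a chronologically ordered event stream for the continuous-time case, with discrete-time snapshots obtained by bucketing that stream:

    from dataclasses import dataclass

    @dataclass
    class Event:
        src: int      # e_i
        dst: int      # e_j
        t: float      # timestamp, 0 <= t < T
        sign: int     # +1 = edge added, -1 = edge deleted

    def to_snapshots(events, num_snapshots, T):
        """Bucket a continuous-time event stream into discrete snapshots G_1 .. G_K."""
        snapshots = [set() for _ in range(num_snapshots)]
        live_edges, k, width = set(), 0, T / num_snapshots
        for e in sorted(events, key=lambda ev: ev.t):
            while e.t >= (k + 1) * width and k < num_snapshots - 1:
                snapshots[k] = set(live_edges)   # freeze the edge set at the end of bucket k
                k += 1
            if e.sign > 0:
                live_edges.add((e.src, e.dst))
            else:
                live_edges.discard((e.src, e.dst))
        for j in range(k, num_snapshots):
            snapshots[j] = set(live_edges)
        return snapshots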
Graph Neural Networks (GNNs)
o Nodes have representations at each layer, where layer-0 representations are the input features x.
o Basic operations: Sample & Aggregate + Update (see the sketch below)
Step 1: Sample & Aggregate
Combine messages from neighbors
Step 2: Update
Apply neural networks
[1] William L. Hamilton and Jian Tang. "Graph Representation Learning". Tutorial at AAAI 2019.
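A minimal sketch of these two operations (our illustration, assuming mean aggregation in the style of GraphSAGE and PyTorch; not the exact formulation on the slide):

    import torch
    import torch.nn as nn

    class MeanAggLayer(nn.Module):
        """One GNN layer: aggregate messages from (sampled) neighbors, then update with a neural network."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.update = nn.Linear(2 * in_dim, out_dim)

        def forward(self, h, neighbors):
            # h: [N, in_dim] node representations from the previous layer
            # neighbors: dict {node id: list of (sampled) neighbor ids}
            agg = []
            for v in range(h.size(0)):
                nbrs = neighbors[v]
                agg.append(h[nbrs].mean(dim=0) if nbrs else torch.zeros_like(h[0]))
            agg = torch.stack(agg)                                        # Step 1: sample & aggregate
            return torch.relu(self.update(torch.cat([h, agg], dim=-1)))  # Step 2: update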
Temporal Graph Neural Networks (TGNNs)
o Nodes have representations at each layer, where layer-0 representations are the input features x.
o New operation designs: Sample & Aggregate + Update
Step 1: Sample & Aggregate
Combine messages from neighbors
Step 2: Update
Apply neural networks
[1] William L. Hamilton and Jian Tang. "Graph Representation Learning". Tutorial at AAAI 2019.
Temporal Graph Neural Networks (TGNNs)
o Categories of TGNNs
  o TGNN with RNN, e.g., JODIE [1]
  o TGNN with self-attention, e.g., DySAT [2], TGAT [3]
  o TGNN with memory
  o TGNN with memory & self-attention, e.g., TGN [4]
  o ...
[1] Kumar et al., JODIE: Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks.
[2] Sankar et al., DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks.
[3] Xu et al., Inductive Representation Learning on Temporal Graphs.
[4] Rossi et al., Temporal Graph Networks for Deep Learning on Dynamic Graphs.
Training and Inference Pipeline of TGNNs
o Representation learning + task-related optimization.
[Figure: Input Graph → TGNNs → Time-Evolving Embeddings → Prediction Head → Predictions; Predictions and Labels feed a Loss Function used for training and evaluation.]
[1] Cong et al., Do We Really Need Complicated Model Architectures For Temporal Networks?
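A schematic training step matching the pipeline above (our sketch: PyTorch, a link-prediction head, and hypothetical tgnn / pred_head modules and batch fields; not the tutorial's code):

    import torch.nn.functional as F

    def train_step(tgnn, pred_head, optimizer, batch):
        optimizer.zero_grad()
        z = tgnn(batch.graph, batch.timestamps)          # time-evolving embeddings
        logits = pred_head(z[batch.src], z[batch.dst])   # prediction head, e.g., link scores
        loss = F.binary_cross_entropy_with_logits(logits, batch.labels)
        loss.backward()                                  # task-related optimization
        optimizer.step()
        return loss.item()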
Data-Efficiency Issue of TGNNs
o Training TGNNs requires relatively abundant labeled data.
o Insufficient labeled data in real-world applications:
  o Indirect labels
  o Scarcity of task-specific labels
  o Limited labels for new tasks/distributions
[Figure: the training pipeline from the previous slide, with Labels feeding the Loss Function.]
Computation Pipeline of TGNNs
o Training computation
o Inference computation
[Figure: pipeline stages include Neighbor Sampling, Feature Fetching, Model Computation, Feature Update, and Inference.]
Resource-Efficiency Issue of TGNNs
o Fast growth of temporal graphs vs. limited resources.
[Figure: the same computation stages (Neighbor Sampling, Feature Fetching, Model Computation, Feature Update, Inference) under resource constraints.]
Scope of This Tutorial
[Figure: Efficient TGNN splits into Part II: Data-Efficient TGNN and Part III: Resource-Efficient TGNN.]
o We focus on algorithm design and optimization techniques to address the challenges posed by insufficient labels.
Scope of This Tutorial
[Figure: Efficient TGNN → Part II: Data-Efficient TGNN (Self-Supervised Learning, Weakly-Supervised Learning, Few-Shot Learning) and Part III: Resource-Efficient TGNN.]
Scope of This Tutorial
[Figure: Efficient TGNN → Part II: Data-Efficient TGNN (Self-Supervised Learning, Weakly-Supervised Learning, Few-Shot Learning) and Part III: Resource-Efficient TGNN.]
o We focus on system acceleration to enable large-scale training and inference with limited resources.
Scope of This Tutorial
[Figure: Efficient TGNN → Part II: Data-Efficient TGNN (Self-Supervised Learning, Weakly-Supervised Learning, Few-Shot Learning) and Part III: Resource-Efficient TGNN (Training Acceleration, Inference Acceleration, Distributed Training Acceleration).]
Contents
• Part I - Introduction
• Part II - Data-Efficient Temporal Graph Neural Network
• 30-min Coffee Break (15:30 - 16:00)
• Part III - Resource-Efficient Temporal Graph Neural Network
• Part IV - Discussion and Future Directions
Scope of This Tutorial
[Figure: Efficient TGNN → Part II: Data-Efficient TGNN (Self-Supervised Learning, Weakly-Supervised Learning, Few-Shot Learning) and Part III: Resource-Efficient TGNN (Training Acceleration, Inference Acceleration, Distributed Training Acceleration).]
Self-Supervised Learning on Temporal Graphs
• Introduction & Background
• Self-Supervision by Reconstruction
• Self-Supervision by Contrastive Approach
• Self-Supervision by Multiview Consistency
Introduction – Self-Supervised Learning (SSL)
o Learning useful representations without requiring labeled data.
o Relies on the inherent structure and temporal dynamics of the graph itself.
Inductive task: examine the inferred representations of unseen nodes by predicting future links between unseen nodes and classifying them based on their dynamically inferred embeddings.
Transductive task: examine node embeddings that have been observed in training, via the future link prediction task and node classification.
[Figure: interactions along a timeline t_1, t_2, ..., t_n.]
Challenges on Temporal Graphs
o Challenge 1: Node embeddings should also be a function of time.
o Challenge 2: Temporal constraints on neighborhood aggregation methods.
o Challenge 3: Possibly multiple node interactions.
[1] Da Xu, Chuanwei Ruan, Evren Korpeoglu, Sushant Kumar, Kannan Achan, Inductive Representation Learning on Temporal Graphs.
Background – Attention Mechanism on Graphs
o Query neighbors by keys derived from their representations, aggregating their values by the attention weights.
o Question: How to involve temporal information?
[1] Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, Yoshua Bengio, Graph Attention Networks.
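A compact sketch of neighborhood attention in the query/key/value form described above (our illustration, assuming a single head and scaled dot-product scores rather than GAT's exact LeakyReLU scoring):

    import math
    import torch

    def attend(q, K, V):
        """q: [d] query of the target node; K, V: [n, d] keys/values of its neighbors."""
        scores = (K @ q) / math.sqrt(q.size(-1))   # one score per neighbor
        alpha = torch.softmax(scores, dim=0)       # attention weights
        return alpha @ V                           # weighted aggregation of neighbor values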
Background – Time Encoding
o A form of "positional encoding" concatenated to the node representation.
o Generates a vector encoding given a real number.
o The encoding represents the time span rather than the absolute value of time (translation invariance):
  K(t_1, t_2) := ⟨Φ(t_1), Φ(t_2)⟩,  K(t_1 + c, t_2 + c) = K(t_1, t_2)
Using Bochner's Theorem and Monte Carlo approximation:
  K(t_1, t_2) ≈ (1/d) Σ_{i=1..d} [cos(ω_i t_1) cos(ω_i t_2) + sin(ω_i t_1) sin(ω_i t_2)]
[1] Da Xu, Chuanwei Ruan, Evren Korpeoglu, Sushant Kumar, Kannan Achan, Inductive Representation Learning on Temporal Graphs.
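A sketch of the functional time encoding implied by the approximation above (our illustration; the learnable frequencies, the normalization, and all names/dimensions are assumptions):

    import torch
    import torch.nn as nn

    class TimeEncoding(nn.Module):
        """Map a (relative) time t to a d-dimensional vector whose dot products approximate K."""
        def __init__(self, d):
            super().__init__()
            self.omega = nn.Parameter(torch.randn(d // 2))   # learnable frequencies omega_i

        def forward(self, t):
            # t: [batch] of time spans (e.g., current time minus interaction time)
            phase = t.unsqueeze(-1) * self.omega             # [batch, d/2]
            enc = torch.cat([torch.cos(phase), torch.sin(phase)], dim=-1)
            return enc / (self.omega.numel() ** 0.5)         # so that enc(t1) @ enc(t2) ~ K(t1, t2)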
Background – Temporal Subgraph Sampling
o Temporal subgraph sampling is key to batch-wise training and contrastive pair construction.
o Message-passing directions must align with the observed chronological order.
o Given a target number of nodes for the subgraph, candidates can be further weighted by structural or temporal importance (see the sketch below):
  o Degree, centrality, or PageRank
  o Time elapsed
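A minimal sketch of chronologically valid neighbor sampling with recency weighting (our illustration; the exponential decay form and the parameter names are assumptions):

    import numpy as np

    def sample_temporal_neighbors(history, query_time, k, decay=1.0, rng=np.random):
        """history: list of (neighbor_id, timestamp); only events before query_time are eligible."""
        valid = [(n, t) for (n, t) in history if t < query_time]   # respect chronological order
        if not valid:
            return []
        ids, times = zip(*valid)
        w = np.exp(-decay * (query_time - np.array(times)))        # weight recent interactions higher
        idx = rng.choice(len(ids), size=min(k, len(ids)), replace=False, p=w / w.sum())
        return [ids[i] for i in idx]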
Introduction – Self-Supervised Learning (SSL)
o SSL paradigms typically generate supervision signals through designed tasks:
  o Transductive future link reconstruction. Loss is based on cross-entropy.
  o Contrastive learning: learning from positive and negative pairs of examples. Loss is based on a similarity measure.
  o Multiview consistency: representations should be robust under perturbations and agree with each other. Loss is based on regularizations.
SSL by Reconstruction: TGAT
o Objective: Produce a time-aware representation for a target node at time point t.
o Motivation: Analogous to GraphSAGE or GAT, take the temporal neighborhood with hidden representations and timestamps, and aggregate.
o Method: A local aggregation operator using the attention mechanism.
o Link prediction loss (see the sketch below)
[1] Da Xu, Chuanwei Ruan, Evren Korpeoglu, Sushant Kumar, Kannan Achan, Inductive Representation Learning on Temporal Graphs.
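A sketch of a negative-sampling link prediction objective of the kind TGAT-style models optimize (our simplification over precomputed time-aware embeddings; the paper's exact formulation may differ):

    import torch
    import torch.nn.functional as F

    def link_prediction_loss(z_src, z_dst, z_neg):
        """z_src, z_dst: [B, d] embeddings of observed (src, dst, t) pairs; z_neg: [B, d] sampled negatives."""
        pos_score = (z_src * z_dst).sum(-1)   # score observed future links
        neg_score = (z_src * z_neg).sum(-1)   # score sampled non-links
        return (F.binary_cross_entropy_with_logits(pos_score, torch.ones_like(pos_score)) +
                F.binary_cross_entropy_with_logits(neg_score, torch.zeros_like(neg_score)))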
SSL by Reconstruction: TGAT
o Experiments: Transductive & inductive learning tasks for future link prediction.
[1] Da Xu, Chuanwei Ruan, Evren Korpeoglu, Sushant Kumar, Kannan Achan, Inductive Representation Learning on Temporal Graphs.
SSL by Reconstruction: Temporal Graph Networks (TGN)
o Motivation: Viewing dynamic graphs as sequences of timed events.
o Method: Five modules that process a dynamic graph as a series of node-wise events, interaction events, or deletion events, and save the node states to a memory (see the sketch below).
Raw messages → Messages → Aggregated messages → Memory
  m_i(t) = msg_s(s_i(t-), s_j(t-), Δt, e_ij(t))
  Message function: Identity, MLP
  Message aggregator: Most recent, Mean
  Memory updater: LSTM, GRU
[1] Emanuele Rossi, Ben Chamberlain, Fabrizio Frasca, Davide Eynard, Federico Monti, Michael Bronstein, Temporal Graph Networks for Deep Learning on Dynamic Graphs.
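A simplified sketch of the memory path above (assumptions: identity message function, one event processed at a time so the "most recent" aggregator is implicit, and a GRU memory updater; the full TGN batches raw messages and handles gradients through memory differently):

    import torch
    import torch.nn as nn

    class NodeMemory(nn.Module):
        def __init__(self, num_nodes, mem_dim, edge_dim):
            super().__init__()
            self.register_buffer("s", torch.zeros(num_nodes, mem_dim))       # node states s_i(t)
            self.updater = nn.GRUCell(2 * mem_dim + edge_dim + 1, mem_dim)   # memory updater

        def update(self, i, j, delta_t, e_ij):
            # identity message: both node states, the time gap Δt (scalar tensor), and edge features e_ij(t)
            m_i = torch.cat([self.s[i], self.s[j], delta_t.view(1), e_ij])
            new_state = self.updater(m_i.unsqueeze(0), self.s[i].unsqueeze(0)).squeeze(0)
            with torch.no_grad():
                self.s[i] = new_state    # persist the updated memory for node i
            return new_state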
SSL by Reconstruction: Temporal Graph Networks (TGN)
o Motivation: Viewing dynamic graphs as sequences of timed events.
o Method: Five modules that process a dynamic graph as a series of node-wise events, interaction events, or deletion events, and save the node states to a memory.
[Figure: New Batch → Memory → Node Embeddings → Edge Probabilities; embedding module options: Identity, Time Projection, Temporal Graph Attention, Temporal Graph Sum.]
[1] Emanuele Rossi, Ben Chamberlain, Fabrizio Frasca, Davide Eynard, Federico Monti, Michael Bronstein, Temporal Graph Networks for Deep Learning on Dynamic Graphs.
SSL by Reconstruction: Temporal Graph Networks (TGN)
o Experiments: Transductive & inductive learning tasks for future link prediction.
[1] Emanuele Rossi, Ben Chamberlain, Fabrizio Frasca, Davide Eynard, Federico Monti, Michael Bronstein, Temporal Graph Networks for Deep Learning on Dynamic Graphs.
SSL by Contrastive Learning: TGAT-CL
o Motivation: The node representation process is in general "smooth" over time.
o Method: Contrast the same node's representations over time.
  Q(t_x, t_y) = S(|t_x - t_y|)
[1] Sheng Tian, Ruofan Wu, Leilei Shi, Liang Zhu, Tao Xiong, Self-supervised Representation Learning on Dynamic Graphs.
SSL by Contrastive Learning: TGAT-CL
o Motivation: The node representation process is in general "smooth" over time.
o Method: Contrast the same node's representations over time.
o Challenge: Bias in negative example sampling.
  sim(x, y) = xᵀy,  S(t_x, t_y) = s(|t_x - t_y|)
o Contrastive loss (see the sketch below)
[1] Sheng Tian, Ruofan Wu, Leilei Shi, Liang Zhu, Tao Xiong, Self-supervised Representation Learning on Dynamic Graphs.
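A sketch of a temporal contrastive objective in the spirit described above (our simplification: an InfoNCE-style loss with a dot-product critic re-weighted by a time decay s(|t_x - t_y|); not the paper's exact loss):

    import torch

    def temporal_contrastive_loss(z_t1, z_t2, t1, t2, z_neg, tau=0.1, decay=1.0):
        """z_t1, z_t2: [d] views of the same node at times t1, t2 (scalar tensors); z_neg: [n, d] negatives."""
        s = torch.exp(-decay * (t1 - t2).abs())     # time-decay weight s(|t_x - t_y|)
        pos = s * (z_t1 @ z_t2) / tau               # weighted positive similarity
        neg = (z_neg @ z_t1) / tau                  # similarities to negative samples
        return -(pos - torch.logsumexp(torch.cat([pos.view(1), neg]), dim=0))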
SSL by Contrastive Learning: TGAT-CL
o Motivation: The node representation process is in general "smooth" over time.
o Method: Contrast the same node's representations over time.
o Challenge: Bias in negative example sampling.
o Debiased contrastive loss (see the sketch below), with class prior τ+ = p(c(x') = c(x)).
[1] Sheng Tian, Ruofan Wu, Leilei Shi, Liang Zhu, Tao Xiong, Self-supervised Representation Learning on Dynamic Graphs.
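A sketch of a debiased contrastive estimator of the kind referenced above, following Chuang et al.'s debiased contrastive learning (assumptions: one positive per anchor, exponentiated similarities, and τ+ given; the temporal variant in the paper may differ):

    import math
    import torch

    def debiased_contrastive_loss(pos_sim, neg_sim, tau_plus, temperature=0.5):
        """pos_sim: scalar similarity to the positive; neg_sim: [N] similarities to sampled negatives."""
        N = neg_sim.numel()
        pos = torch.exp(pos_sim / temperature)
        neg = torch.exp(neg_sim / temperature)
        # correct the negative term for positives that leak into the sampled negatives
        ng = (neg.sum() - N * tau_plus * pos) / (1.0 - tau_plus)
        ng = torch.clamp(ng, min=N * math.exp(-1.0 / temperature))   # numerical floor from the estimator
        return -torch.log(pos / (pos + ng))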
SSL by Contrastive Learning: TGAT-CL
o Experiments: Dynamic node classification performance in average AUC and dynamic link prediction performance in average precision.
[1] Sheng Tian, Ruofan Wu, Leilei Shi, Liang Zhu, Tao Xiong, Self-supervised Representation Learning on Dynamic Graphs.
SSL by Contrastive Learning: DySubC
o Motivation: Node vs. subgraph representations; temporal vs. static representations.
o Method: Contrast between a positive subgraph sample, a temporal negative sample, and a structural negative sample.
Readout: pool node embeddings into a subgraph-level representation.
[1] Ke-Jia Chen, Linsong Liu, Linpu Jiang, Jingqiang Chen, Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast.
SSL by Contrastive Learning: DySubC
o Motivation: Node vs. subgraph representations; temporal vs. static representations.
o Method: Contrast between a positive subgraph sample, a temporal negative sample, and a structural negative sample.
Training loss: L = L_1 + L_2
[1] Ke-Jia Chen, Linsong Liu, Linpu Jiang, Jingqiang Chen, Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast.
SSL by Contrastive Learning: DySubC
o Experiments: Link prediction in terms of average AUC score and accuracy.
[1] Ke-Jia Chen, Linsong Liu, Linpu Jiang, Jingqiang Chen, Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast.
SSL by Multiview Consistency: DyG2Vec
o Motivation: Embeddings should be consistent under graph perturbations.
o New graph encoder: According to the ablation study, the subgraph encoder is undirected and non-causal.
[1] Mohammad Alomrani, Mahdi Biparva, Yingxue Zhang, Mark Coates, DyG2Vec: Efficient Representation Learning for Dynamic Graphs.
SSL by Multiview Consistency: DyG2Vec
o Motivation: Embeddings should be consistent under graph augmentations.
o Method: Use edge dropout and edge feature masking to produce different "views", and use a regularization-based SSL loss function (VICReg [2]; see the sketch below):
  L(Z, Z') = λ·s(Z, Z') + μ·[v(Z) + v(Z')] + ν·[c(Z) + c(Z')]
[1] Mohammad Alomrani, Mahdi Biparva, Yingxue Zhang, Mark Coates, DyG2Vec: Efficient Representation Learning for Dynamic Graphs.
[2] Adrien Bardes, Jean Ponce, Yann LeCun, VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning.
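A sketch of the VICReg-style objective above, following Bardes et al. (assumptions: the standard invariance/variance/covariance terms with typical coefficients; Z and Z' are the two views' embedding matrices):

    import torch
    import torch.nn.functional as F

    def vicreg_loss(z1, z2, lam=25.0, mu=25.0, nu=1.0, gamma=1.0, eps=1e-4):
        """z1, z2: [N, d] embeddings of two augmented views of the same batch."""
        inv = F.mse_loss(z1, z2)                                 # invariance term s(Z, Z')
        def variance(z):                                         # v(Z): hinge on per-dimension std
            std = torch.sqrt(z.var(dim=0) + eps)
            return torch.relu(gamma - std).mean()
        def covariance(z):                                       # c(Z): penalize off-diagonal covariance
            z = z - z.mean(dim=0)
            cov = (z.T @ z) / (z.size(0) - 1)
            off_diag = cov - torch.diag(torch.diag(cov))
            return (off_diag ** 2).sum() / z.size(1)
        return lam * inv + mu * (variance(z1) + variance(z2)) + nu * (covariance(z1) + covariance(z2))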
SSL by Multiview Consistency: DyG2Vec
o Experiments: Link prediction in terms of average AUC score and accuracy.
[1] Mohammad Alomrani, Mahdi Biparva, Yingxue Zhang, Mark Coates, DyG2Vec: Efficient Representation Learning for Dynamic Graphs.
Data-Efficient TGNN Checkpoint
Self-Supervised Learning
o Introduction & Background
o Reconstruction
o Contrastive Learning
o Multiview Approach
Q&A
Scope of This Tutorial
[Figure: Efficient TGNN → Part II: Data-Efficient TGNN (Self-Supervised Learning, Weakly-Supervised Learning, Few-Shot Learning) and Part III: Resource-Efficient TGNN (Training Acceleration, Inference Acceleration, Distributed Training Acceleration).]
Weakly-Supervised Learning on Temporal Graphs
• Introduction & Background
• Weak Supervision with Limited Information
• Weak Supervision on Sparse Temporal Graphs
Introduction – Weakly-Supervised Learning
o Learning useful (or better) representations using limited labeled or noisy data.
o The inherent structure and temporal dynamics of the graph itself are still important.
o Challenge 1: Effectively exploit weak information in the training process.
o Challenge 2: Learning representations on dynamic and noisy graphs.
[1] Left image is from https://www.youtube.com/watch?v=WQb6h19PrJA
[2] Linhao Luo, Gholamreza Haffari, Shirui Pan, Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs.
Weak Supervision on Limited Labeled Data: D2PT
o Motivation: Design a universal and effective GNN for graph learning with weak information (GLWI). Disclaimer: this work targets static graphs.
o Method: Execute effective information propagation in GNNs.
[1] Yixin Liu, Kaize Ding, Jianling Wang, Vincent Lee, Huan Liu, Shirui Pan, Learning Strong Graph Neural Networks with Weak Information.
Weak Supervision on Limited Labeled Data: D2PT
o Motivation: Design a universal and effective GNN for graph learning with weak information (GLWI). Disclaimer: this work targets static graphs.
o Method: Execute effective information propagation in GNNs.
[1] Yixin Liu, Kaize Ding, Jianling Wang, Vincent Lee, Huan Liu, Shirui Pan, Learning Strong Graph Neural Networks with Weak Information.
Weak Supervision on Limited Labeled Data: D2PT
o Experiments: Classification accuracy in extreme GLWI scenarios.
[1] Yixin Liu, Kaize Ding, Jianling Wang, Vincent Lee, Huan Liu, Shirui Pan, Learning Strong Graph Neural Networks with Weak Information.
Weak Supervision on Sparse Temporal Graphs: GSNOP
o Objective: Address the situation where there is not enough historical data.
o Motivation: Missing links are common. How to learn better representations on sparse graphs and prevent overfitting?
[1] Linhao Luo, Gholamreza Haffari, Shirui Pan, Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs.
Weak Supervision on Sparse Temporal Graphs: GSNOP
o Method: Treat link prediction as a dynamically changing stochastic process and employ a neural process.
[1] Linhao Luo, Gholamreza Haffari, Shirui Pan, Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs.
Weak Supervision on Sparse Temporal Graphs: GSNOP
o Method: Treat link prediction as a dynamically changing stochastic process and employ a neural process.
[1] Linhao Luo, Gholamreza Haffari, Shirui Pan, Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs.