New workshop papers at NeurIPS 2023
This year, two contributions co-authored by IES researchers, among others, were accepted at workshops of the NeurIPS (Neural Information Processing Systems) conference:
- Veronica Lachi, Alice Moallemy-Oureh, Andreas Roth, and Pascal Welke had their workshop contribution Graph Pooling Provably Improves Expressivity accepted at the NeurIPS 2023 workshop New Frontiers in Graph Learning. Here is what it is about:
In the domain of graph neural networks (GNNs), pooling operators are fundamental for reducing the size of a graph by simplifying its structure and vertex features. Recent advances have shown that well-designed pooling operators, coupled with message-passing layers, can endow hierarchical GNNs with an expressive power on the graph isomorphism test equal to that of the Weisfeiler-Leman (WL) test. However, the ability of hierarchical GNNs to increase expressive power by utilizing graph coarsening has not yet been explored. This leaves uncertainty about the benefits of pooling operators and a lack of sufficient properties to guide their design. In this work, we identify conditions under which pooling operators generate WL-distinguishable coarsened graphs from originally WL-indistinguishable but non-isomorphic graphs. Our conditions are versatile and can be tailored to specific tasks and data characteristics, offering a promising avenue for further research.

- Alice Moallemy-Oureh, Silvia Beddar-Wiesing, Rüdiger Nather, and Josephine Thomas had their contribution Marked Neural Spatio-Temporal Point Process Involving a Dynamic Graph Neural Network accepted at the Temporal Graph Learning Workshop. Here is what it is about:
Spatio-Temporal Point Processes (STPPs) have recently attracted growing interest for learning on dynamic graph data, since many scientific fields, ranging from mathematics, biology, the social sciences, and physics to computer science, are naturally relational and dynamic. While training Recurrent Neural Networks or solving PDEs to represent temporal data is expensive, TPPs have been a good alternative. Their drawback is that constructing an appropriate TPP for modeling temporal data requires assuming a particular temporal behavior of the data. To overcome this problem, Neural TPPs have been developed that learn the parameters of the TPP. However, this line of research is still young for dynamic graphs, and only a few TPPs have been proposed that handle edge-dynamic graphs. To allow learning on a fully dynamic graph, we propose the first Marked Neural Spatio-Temporal Point Process (MNSTPP), which leverages a Dynamic Graph Neural Network to learn Spatio-TPPs that model and predict any event in a graph stream. In addition, our model can be updated efficiently by considering single events for local retraining.
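The first abstract's central idea, that coarsening can make WL-indistinguishable graphs distinguishable, can be illustrated with a toy example. The sketch below is not the paper's code or its pooling conditions; it uses the classic pair of a 6-cycle versus two triangles (both 2-regular, hence indistinguishable by the 1-WL test) and a deliberately crude hypothetical pooling that contracts each connected component to a single node, after which the two coarsened graphs trivially differ:

```python
# Hypothetical sketch (not the paper's method): 1-WL colour refinement plus a
# crude component-contraction "pooling" on two 1-WL-indistinguishable graphs.
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL colour refinement; returns the multiset of final node colours."""
    colors = {v: 0 for v in adj}  # uniform initial colouring
    for _ in range(rounds):
        colors = {
            v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in adj
        }
    return Counter(colors.values())

def components(adj):
    """Connected components via iterative DFS."""
    seen, comps = set(), []
    for v in adj:
        if v in seen:
            continue
        stack, comp = [v], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            seen.add(u)
            stack.extend(adj[u])
        comps.append(comp)
    return comps

# C6 (one 6-cycle) vs. two disjoint triangles: both are 2-regular on 6 nodes,
# so 1-WL assigns identical colour multisets and cannot tell them apart.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}

assert wl_colors(cycle6) == wl_colors(two_triangles)  # WL-indistinguishable
# Pooling each component to one node yields coarsened graphs of different
# sizes (1 node vs. 2 nodes), which WL distinguishes immediately.
assert len(components(cycle6)) != len(components(two_triangles))
```

The paper's contribution is precisely about characterizing which (far less destructive) pooling operators achieve this kind of separation in general.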
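The second abstract's point that a classical TPP bakes an assumed temporal behavior into a fixed parametric form can also be made concrete. The following is a hypothetical illustration, not the MNSTPP model: a Hawkes process with an exponential kernel, whose parameters mu, alpha, and beta hard-code self-excitation with exponential decay (the kind of assumption that Neural TPPs replace with learned components):

```python
# Hypothetical illustration (not MNSTPP): the conditional intensity of a
# classical Hawkes temporal point process with an exponential kernel.
# The fixed functional form encodes an assumed temporal behaviour:
# each past event raises the intensity, which then decays exponentially.
import math

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.0):
    """lambda(t) = mu + sum over past events t_i of alpha * exp(-beta*(t - t_i))."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

events = [0.2, 1.0, 1.1]          # observed event times
print(hawkes_intensity(0.0, events))  # no past events: base rate mu only
print(hawkes_intensity(2.0, events))  # elevated by the three past events
```

If the data do not actually behave this way, the model is misspecified, which is the motivation the abstract gives for learning the intensity with neural components instead.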