METHOD FOR ADAPTIVE SELECTION OF TIME INTERVALS FOR CONSTRUCTING GRAPHS OF TEMPORAL GRAPH NEURAL NETWORKS
DOI: https://doi.org/10.20998/2079-0023.2025.02.19
Keywords: temporal graphs, adaptive time granularity, spectral analysis, structural drift, dynamic graphs, graph neural networks, edit metric, Laplacian eigenvalues, temporal dependencies
Abstract
The subject of the research is the process of forming graph structures for temporal graph neural networks with adaptive selection of the time interval granularity level. The aim of the work is to develop an approach to forming graph structures with adaptive granularity for temporal graph neural networks. The research tasks are: to structure approaches to selecting the granularity level of time intervals when forming graphs of temporal graph neural networks, taking into account changes in the structure of these graphs; and to develop a method for adaptive selection of time intervals based on graph edit metrics and spectral analysis of the graph structure. The developed method includes five stages: graph formation based on the co-occurrence frequency of entities; calculation of the edit rate between consecutive graphs; spectral embedding of the graphs via the normalized symmetric Laplacian; computation of the Kullback–Leibler divergence between spectral densities to detect structural drift; and adaptive adjustment of the time interval duration based on the edit rate criterion and the magnitude of the divergence. The method combines a local graph edit metric with global spectral density metrics (the Kullback–Leibler divergence) to capture not only the number of changes in the graph but also their impact on its topology, which makes it possible to distinguish noise from significant structural changes. The method provides automated selection of time granularity without expert knowledge of threshold values for graph structure changes, reduces the computational cost of graph formation during periods of structural stability, and maintains the required accuracy of temporal dependency detection during periods of abrupt changes in the graph structure. The practical significance of the obtained results lies in the possibility of representing and further analyzing dynamic processes in intelligent systems that promptly adapt to changes in the relationship structure, for tasks of building explanations, recommendations, monitoring, analysis, and forecasting in e-commerce systems, social networks, financial analysis, and transport monitoring.
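To make the five-stage pipeline concrete, the Python sketch below walks through it on weighted adjacency matrices over a fixed node set: a co-occurrence graph per interval, an edit rate between consecutive snapshots, eigenvalues of the normalized symmetric Laplacian binned into a spectral density, the Kullback–Leibler divergence between densities, and a rule that shortens or lengthens the interval. This is a minimal illustration, not the authors' implementation: the edit-rate definition (share of added or removed edges), the histogram-based density with smoothing, and all thresholds and limits (r_hi, d_hi, r_lo, d_lo, dt_min, dt_max, factor) are assumed values chosen only for the example.

import numpy as np

def cooccurrence_graph(event_sets, nodes):
    """Stage 1: weighted adjacency matrix from entity co-occurrence counts in one interval."""
    idx = {v: i for i, v in enumerate(nodes)}
    A = np.zeros((len(nodes), len(nodes)))
    for entities in event_sets:
        present = sorted(idx[e] for e in entities if e in idx)
        for k, i in enumerate(present):
            for j in present[k + 1:]:
                A[i, j] += 1.0
                A[j, i] += 1.0
    return A

def edit_rate(A_prev, A_curr):
    """Stage 2: share of edges added or removed between consecutive snapshots (binarised)."""
    e_prev, e_curr = A_prev > 0, A_curr > 0
    union = np.logical_or(e_prev, e_curr).sum()
    return float(np.logical_xor(e_prev, e_curr).sum() / union) if union else 0.0

def spectral_density(A, bins=20, eps=1e-9):
    """Stage 3: eigenvalue histogram of the normalized symmetric Laplacian (support [0, 2])."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(np.maximum(deg, eps)), 0.0)
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    hist, _ = np.histogram(np.linalg.eigvalsh(L), bins=bins, range=(0.0, 2.0))
    return (hist + eps) / (hist.sum() + bins * eps)   # smoothed probability vector

def kl_divergence(p, q):
    """Stage 4: Kullback-Leibler divergence between two spectral densities."""
    return float(np.sum(p * np.log(p / q)))

def adapt_interval(dt, r_edit, d_kl,
                   r_hi=0.3, d_hi=0.1, r_lo=0.05, d_lo=0.01,
                   dt_min=1.0, dt_max=64.0, factor=2.0):
    """Stage 5: shrink the interval under structural drift, grow it while the graph is stable."""
    if r_edit > r_hi or d_kl > d_hi:      # abrupt change: refine granularity
        return max(dt / factor, dt_min)
    if r_edit < r_lo and d_kl < d_lo:     # stable period: coarsen granularity
        return min(dt * factor, dt_max)
    return dt                             # moderate change: keep the current interval

if __name__ == "__main__":
    nodes = ["a", "b", "c", "d"]
    g_prev = cooccurrence_graph([{"a", "b"}, {"b", "c"}], nodes)
    g_curr = cooccurrence_graph([{"a", "b"}, {"c", "d"}, {"a", "d"}], nodes)
    r = edit_rate(g_prev, g_curr)
    d = kl_divergence(spectral_density(g_prev), spectral_density(g_curr))
    print(adapt_interval(8.0, r, d))      # next interval length; time units are assumed

Combining the two signals mirrors the idea stated in the abstract: the edit rate reacts to the volume of changes between snapshots, while the divergence of Laplacian spectra reacts to whether those changes actually reshape the topology, so only their joint behaviour triggers a change of granularity.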