Abstract
The increasing complexity of modern networks, from communication infrastructures to power grids and social networks, demands models that capture both structural dependencies and nonlinear, long-memory dynamics. We propose a hybrid framework that unifies deep learning (graph neural networks, recurrent/attention modules) with fractional calculus to model nonlocal memory, anomalous diffusion, and self-similarity. Fractional differential formulations provide a principled description of network evolution, for which we state a checkable stability condition; the learning pipeline couples gradient-based training with fractional operators for robust, interpretable prediction. On the Los Angeles metropolitan area traffic (METR-LA) dataset, the proposed ensemble integrated deep fractional model (EIDFM) achieves a mean absolute error (MAE) of approximately 6.4 and a root mean square error (RMSE) of 10.8, improving on the strongest hybrid baseline (CNN-LSTM; MAE 7.8, RMSE 12.5) by 18% and 14%, respectively; the mean absolute percentage error (MAPE) drops from 6.3% to 5.2% (≈17% relative), while R² rises from 0.91 to 0.94. Results are reported as mean ± std over five seeds with paired significance tests. A lightweight efficiency analysis shows modest overhead relative to the baseline (inference 3.2 ± 0.3 ms/step vs. 2.8 ± 0.2 ms/step; parameters 12.8M vs. 11.3M), justified by the accuracy gains. These findings indicate that integrating fractional operators with graph-based deep learners yields a mathematically grounded and practically effective paradigm for understanding and managing complex network dynamics.
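The abstract does not reproduce the paper's fractional formulation. As an illustration of the kind of operator it invokes for nonlocal memory, a standard choice is the Caputo derivative of order α ∈ (0, 1) together with its Grünwald–Letnikov discretization; the specific operator, order, and discretization used in EIDFM are assumptions here, not confirmed by the source.

```latex
% Caputo fractional derivative of order \alpha \in (0,1): the derivative at
% time t integrates over the entire history of x, encoding long memory.
{}^{C}\!D_t^{\alpha} x(t)
  = \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{x'(\tau)}{(t-\tau)^{\alpha}} \, d\tau

% Grünwald--Letnikov discretization on a grid of step h; the weights
% w_k^{(\alpha)} = (-1)^k \binom{\alpha}{k} decay slowly, so the whole past
% of the signal contributes to each update.
D_t^{\alpha} x(t_n) \approx h^{-\alpha} \sum_{k=0}^{n} w_k^{(\alpha)} x(t_{n-k}),
\qquad
w_0^{(\alpha)} = 1, \quad
w_k^{(\alpha)} = \Bigl(1 - \frac{\alpha + 1}{k}\Bigr) w_{k-1}^{(\alpha)}
```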
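A minimal numerical sketch of that discretization follows, showing how a fractional difference gives each output access to the full history of a signal, in contrast to an integer-order finite difference. This is an illustrative toy under the assumptions above, not the authors' EIDFM implementation; the order `alpha`, step size, and test signal are chosen only for demonstration.

```python
# Minimal sketch of a Grünwald-Letnikov fractional-difference filter.
# Illustrates the nonlocal-memory property the abstract attributes to
# fractional operators; NOT the authors' EIDFM code.
import numpy as np


def gl_weights(alpha: float, n: int) -> np.ndarray:
    """Grünwald-Letnikov weights w_k = (-1)^k * C(alpha, k), computed with
    the stable recurrence w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w


def fractional_derivative(x: np.ndarray, alpha: float, h: float = 1.0) -> np.ndarray:
    """Approximate the order-alpha derivative of a sampled signal x.
    The reversed slice x[i::-1] makes every output a weighted sum over the
    ENTIRE past of x (nonlocal memory)."""
    n = len(x)
    w = gl_weights(alpha, n)
    d = np.empty(n)
    for i in range(n):
        d[i] = np.dot(w[: i + 1], x[i::-1]) / h**alpha
    return d


if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 200)
    x = np.sin(t)                        # toy traffic-like signal (assumption)
    d_half = fractional_derivative(x, alpha=0.5, h=t[1] - t[0])
    print(d_half[-5:])                   # tail values of the half-derivative
```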
| Original language | English |
|---|---|
| Pages (from-to) | 26717-26743 |
| Number of pages | 27 |
| Journal | AIMS Mathematics |
| Volume | 10 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - 18 Nov 2025 |
Keywords
- complex networks
- deep learning
- fractional calculus
- mathematical modeling
- network dynamics
- nonlinear systems