Abstract [eng]
With the increasing complexity of sequential data handling and interpretation, dynamic artificial neural networks have gained significant prominence. Architectures that incorporate time delays in their synapses have shown potential in applications ranging from natural language processing to financial forecasting. However, a comprehensive comparative study explaining the differences, advantages, and potential drawbacks of dynamic neural networks remains a gap in the literature.

This research provides a detailed comparison of 11 dynamic neural networks known for their time-dependent characteristics: Time Delay Neural Network, Time Derivative Neural Network, Bi-directional Neural Network, Finite Impulse Response Multi-Layer Perceptron, Simplified Finite Impulse Response Multi-Layer Perceptron, Infinite Impulse Response Neural Network, Gamma Memory Neural Network, Lattice Ladder Neural Network, Recurrent Neural Network, Long Short-Term Memory, and Gated Recurrent Unit.

To comprehensively understand the performance and properties of each dynamic neural network, the comparison is based on several pivotal metrics. First, the accuracy of each model is assessed to gauge its predictive correctness. The magnitude of the model's error over epochs, known as loss, provides further insight into its learning progression. Another significant aspect we observe is the training time, which reveals how efficiently a model can adapt and learn using gradient descent. The size of the model, determined by its number of parameters, influences deployment and scalability considerations, making it another crucial factor in our analysis. Furthermore, the robustness of each dynamic neural network is tested against noisy or adversarial conditions, while its tendency towards overfitting gives us a clear picture of its generalization capability versus the risk of memorizing training data. Lastly, scalability is probed by examining how each dynamic neural network's performance evolves as dataset complexity or size increases.