Abstract:
Delay differential equations (DDEs) arise in many scientific disciplines and appear in mathematical models of processes that evolve over time in which the rate of change depends on both the present and past states of the process. They are also closely related to difference equations, which arise in problems involving sequences and series and in the analysis of algorithms and computational methods. Recent studies in fields as disparate as biology, economics, and physics underscore the importance of DDEs, which become indispensable precisely where models based on ordinary differential equations (ODEs) fail. In this study, we propose a machine learning (ML) strategy for solving ODEs. An artificial neural network (ANN) with five fully connected layers is built and trained on example solutions of the differential equations. The ANN prediction is then compared with the Runge-Kutta (RK) solution of each ODE, and each ODE's training loss is shown for inspection.
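The following is a minimal sketch of the pipeline the abstract outlines: a five-layer fully connected ANN trained on samples of a known solution, then compared against a classical fourth-order Runge-Kutta (RK4) baseline. It assumes PyTorch and a hypothetical test problem y'(t) = -y(t), y(0) = 1 (exact solution e^{-t}); the layer widths, optimizer, learning rate, and epoch count are illustrative choices, not the paper's reported configuration.

```python
# Sketch only: hypothetical test ODE y' = -y, y(0) = 1; all hyperparameters
# are assumptions, since the abstract does not specify them.
import numpy as np
import torch
import torch.nn as nn

# --- training data: samples of a known solution, as the abstract describes ---
t = torch.linspace(0.0, 5.0, 200).unsqueeze(1)   # time grid, shape (200, 1)
y_true = torch.exp(-t)                           # exact solution exp(-t)

# --- ANN with five fully connected layers mapping t -> y(t) ---
model = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

loss_history = []                                # source of the per-ODE loss curves
for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(t), y_true)
    loss.backward()
    optimizer.step()
    loss_history.append(loss.item())

# --- classical RK4 baseline on the same time grid ---
def rk4(f, y0, ts):
    ys = [y0]
    for i in range(len(ts) - 1):
        h = ts[i + 1] - ts[i]
        y = ys[-1]
        k1 = f(ts[i], y)
        k2 = f(ts[i] + h / 2, y + h * k1 / 2)
        k3 = f(ts[i] + h / 2, y + h * k2 / 2)
        k4 = f(ts[i] + h, y + h * k3)
        ys.append(y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6)
    return np.array(ys)

ts = t.squeeze(1).numpy()
y_rk = rk4(lambda _t, y: -y, 1.0, ts)

# --- compare the ANN prediction with the RK solution ---
with torch.no_grad():
    y_ann = model(t).squeeze(1).numpy()
print("max |ANN - RK4| discrepancy:", np.abs(y_ann - y_rk).max())
print("final training loss:", loss_history[-1])
```

Plotting `loss_history` per equation would reproduce the kind of loss curves the abstract says are shown for inspection.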