Abstract

In this paper, initial-boundary value problems for the linear telegraph equation in one and two space dimensions are considered. To find approximate solutions, a recently proposed optimization-free approach based on artificial neural networks with one hidden layer is used: the connecting weights from the input layer to the hidden layer are chosen randomly, and the weights from the hidden layer to the output layer are found by solving a system of linear equations. One advantage of this method over the usual discretization methods for the two-dimensional linear telegraph equation is that it does not require time discretization and it produces the approximate solution in closed analytic form. A numerical study is conducted on several examples for which the exact solutions are known, and the dependence of the maximum absolute error and the root-mean-square deviation error on the size of the training set and on the number of nodes in the hidden layer is explored. It is shown numerically that, in general, both errors tend to decrease as the size of the training set and the size of the hidden layer increase.
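The abstract describes an extreme-learning-machine-style scheme: a single hidden layer with randomly fixed input-to-hidden weights, and output weights obtained from one linear least-squares solve over collocation points. The sketch below is a minimal illustration of that general idea for a 1D telegraph equation; the PDE coefficients, the tanh activation, the manufactured exact solution, and the collocation grid are all illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of a random-weight (ELM-style) collocation solver for the
# 1D telegraph equation  u_tt + 2*alpha*u_t + beta^2*u = u_xx + f(x, t)
# on (x, t) in [0, 1] x [0, 1].  All specific choices below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 1.0, 1.0    # assumed PDE coefficients
N_hidden = 200            # hidden-layer size
Nx, Nt = 20, 20           # training-grid size

# Manufactured exact solution, used only to build f and to measure the error.
u_exact = lambda x, t: np.exp(-t) * np.sin(np.pi * x)
f_rhs   = lambda x, t: (1 - 2*alpha + beta**2 + np.pi**2) * np.exp(-t) * np.sin(np.pi * x)

# Random input-to-hidden weights and biases (fixed, never trained).
W = rng.normal(size=(N_hidden, 2))    # weights for (x, t)
b = rng.normal(size=N_hidden)

def features(x, t):
    """Hidden-layer outputs and the derivatives needed by the PDE operator."""
    z = np.outer(x, W[:, 0]) + np.outer(t, W[:, 1]) + b   # (n_points, N_hidden)
    s = np.tanh(z)
    ds = 1 - s**2             # tanh'
    d2s = -2 * s * ds         # tanh''
    phi    = s
    phi_t  = ds * W[:, 1]
    phi_tt = d2s * W[:, 1]**2
    phi_xx = d2s * W[:, 0]**2
    return phi, phi_t, phi_tt, phi_xx

rows, rhs = [], []

# PDE residual rows at interior training points.
x_in, t_in = np.meshgrid(np.linspace(0, 1, Nx), np.linspace(0, 1, Nt))
x_in, t_in = x_in.ravel(), t_in.ravel()
phi, phi_t, phi_tt, phi_xx = features(x_in, t_in)
rows.append(phi_tt + 2*alpha*phi_t + beta**2*phi - phi_xx)
rhs.append(f_rhs(x_in, t_in))

# Initial conditions at t = 0: u = sin(pi*x), u_t = -sin(pi*x).
x0 = np.linspace(0, 1, Nx)
phi, phi_t, _, _ = features(x0, np.zeros_like(x0))
rows += [phi, phi_t]
rhs  += [np.sin(np.pi * x0), -np.sin(np.pi * x0)]

# Homogeneous Dirichlet boundaries at x = 0 and x = 1.
tb = np.linspace(0, 1, Nt)
for xb in (0.0, 1.0):
    phi, _, _, _ = features(np.full_like(tb, xb), tb)
    rows.append(phi)
    rhs.append(np.zeros_like(tb))

# Hidden-to-output weights from a single linear least-squares solve.
A = np.vstack(rows)
y = np.concatenate(rhs)
v, *_ = np.linalg.lstsq(A, y, rcond=None)

# The approximation is available in closed form as a sum of activations.
u_approx = lambda x, t: features(x, t)[0] @ v

# Maximum absolute error on a finer evaluation grid.
xs, ts = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
err = np.abs(u_approx(xs.ravel(), ts.ravel()) - u_exact(xs.ravel(), ts.ravel()))
print("max abs error:", err.max())
```

Because the telegraph equation is linear, applying the differential operator to each fixed hidden-layer basis function yields one row of a linear system in the output weights, which is why a single least-squares solve replaces iterative training; increasing N_hidden, Nx, or Nt enlarges this system, which is the dependence the paper's numerical study examines.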
