Neural networks, or more precisely artificial neural networks (ANNs), are a subfield of AI (artificial intelligence).
Neural networks are a method of machine learning. Their structure is inspired by the way a biological brain works. An artificial neural network consists of neurons, which are (to put it simply; other arrangements exist) organized in layers.
Each layer contains a certain number of neurons, and each neuron in a layer is connected to the neurons of the previous and the following layer. A neuron takes in data, transforms it, and outputs the result. The first layer (the input layer) receives the raw input data; every later layer receives the outputs of the layer before it. Likewise, each layer passes its outputs on to the next layer, until the last layer (the output layer) produces the final result.
Layers between the input layer and the output layer are called hidden layers.
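The flow of data through the layers described above can be sketched in a few lines. This is a minimal illustration, not a real library: the network size, weights, and biases are made-up example values, and activation functions are omitted here to keep the forward pass visible.

```python
def layer(inputs, weights, biases):
    # One output per neuron: the weighted sum of all inputs plus a bias.
    # Each inner list in `weights` holds the incoming weights of one neuron.
    return [sum(x * w for x, w in zip(inputs, ws)) + b
            for ws, b in zip(weights, biases)]

# 2 inputs -> hidden layer with 3 neurons -> output layer with 1 neuron
x = [1.0, 0.5]                                        # input layer
hidden = layer(x, [[0.2, 0.8], [0.5, -0.1], [0.9, 0.4]], [0.0, 0.0, 0.0])
output = layer(hidden, [[0.3, 0.7, -0.2]], [0.1])     # final result
```

Each call to `layer` consumes the outputs of the previous layer, exactly as described for hidden layers.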
A neuron is nothing more than a mathematical function: it combines all incoming data (typically by summing it), weights each input by a factor, and then decides, by applying a second function (the activation function) to that result, what to forward as the neuron's output.
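A single neuron of this kind can be written down directly. The sketch below assumes a sigmoid as the activation function; other choices such as ReLU or tanh are equally common, and the weights and bias shown are arbitrary illustration values.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the incoming data: each input is multiplied
    # by its weight (factor) before being added up.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # The activation function decides what the neuron forwards as
    # output; a sigmoid squashes the sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

out = neuron([0.5, -1.0, 2.0], [0.4, 0.3, 0.1], bias=0.0)
```

The two steps in the code, weighted sum and activation, correspond directly to the two functions described in the paragraph above.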
The functions used, as well as the number, size, and interconnection of the layers, are determined by the so-called network topology.
The weights of the connections, on the other hand, are parameters. Finding the optimal values for these weights is the goal of training the neural network in the so-called learning phase.
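The idea of adjusting weights during the learning phase can be sketched with gradient descent on a single linear neuron. The training data (learn the mapping x -> 2x), learning rate, and iteration count are made-up illustration values; real networks use the same principle across many weights via backpropagation.

```python
# Training samples: pairs of (input, desired output), here y = 2 * x.
samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0     # the weight to be adjusted during training
lr = 0.05   # learning rate: how strongly each error corrects the weight

for _ in range(200):            # repeat over the training data (epochs)
    for x, target in samples:
        pred = w * x            # the neuron's current output
        error = pred - target   # how far off the prediction is
        # Gradient of the squared error (pred - target)**2 with
        # respect to w is 2 * error * x; step against the gradient.
        w -= lr * 2 * error * x
```

After training, `w` has converged close to 2.0, the value that reproduces the desired outputs.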