Neural networks have long been an important area of scientific study, shaped by contributions from disciplines such as mathematics, biology, psychology, and computer science.
The study of neural networks leapt from theory to practice with the emergence of computers.
Training a neural network by adjusting the weights of its connections is computationally very expensive, so its application to practical problems had to wait until the mid-1980s, when a more efficient training algorithm became widely known.
That algorithm is now known as the back-propagation of errors, or simply backpropagation.
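To make "adjusting the weights" concrete, here is a minimal sketch of backpropagation for a tiny network with one hidden layer, trained on XOR. The network size, learning rate, activation function, and training data are illustrative assumptions of mine, not details taken from the 1986 paper:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative setup: a 2-3-1 network learning XOR (my own choice of example)
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)
lr = 1.0

losses = []
for _ in range(2000):
    # Forward pass: compute activations layer by layer
    h = sigmoid(X @ W1 + b1)       # hidden layer
    out = sigmoid(h @ W2 + b2)     # output layer
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: propagate the error derivative back through the layers
    d_out = (out - y) * out * (1 - out)    # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)     # chain rule into the hidden layer

    # Gradient-descent updates to the weights and biases
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)
```

The key idea is the backward pass: the output error is multiplied by each layer's local derivative and passed backward, giving a gradient for every weight in a single sweep.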
One of the most cited articles on this algorithm is:
Learning representations by back-propagating errors
Nature 323, 533–536 (9 October 1986)
Although it is a very technical article, anyone who wants to study and understand neural networks should work through this material.
I am sharing the full article here: