Mathematical Models of Neural Networks: Bridging the Gap between Biology and Computation

Authors

  • Patel Nirmal Rajnikant, Pacific Academy of Higher Education and Research University, Udaipur, Rajasthan, India
  • Dr. Ritu Khanna, Professor, Faculty of Engineering, Pacific Academy of Higher Education and Research University, Udaipur, Rajasthan, India

DOI:

https://doi.org/10.32628/IJSRST251222675

Keywords:

Deep Learning, Hebbian Learning, Neuromorphic Computing, Biological Neural Systems, Artificial Neural Networks (ANNs), Brain-Inspired Computation, Leaky Integrate-and-Fire Model, Spiking Neural Networks (SNNs)

Abstract

The study of neural networks stands at a vibrant crossroads between biological insight and computational innovation. This paper explores mathematical models that faithfully capture the dynamics of biological neural systems while enabling powerful computational architectures. We examine foundational models such as the Hodgkin-Huxley and integrate-and-fire neurons, progressing to modern abstractions like spiking neural networks and deep learning frameworks. Emphasis is placed on how mathematical formalism can both illuminate biological processes and inspire new algorithms. By bridging the disciplines of neuroscience and computer science, we highlight how theoretical models not only deepen our understanding of brain function but also fuel the next generation of intelligent machines. Challenges in model fidelity, scalability, and interpretability are discussed, along with emerging approaches that promise to harmonize biological realism with computational efficiency. This synthesis of biology and computation paints a future where the brain and the machine are partners in innovation, not rivals.
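To make the integrate-and-fire abstraction mentioned above concrete, the following is a minimal sketch of a leaky integrate-and-fire (LIF) neuron driven by a constant input current. The function name and all parameter values (membrane time constant, resistance, rest/threshold/reset potentials, input current) are illustrative assumptions chosen for demonstration, not values from the paper; the membrane dynamics follow the standard form tau_m * dV/dt = -(V - V_rest) + R_m * I, integrated with a simple Euler step.

```python
def simulate_lif(i_input=2.5e-9, t_max=0.1, dt=1e-4,
                 tau_m=0.02, r_m=1e7,
                 v_rest=-0.07, v_thresh=-0.05, v_reset=-0.07):
    """Simulate a leaky integrate-and-fire neuron with constant input current.

    Membrane dynamics: tau_m * dV/dt = -(V - v_rest) + r_m * I.
    Returns the membrane-potential trace (volts) and a list of spike times (seconds).
    All parameter values here are illustrative, not taken from the paper.
    """
    steps = int(t_max / dt)
    v = v_rest
    trace, spikes = [], []
    for step in range(steps):
        # Euler update of the leaky membrane equation
        v += (-(v - v_rest) + r_m * i_input) * dt / tau_m
        if v >= v_thresh:           # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset             # instantaneous reset after the spike
        trace.append(v)
    return trace, spikes

trace, spikes = simulate_lif()
print(f"{len(spikes)} spikes in 100 ms of simulated time")
```

With these assumed parameters the steady-state drive (r_m * i_input = 25 mV above rest) exceeds the 20 mV gap to threshold, so the neuron fires periodically; lowering the input current below that gap would make it integrate without ever spiking, which is the qualitative behavior the LIF model is meant to capture.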


References

McCulloch, W. S., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. The Bulletin of Mathematical Biophysics, 5(4), 115–133. https://doi.org/10.1007/BF02478259

Rosenblatt, F. (1958). The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review, 65(6), 386–408. https://doi.org/10.1037/h0042519

Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323(6088), 533–536. https://doi.org/10.1038/323533a0

Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT Press. https://www.deeplearningbook.org/

Dayan, P., & Abbott, L. F. (2001). Theoretical neuroscience: Computational and mathematical modeling of neural systems. MIT Press.

LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436–444. https://doi.org/10.1038/nature14539

Hassabis, D., Kumaran, D., Summerfield, C., & Botvinick, M. (2017). Neuroscience-inspired artificial intelligence. Neuron, 95(2), 245–258. https://doi.org/10.1016/j.neuron.2017.06.011

Schmidhuber, J. (2015). Deep learning in neural networks: An overview. Neural Networks, 61, 85–117. https://doi.org/10.1016/j.neunet.2014.09.003

(A detailed overview of deep learning history, including biological inspirations and mathematical underpinnings.)

Eliasmith, C., & Trujillo, O. (2014). The use and abuse of large-scale brain models. Current Opinion in Neurobiology, 25, 1–6. https://doi.org/10.1016/j.conb.2013.10.007

(Discussion of how mathematical models scale up to large, brain-like architectures.)

Marblestone, A. H., Wayne, G., & Kording, K. P. (2016). Toward an integration of deep learning and neuroscience. Frontiers in Computational Neuroscience, 10, 94. https://doi.org/10.3389/fncom.2016.00094

(How neuroscience and machine learning models can benefit from each other.)

Sejnowski, T. J. (2020). The unreasonable effectiveness of deep learning in artificial intelligence. Proceedings of the National Academy of Sciences, 117(48), 30033–30038. https://doi.org/10.1073/pnas.1907373117

(How deep learning, rooted in brain-like models, performs beyond expectation.)

Hassabis, D., & Maguire, E. A. (2007). Deconstructing episodic memory with construction. Trends in Cognitive Sciences, 11(7), 299–306. https://doi.org/10.1016/j.tics.2007.05.001

(Inspiration for memory models in neural networks.)


Published

25-04-2025

Issue

Section

Research Articles