Modular Neural Networks Chronicles in Biological Aspects

Authors

  • Mohseena Thaseen, Department of Computer Science and Information Technology, Nanded Education Society's Science College, Nanded, Maharashtra, India

Keywords:

Modular Neural Network, Architecture, Evolutionary Approach, Connectionist Approach, Weightless Logical Approach.

Abstract

The modular neural network (MNN) is one model of artificial neural networks. This manuscript describes the motivation for modular neural networks and how they can be applied in biological contexts: since all cell functions are modular in nature, the approach can be applied to almost all cell structures and functions through the Connectionist Approach and the Weightless Logical Approach to obtain the best optimized error.
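The connectionist view of modularity described above — independent expert modules whose outputs are blended by a gating network — can be sketched minimally as follows. This is an illustrative sketch only, not the paper's implementation; all class names, layer sizes, and the gating scheme are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Normalize scores into mixing weights that sum to 1."""
    e = np.exp(z - z.max())
    return e / e.sum()

class Module:
    """One expert module: a tiny single-layer network (illustrative)."""
    def __init__(self, n_in, n_out):
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))

    def forward(self, x):
        return np.tanh(self.W @ x)

class ModularNetwork:
    """A gating network weighs the outputs of several expert modules,
    in the spirit of the connectionist approach to modularity."""
    def __init__(self, n_in, n_out, n_modules):
        self.modules = [Module(n_in, n_out) for _ in range(n_modules)]
        self.Wg = rng.normal(scale=0.1, size=(n_modules, n_in))

    def forward(self, x):
        gate = softmax(self.Wg @ x)                     # mixing weights
        outs = np.stack([m.forward(x) for m in self.modules])
        return gate @ outs                              # weighted blend

net = ModularNetwork(n_in=4, n_out=2, n_modules=3)
y = net.forward(np.ones(4))
```

Because each module's output passes through `tanh` and the gate is a convex combination, the blended output stays bounded; training such a system (e.g. as a mixture of experts) would add a learning rule on top of this forward pass.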

Published

2016-08-30

Section

Research Articles

How to Cite

[1]
Mohseena Thaseen, "Modular Neural Networks Chronicles in Biological Aspects", International Journal of Scientific Research in Science and Technology (IJSRST), Online ISSN: 2395-602X, Print ISSN: 2395-6011, Volume 2, Issue 4, pp. 208-213, July-August 2016.