

2.9. Show that a perceptron architecture cannot handle the standard Exclusive-OR (XOR) problem. The corresponding training set for the two-dimensional case is given in the form

x1 x2 y
0 1 1
0 0 0
1 0 1
1 1 0


Hint: write out the perceptron inequalities that arise in this case and show that they are mutually conflicting.
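As an illustration of the hint (not part of the exercise text), the conflict can be confirmed by brute force: no choice of weights and threshold for a single threshold unit w1*x1 + w2*x2 >= T reproduces the XOR table.

```python
# Brute-force check: no single-layer perceptron realizes XOR.
import itertools

def perceptron(w1, w2, T, x1, x2):
    # threshold unit: fires (1) when the weighted sum reaches the threshold
    return 1 if w1 * x1 + w2 * x2 >= T else 0

xor_set = [((0, 1), 1), ((0, 0), 0), ((1, 0), 1), ((1, 1), 0)]

grid = [i / 2 for i in range(-10, 11)]  # w1, w2, T sampled over [-5, 5]
solutions = [
    (w1, w2, T)
    for w1, w2, T in itertools.product(grid, repeat=3)
    if all(perceptron(w1, w2, T, x1, x2) == y for (x1, x2), y in xor_set)
]
print(len(solutions))  # 0 -- the four inequalities cannot hold at once
```

The four inequalities are w2 >= T, T > 0, w1 >= T, and w1 + w2 < T; adding the first and third gives w1 + w2 >= 2T > T, which contradicts the fourth.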

2.10. The performance index Q(w) = wᵀAw with w = [w1 w2]ᵀ is quadratic, with A equal to

Plot Q(w) in a two-dimensional space of the connections (w) and elaborate on the learning process adhering to the standard Newton’s algorithm. Then repeat the same analysis for another matrix A,

Do you notice any differences in the speed of learning? If so, why do they occur?
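The matrices A are not reproduced here, so the following sketch assumes two illustrative symmetric positive definite matrices, one well-conditioned and one ill-conditioned. For a quadratic Q(w) = wᵀAw the gradient is 2Aw and the Hessian is 2A, so the standard Newton update reaches the minimum in a single step for either matrix.

```python
# Newton's method on Q(w) = w^T A w (the matrices A are assumed here
# for illustration; the originals appear as figures in the text).
import numpy as np

def newton_step(A, w):
    grad = 2 * A @ w              # gradient of w^T A w for symmetric A
    hess = 2 * A                  # Hessian of the quadratic form
    return w - np.linalg.solve(hess, grad)

w0 = np.array([3.0, -2.0])
for A in (np.array([[2.0, 0.0], [0.0, 2.0]]),    # well-conditioned
          np.array([[2.0, 0.0], [0.0, 0.1]])):   # ill-conditioned
    w1 = newton_step(A, w0)
    print(w1)  # [0. 0.] in one step for either matrix
```

Because the Newton step divides out the curvature, its one-step convergence is insensitive to the conditioning of A; differences in learning speed between such matrices show up for first-order (steepest-descent) rules, whose convergence rate is governed by the eigenvalue spread of A.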

2.11. What Boolean function does the neuron with a threshold nonlinearity, Fig. 2.20, realize? The threshold T is equal to 1/2.


Figure 2.20  A neuron with a threshold nonlinearity
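The weights of the neuron are given in Fig. 2.20 and are not reproduced here; the sketch below assumes unit weights purely for illustration and tabulates the Boolean function a threshold neuron realizes, which is the mechanics the exercise asks about.

```python
# Tabulate the Boolean function of a threshold neuron (weights assumed;
# the actual values are shown in Fig. 2.20).
import itertools

def threshold_neuron(weights, T, inputs):
    # output 1 when the weighted sum of inputs reaches the threshold T
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= T else 0

for x in itertools.product((0, 1), repeat=2):
    print(x, threshold_neuron((1.0, 1.0), 0.5, x))
# With unit weights and T = 1/2 the resulting table is the Boolean OR.
```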

2.12. How would you suggest distributing RBFs in the two-dimensional classification problem of Fig. 2.21? What type of receptive fields would you suggest? Why? What about the data in Fig. 2.19?


Figure 2.21  Two-class pattern recognition problems

2.13. Contrast on-line and off-line learning in neural networks.
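The contrast can be made concrete with a minimal sketch (data and learning rate are assumed): an on-line LMS rule updates the weights immediately after each pattern, while the off-line (batch) rule accumulates the gradient over the entire training set before making a single update, so one sweep through the data generally leaves the two schemes at different weight vectors.

```python
# On-line vs. off-line (batch) LMS updates over one sweep of the data
# (training set and learning rate are illustrative assumptions).
import numpy as np

X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
t = np.array([1.0, 1.0, 0.0])
lr = 0.1

# on-line: update after every pattern
w_online = np.zeros(2)
for x_k, t_k in zip(X, t):
    err = t_k - w_online @ x_k
    w_online += lr * err * x_k          # immediate weight change

# off-line: one update from the gradient summed over all patterns
w_batch = np.zeros(2)
errs = t - X @ w_batch
w_batch += lr * (errs @ X)              # update after the full sweep

print(w_online, w_batch)                # the two sweeps generally differ
```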

2.14. What would be the number of connections in a network composed of 3 hidden layers, each of them having 1,000 neurons, that is fully connected (viz. all the neurons in two successive layers are connected)? The size of the input and output layers is 2,000.
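The count follows directly from summing the products of the sizes of successive layers (biases are not counted here):

```python
# Connection count for the fully connected layer structure
# 2000 -> 1000 -> 1000 -> 1000 -> 2000.
layers = [2000, 1000, 1000, 1000, 2000]
connections = sum(a * b for a, b in zip(layers, layers[1:]))
print(connections)  # 6000000
```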

2.15. Show that the gradient of the performance index with the Hamming distance can be expressed as

where the sign function equals +1 for positive numbers and -1 for negative ones.
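The formula itself appears as a figure in the original and is not reproduced here. As a hedged illustration of the idea, for a linear neuron y = w·x and the Hamming (city-block) performance index Q = Σ_k |t_k − y_k|, the gradient component is −Σ_k sign(t_k − y_k) x_k, which the sketch below checks against a finite-difference estimate (data and weights are assumed):

```python
# Sign-based gradient of the Hamming-distance performance index
# Q(w) = sum_k |t_k - y_k| for a linear neuron y = X w (data assumed).
import numpy as np

def sign(v):
    return np.where(v >= 0, 1.0, -1.0)   # +1 for positives, -1 otherwise

X = np.array([[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]])
t = np.array([1.0, 0.0, 1.0])
w = np.array([0.3, -0.2])

y = X @ w
grad = -(sign(t - y) @ X)                # dQ/dw = -sum_k sign(t_k - y_k) x_k

# finite-difference check of the first gradient component
eps = 1e-6
Q = lambda w: np.abs(t - X @ w).sum()
num = (Q(w + [eps, 0.0]) - Q(w - [eps, 0.0])) / (2 * eps)
print(grad[0], num)  # the analytic and numeric values agree
```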




Copyright © CRC Press LLC
