

By performing an analysis of this type for several linguistic data descriptors, one can develop a collection of such descriptors that covers the entire data space; see Fig. 6.25.


Figure 6.25  Collection of descriptors induced by various contexts: (i) coarse descriptors with overlap; (ii) fine descriptors with limited overlap

We may eventually require that this collection cover the entire map to a high extent, meaning that

  max_{A ∈ 𝒜} N(i, j, A) ≈ 1 for every location (i, j)

where N(i, j, A) is the response of the neuron located at (i, j) and considered (placed) in context A drawn from a certain family of contexts 𝒜. Some other criteria could also be anticipated; e.g., one may request that the linguistic descriptors be well-separated, meaning that their activation regions in the map are kept almost disjoint.
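As a minimal numerical sketch, the two requirements can be checked directly on the stored neuron responses. The response maps below are randomly generated stand-ins for N(i, j, A); the family of three contexts and the array shapes are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical responses: responses[a] holds N(i, j, A) for one context A,
# as a 2D array over the (i, j) grid of the map.
rng = np.random.default_rng(0)
responses = [rng.random((5, 5)) for _ in range(3)]  # three assumed contexts

stacked = np.stack(responses)            # shape: (contexts, rows, cols)

# Coverage: every neuron should respond strongly to at least one context,
# i.e., the maximum over contexts of N(i, j, A) stays close to 1.
coverage = float(stacked.max(axis=0).min())     # worst-covered neuron

# Separation: activation regions kept almost disjoint means that at each
# neuron only one context dominates; the second-best response stays low.
sorted_resp = np.sort(stacked, axis=0)
worst_overlap = float(sorted_resp[-2].max())    # largest second-best response

print(f"coverage = {coverage:.3f}, worst overlap = {worst_overlap:.3f}")
```

A high `coverage` together with a low `worst_overlap` would indicate that the family of descriptors both covers the map and keeps its activation regions almost disjoint.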

6.9. Hybrid fuzzy neural computing structures

When it comes to the combination of the technologies of fuzzy sets and neural networks, we can distinguish two key facets that should be taken into account:

  architectural
  temporal

Fig. 6.26 emphasizes two main features that are essential in establishing any vital relationship between fuzzy and neural computation: the plasticity and the explicit knowledge representation of the resulting neuro-fuzzy structure.


Figure 6.26  Synergy between fuzzy sets and neural networks

The strength of the interaction itself can vary from a level at which the technologies of fuzzy sets and neurocomputing are simply combined and coexist to the highest level, at which there exists a genuine fusion between them.

6.9.1. Architectures of hybrid fuzzy neural systems

The essence of the architectural interaction of fuzzy sets and neural networks is shown in Fig. 6.27. To a significant extent the form of interaction is similar to that encountered in fuzzy modelling (Pedrycz, 1995), cf. Fig. 6.28.


Figure 6.27  Architectural synergy of fuzzy sets and neural networks

Fuzzy sets are more visible at the input and output layers of any multilayer structure of the network. The role of these layers is much more oriented toward capturing the semantics of data rather than focusing on pure and context-free numeric processing.
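As an illustrative sketch of such a semantics-oriented input layer, a numeric input can be encoded through membership degrees of a few linguistic terms before any hidden-layer processing takes place. The linguistic terms, their triangular parameters, and the temperature example are assumptions made for illustration only.

```python
import numpy as np

def triangular(x, a, m, b):
    """Triangular membership function with modal value m and bounds a, b."""
    return np.maximum(0.0, np.minimum((x - a) / (m - a), (b - x) / (b - m)))

# Illustrative input layer: encode a numeric temperature through three
# assumed linguistic descriptors; the degrees feed the hidden layers.
terms = {"low": (0, 10, 20), "medium": (15, 25, 35), "high": (30, 40, 50)}

x = 22.0
encoded = {name: float(triangular(x, *p)) for name, p in terms.items()}
print(encoded)  # {'low': 0.0, 'medium': 0.7, 'high': 0.0}
```

The hidden layers then operate on these membership degrees rather than on the raw, context-free numeric value.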


Figure 6.28  Architectural links between fuzzy sets and neural networks considered in the setting of fuzzy modelling

6.9.2. Temporal aspects of interaction in fuzzy-neural systems

The temporal aspects of interaction arise when dealing with the various levels of intensity of learning. The updates of the connections are much more vigorous at the hidden layers; we conclude that their plasticity is higher than that of the layers situated close to the input and output of the network. This concurs with our observation that the input and output layers are associated with a predetermined activity of interacting with the environment. The connections established there are far more solid than the other connections of the network, and any changes at this level should be exercised only once all other learning possibilities have been exhausted.
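One simple way to realize this plasticity profile is to scale the learning rate of each layer by its distance from the input and output boundaries, so that the middle layers learn most vigorously while the boundary layers stay nearly frozen. The function below is a hypothetical sketch; the rate values and the linear interpolation are assumptions, not a scheme prescribed in the text.

```python
# Sketch of a layer-dependent learning rate: hidden layers receive the
# most vigorous updates, while the semantically anchored input and output
# layers are nearly frozen. All numeric values are illustrative assumptions.
def layer_learning_rate(layer_idx, n_layers, base_rate=0.1, frozen_rate=0.005):
    # Distance from the nearer boundary layer, normalized to [0, 1].
    depth = min(layer_idx, n_layers - 1 - layer_idx) / max(1, (n_layers - 1) / 2)
    depth = min(depth, 1.0)
    return frozen_rate + (base_rate - frozen_rate) * depth

rates = [layer_learning_rate(i, 5) for i in range(5)]
print([round(r, 4) for r in rates])  # peaks at the middle layer
```

During training, each layer's weight updates would be multiplied by its rate, so boundary-layer connections change only marginally per epoch.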


Figure 6.29  Plasticity of a layer as a function of its position in the neural network

6.10. Conclusions

This chapter was devoted to the development of the interaction between fuzzy sets and the technology of neural networks. The literature delivers an abundance of neurofuzzy architectures. To fully benefit from them rather than being misled by their variety, it is important to establish a transparent taxonomy. We attempted to take such a systematic look at these approaches by identifying various levels of synergy as well as formulating the areas of interaction that produce genuine enhancements of the existing neural or fuzzy systems. Fig. 6.30 summarizes a variety of symbiotic links between fuzzy sets and neurocomputing.


Figure 6.30  Selected examples of synergy between fuzzy sets and neural networks

6.11. Problems

6.1. Analyze the role of the steepness factor in the sigmoidal nonlinearity of the neuron in the learning algorithm. Propose learning metarules in the form:

  if learning time and training time then steepness of the nonlinearity

6.2. Discuss the main differences between the possibility-necessity preprocessing layer of a neural network and the same layer implementing computations with a compatibility measure. Outline advantages and disadvantages of these two approaches.
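As a starting point for this problem, the standard possibility and necessity measures of a fuzzy datum X with respect to a fuzzy set A on a finite universe can be computed directly; the two example membership vectors below are illustrative assumptions.

```python
import numpy as np

# Standard definitions on a finite universe:
#   Poss(X, A) = sup_x min(X(x), A(x))
#   Nec(X, A)  = inf_x max(1 - X(x), A(x))
def possibility(X, A):
    return float(np.max(np.minimum(X, A)))

def necessity(X, A):
    return float(np.min(np.maximum(1.0 - X, A)))

X = np.array([0.0, 0.4, 1.0, 0.4, 0.0])   # illustrative fuzzy input
A = np.array([0.2, 0.6, 0.9, 0.3, 0.0])   # illustrative reference fuzzy set

print(possibility(X, A), necessity(X, A))  # 0.9 0.6
```

A preprocessing layer built from these two measures produces an interval-like pair of activation levels per reference fuzzy set, which can then be contrasted with a single compatibility-based activation.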

6.3. Given is a training set of heterogeneous data (x(k), y(k)) in the form

Here T(m, a, b) stands for a triangular fuzzy number with the modal value equal to “m” and two bounds defined by “a” and “b”. Propose a topology of a neural network and discuss a relevant learning algorithm. What would be the output of the network for the first input equal to 4.5 and its linguistic version T(4.5, 3, 6)? Hint: the remaining inputs can be modeled as unknown.
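Assuming the triangular fuzzy number is written T(m, a, b) with modal value m and bounds a and b, its membership function can be evaluated as follows; the sample points other than those quoted in the problem are illustrative.

```python
def tri_membership(x, m, a, b):
    """Membership of x in the triangular fuzzy number T(m, a, b):
    modal value m, lower bound a, upper bound b."""
    if x <= a or x >= b:
        return 0.0
    if x <= m:
        return (x - a) / (m - a)
    return (b - x) / (b - m)

# The linguistic version T(4.5, 3, 6) from the problem: membership peaks
# at the modal value 4.5 and vanishes at the bounds 3 and 6.
print(tri_membership(4.5, 4.5, 3.0, 6.0))  # 1.0
print(tri_membership(3.0, 4.5, 3.0, 6.0))  # 0.0
```

The crisp input 4.5 coincides with the modal value of T(4.5, 3, 6), so the two versions of the input should drive the network in a consistent way.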

6.4. It is obvious that a linearly nonseparable Exclusive-OR problem requires a neural network with a hidden layer. Could you avoid it by transforming the training data in a nonlinear way by using appropriate fuzzy sets or fuzzy relations?
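One possible line of attack on this problem can be sketched numerically: augmenting the inputs with their fuzzy AND (a t-norm such as min) makes XOR linearly separable, so a single threshold unit suffices. The particular weights and threshold below are assumptions chosen for illustration, not a unique answer.

```python
# Sketch: XOR becomes linearly separable after a nonlinear fuzzy transform.
# Augment (x1, x2) with min(x1, x2) (a t-norm, fuzzy AND); a single linear
# threshold unit then separates the classes. Coefficients are assumptions.
def xor_via_fuzzy_features(x1, x2):
    and_deg = min(x1, x2)                  # fuzzy AND of the inputs
    s = x1 + x2 - 2.0 * and_deg            # linear in (x1, x2, and_deg)
    return 1 if s >= 0.5 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_via_fuzzy_features(x1, x2))
```

The nonlinearity has simply been moved from the hidden layer into the fuzzy preprocessing of the training data.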

6.5. Interpret the following small neural network, Fig. 6.31, by generating corresponding “if-then” rules. Each neuron is equipped with a threshold nonlinear element, meaning that its output is equal to 1 if the weighted sum of its inputs equals or exceeds T; otherwise the neuron is turned off (the output is zero). The threshold level is equal to 0.5.


Figure 6.31  Neural network to be interpreted
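The rule-extraction procedure asked for in this problem can be sketched as follows. Since the weights of Fig. 6.31 are not reproduced here, the weight values below are purely hypothetical; only the threshold T = 0.5 comes from the problem statement.

```python
from itertools import product

T = 0.5  # threshold from the problem statement

def threshold_neuron(inputs, weights):
    """Output 1 if the weighted sum of inputs equals or exceeds T, else 0."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= T else 0

# Hypothetical weights standing in for those of Fig. 6.31, used only to
# illustrate the enumeration-based rule-extraction procedure.
hidden_w = [[0.6, -0.4], [-0.4, 0.6]]   # one weight vector per hidden neuron
output_w = [0.6, 0.6]                   # weights of the output neuron

for x in product((0, 1), repeat=2):
    h = [threshold_neuron(x, w) for w in hidden_w]
    y = threshold_neuron(h, output_w)
    if y == 1:
        print(f"if x1 = {x[0]} and x2 = {x[1]} then y = 1")
```

Enumerating all binary input combinations and recording those that fire the output neuron yields the “if-then” rules the network encodes.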



Copyright © CRC Press LLC
