Thus the standard normalization condition occurring in FCM is replaced by the involvement (conditioning) constraint. The optimization problem is now reformulated accordingly (Pedrycz, 1996)

$$\min_{U,\, v_1, v_2, \ldots, v_c} Q, \qquad Q = \sum_{i=1}^{c} \sum_{k=1}^{N} u_{ik}^{m}\, \|x_k - v_i\|^2$$

subject to

$$\sum_{i=1}^{c} u_{ik} = f_k, \qquad k = 1, 2, \ldots, N$$

where f_k ∈ [0, 1] denotes the level of involvement of the k-th pattern implied by the context.
Again the minimization of the objective function is carried out iteratively, where

$$u_{ik} = \frac{f_k}{\displaystyle \sum_{j=1}^{c} \left( \frac{\|x_k - v_i\|}{\|x_k - v_j\|} \right)^{2/(m-1)}}$$

We arrive at the above formula by transforming the minimization problem into a standard unconstrained optimization by making use of Lagrange multipliers and determining a critical point of the resulting function. The computations of the prototypes are the same as for the original FCM method,

$$v_i = \frac{\sum_{k=1}^{N} u_{ik}^{m}\, x_k}{\sum_{k=1}^{N} u_{ik}^{m}}$$

Moreover, the convergence conditions for the method are the same as discussed for the original FCM algorithm.
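To fill in the Lagrange-multiplier step, here is a brief sketch of the derivation (a standard FCM-style calculation, with f_k replacing the usual unit sum; the shorthand d_{ik} for the squared distance is ours):

```latex
% Sketch: conditional membership update for a fixed pattern x_k.
% Shorthand: d_{ik} = ||x_k - v_i||^2. Form the Lagrangian of Q with
% the involvement constraint for the k-th column of the partition matrix:
\[
  L = \sum_{i=1}^{c} u_{ik}^{m} d_{ik}
      - \lambda \Bigl( \sum_{i=1}^{c} u_{ik} - f_k \Bigr)
\]
% Zeroing the gradient with respect to u_{ik}:
\[
  \frac{\partial L}{\partial u_{ik}} = m\, u_{ik}^{m-1} d_{ik} - \lambda = 0
  \quad\Longrightarrow\quad
  u_{ik} = \Bigl( \frac{\lambda}{m\, d_{ik}} \Bigr)^{1/(m-1)}
\]
% Substituting into \sum_i u_{ik} = f_k eliminates \lambda and yields
\[
  u_{ik} = \frac{f_k}{\sum_{j=1}^{c} ( d_{ik} / d_{jk} )^{1/(m-1)}}
         = \frac{f_k}{\sum_{j=1}^{c}
           \bigl( \|x_k - v_i\| / \|x_k - v_j\| \bigr)^{2/(m-1)}}
\]
```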

The context has a dominant effect on the performance of the clustering mechanism. If f < f′, then the population of the patterns involved in the grouping under context f is lower than that under f′, since each pattern enters the clustering with a smaller involvement level.

One should indicate that the membership values of contexts do not sum up to 1; a similar phenomenon can be witnessed in possibilistic clustering (Krishnapuram and Keller, 1993) and in clustering with a noise cluster (Dave, 1992). However, the origins of these two departures from the original constraint are completely different.

The computations of the membership functions result directly from the FCM model. In particular,

$$u_j(x) = \frac{1}{\displaystyle \sum_{l=1}^{c} \left( \frac{\|x - v_j\|}{\|x - v_l\|} \right)^{2/(m-1)}}, \qquad j = 1, 2, \ldots, c$$

with the same distance function as encountered in the original method.
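As an illustration, the following Python sketch puts the two update formulas together into one conditional FCM loop; the function name, the random initialization, and the convergence test are our assumptions, while the two update formulas come from the text above.

```python
import numpy as np

def conditional_fcm(X, f, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Conditional (context-based) FCM: memberships of pattern k sum to f[k].

    X : (N, n) array of patterns; f : (N,) involvement levels in [0, 1].
    """
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    # random initial partition, scaled so that column sums equal f
    U = rng.random((c, N))
    U = U / U.sum(axis=0) * f
    for _ in range(n_iter):
        Um = U ** m
        # prototypes: same formula as in the original FCM
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # squared distances D[i, k] = ||x_k - v_i||^2 (small offset avoids 0/0)
        D = ((X[None, :, :] - V[:, None, :]) ** 2).sum(axis=2) + 1e-12
        # membership update: u_ik = f_k / sum_j (d_ik / d_jk)^{1/(m-1)}
        Dm = D ** (-1.0 / (m - 1.0))
        U_new = f * Dm / Dm.sum(axis=0)
        if np.abs(U_new - U).max() < tol:
            return U_new, V
        U = U_new
    return U, V
```

Setting f to all ones recovers the standard FCM normalization, while patterns with f[k] = 0 are effectively excluded from the grouping.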

The neural network is developed by starting with data mining of the experimental data. The development of the model hinges on a specification of a series of contexts (fuzzy sets) defined for the dependent variable; denote these by A_1, A_2, …, A_p. In particular, these fuzzy sets are normal and overlap to some extent. For each context, say A_i, we run the FCM method and form the rule-based models with the rules

if x is Ω_j then y is A_i,  j = 1, 2, …, c,

where Ω_1, Ω_2, …, Ω_c are the regions in the input space that are centered around the "c" prototypes constructed by the clustering method. Thus the complete model is designed by considering all contexts A_i, i = 1, 2, …, p, and associating with each of them a collection of induced clusters (obtained by running the clustering scheme).

The architecture of the model, Fig. 6.21, is highly modular and is developed around the clusters implied by the individual contexts. By changing the contexts (let us stress that this facet of modeling is totally under the control of the model’s designer), one produces different clusters that may lead to significantly distinct neural networks.


Figure 6.21  Cluster-based design of neural networks

The activation levels of the contexts are aggregated additively, yielding

$$Y = z_1 \otimes A_1 \oplus z_2 \otimes A_2 \oplus \cdots \oplus z_p \otimes A_p$$

where z_i denotes the overall activation level of the clusters induced by the i-th context (here the addition ⊕ and multiplication ⊗ are completed for fuzzy numbers rather than plain numeric quantities). Interestingly enough, the output of the model becomes a fuzzy set (Y), while the output unit can be regarded as an artificial neuron with fuzzy connections. This comes as an interesting example justifying the existence of neurons with fuzzy connections. One can eventually proceed with a reduced (simplified) version of the model by replacing the weights of the summation node with the modal values of the corresponding contexts. The number of clusters can be adjusted by pruning the groups whose connections to the neurons are the weakest (when evaluated in terms of the absolute values of their magnitudes).
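As a small illustration of the additive aggregation, the sketch below represents each context A_i as a triangular fuzzy number (left, modal, right) and assumes nonnegative activation levels, under which scaling and addition of triangular fuzzy numbers act parameter-wise; the triangular representation and all names are our assumptions.

```python
import numpy as np

def model_output(z, contexts):
    """Additive aggregation Y = z_1*A_1 + ... + z_p*A_p.

    z        : nonnegative activation levels of the p contexts
    contexts : (p, 3) array; each row is a triangular fuzzy number
               A_i = (left, modal, right) on the output space.
    Returns the triangular fuzzy output Y = (left, modal, right).
    """
    z = np.asarray(z, dtype=float)[:, None]     # (p, 1)
    A = np.asarray(contexts, dtype=float)       # (p, 3)
    # scaling by z >= 0 scales all three parameters of a triangular
    # number; adding triangular numbers adds them parameter-wise
    return (z * A).sum(axis=0)

# usage: three contexts, say Small, Medium, Large, on the output variable
contexts = [(0.0, 1.0, 2.0), (1.0, 2.0, 3.0), (2.0, 3.0, 4.0)]
print(model_output([0.2, 0.7, 0.1], contexts))  # fuzzy output (l, m, r)
```

Keeping only the modal value of each context in the sum reproduces the reduced (simplified), purely numeric version of the model mentioned above.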

Another option for exploiting the already gained clustering information is to use the membership values in the minimized performance index. In other words, some patterns become discounted in the performance index that guides the learning of the neural network. As a performance index we introduce a weighted sum of errors defined with the help of the clustering results,

$$Q = \sum_{k=1}^{N} \xi(x(k))\, \big( y(k) - \hat{y}(k) \big)^2$$

where y(k) is the target value, ŷ(k) is the network output, and ξ(x(k)) depends on the obtained membership values; for instance, one can take ξ(x(k)) to be the highest membership grade, max_i u_i(x(k)). Thus if the pattern is very much "borderline" in terms of its membership grades,

$$u_1(x(k)) \approx u_2(x(k)) \approx \cdots \approx u_c(x(k)) \approx \frac{1}{c},$$

the above performance index discounts this particular data point.
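A minimal sketch of this weighting in Python, using the illustrative choice of ξ as the maximal membership grade:

```python
import numpy as np

def weighted_performance_index(U, y_target, y_pred):
    """Weighted sum of squared errors guiding network learning.

    U[i, k] is the membership of pattern k in cluster i, so borderline
    patterns (all memberships near 1/c) receive small weights and are
    discounted in the index.
    """
    xi = U.max(axis=0)                  # xi(x(k)) = max_i u_i(x(k))
    return np.sum(xi * (y_target - y_pred) ** 2)
```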

6.8. Linguistic interpretation of neural networks

An important role of fuzzy sets arising in the setting of neurocomputing concerns the interpretation abilities they add to neural networks. We discuss two interpretation algorithms: the first is geared toward feedforward neural networks and the elicitation of rules; the second concerns the interpretation of self-organizing maps.

6.8.1. From neural networks to rule-based systems

As is predominantly emphasized in the literature, neural networks are black boxes. Their learning abilities are their most important design asset. The distributed character of the processing therein also contributes to the difficulty of developing an explicit format for the knowledge acquired by the network during its learning activities. It therefore becomes of paramount interest to interpret the main relationships approximated by the network in an explicit format. The input-output interpretation of any network requires a determination of interesting input-output dependencies. In particular, these could be rules. To make the rules meaningful, the granularity of the antecedents (conditions) and consequents (conclusions) should be expediently selected. Purely numeric rules, say

if x = 3.45 then y = 0.71,

do not make much sense: the numeric quantification makes the rules extremely specific and overly brittle. On the other hand, rules including linguistic terms do make a lot of sense, being more general and easier to interpret. Let us then define a number of linguistic landmarks in the space of the inputs and outputs of the network, Fig. 6.22.
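To make the idea concrete, here is an illustrative sketch (the triangular landmarks, the probing scheme, and all names are our assumptions): probe the trained network over its input range, translate each numeric input-output pair into its best-matching linguistic landmarks, and collect the resulting "if-then" statements.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with modal value b."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

# linguistic landmarks (fuzzy sets) shared here by input and output spaces
landmarks = {"negative": (-2.0, -1.0, 0.0),
             "zero":     (-1.0,  0.0, 1.0),
             "positive": ( 0.0,  1.0, 2.0)}

def best_landmark(v):
    """Name of the landmark with the highest membership at v."""
    return max(landmarks, key=lambda name: tri(v, *landmarks[name]))

def extract_rules(network, inputs):
    """Collect 'if x is ... then y is ...' statements from probed I/O pairs."""
    rules = set()
    for x in inputs:
        y = network(x)
        rules.add((best_landmark(x), best_landmark(y)))
    return rules

# usage with a toy "network" (here just a squashing function)
net = lambda x: np.tanh(x)
for ante, cons in sorted(extract_rules(net, np.linspace(-2, 2, 41))):
    print(f"if x is {ante} then y is {cons}")
```

Coarser or finer landmark families trade generality of the extracted rules against their fidelity to the underlying numeric mapping.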


Figure 6.22  Derivation of “if - then” statements with linguistic landmarks (fuzzy sets)

