Some example characteristics of the referential neurons are portrayed in Fig. 7.13.
To model complex situations, the referential neurons can be arranged in the form of a neural network. An example is the tolerance neuron, which consists of DOMINANCE and INCLUSION neurons placed in the hidden layer and a single AND neuron in the output layer (Fig. 7.14).
This neuron generates a tolerance region, as shown in Fig. 7.15.
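To make the construction more concrete, a minimal computational sketch is given below. It assumes the Lukasiewicz implication, min and max as the t- and s-norms, and an OR-type formulation of the referential neurons; the function names (inclusion_neuron, dominance_neuron, tolerance_neuron) and the one-dimensional example are illustrative choices, not the notation of the text.

```python
# Sketch of a tolerance neuron built from referential neurons.
# Assumptions: Lukasiewicz implication, min/max as t-/s-norms,
# OR-type referential neurons; names are illustrative.

def implication(a, b):
    """Lukasiewicz implication a -> b = min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

def inclusion_neuron(x, ref, w):
    """OR-type referential neuron with the INCLUSION reference operation:
       y = max_i [ w_i AND (x_i -> ref_i) ], with AND realized as min."""
    return max(min(wi, implication(xi, ri)) for xi, ri, wi in zip(x, ref, w))

def dominance_neuron(x, ref, w):
    """OR-type referential neuron with the DOMINANCE reference operation:
       y = max_i [ w_i AND (ref_i -> x_i) ]."""
    return max(min(wi, implication(ri, xi)) for xi, ri, wi in zip(x, ref, w))

def tolerance_neuron(x, low, high, w_dom, w_incl):
    """Tolerance neuron: DOMINANCE and INCLUSION neurons in the hidden layer,
       aggregated by an AND (min) neuron in the output layer."""
    z_dom = dominance_neuron(x, low, w_dom)     # "x is at least `low`"
    z_incl = inclusion_neuron(x, high, w_incl)  # "x is at most `high`"
    return min(z_dom, z_incl)                   # AND aggregation

# One-dimensional example with tolerance region [0.3, 0.7]
for xv in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(xv, round(tolerance_neuron([xv], [0.3], [0.7], [1.0], [1.0]), 2))
```

With the weights set to 1, the output equals 1 exactly on [0.3, 0.7] and decreases linearly outside this interval, mirroring the tolerance region of Fig. 7.15.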
The ordinal sum of t-norms can be used in another model of a fuzzy neuron, with more diversified processing occurring at the synaptic level. Assume that the number of t-norms as well as their types have been fixed. We leave, however, the ranges of the individual t-norms variable and regard them as adjustable parameters of the neuron. In a concise form, the output of the neuron can then be expressed as a function of the inputs, the connections, and the vectors a and b collecting the ranges of the contributing t-norms.

7.4. Learning in fuzzy neural networks

The issue of learning embraces a number of essential components. The first is the question of learning with respect to the level of available supervision. The second is whether we deal with structural or parametric changes of the network, or with both. We commence with these issues and then focus on a number of case studies that illustrate several learning procedures.

7.4.1. Learning policies for parametric learning in fuzzy neural networks

The learning of a fuzzy neural network embraces a number of diverse scenarios. It usually depends heavily on the initial information available about the problem, which can be immediately accommodated in the network. For instance, in many situations it is obvious in advance that some connections will be weak or even nonexistent. This allows us to build an initial configuration of the network that departs substantially from a fully connected one. Such initial knowledge tangibly enhances the learning procedure, eliminating the need to modify all the connections of the network and sparing us from learning from scratch. On the other hand, if the initial domain knowledge about the problem (network) is not sufficient, then a fully connected structure, yielding higher values of its entropy function (Machado and Rocha, 1990; Rocha, 1992), would be strongly recommended. In many cases the role of the individual layers is also obvious, so that one can project the behavior of the network (and evaluate its learning capabilities) in this respect. The following two general strategies of learning are worth pursuing:
7.4.2. Performance index

One can easily envision a number of different ways in which the similarity (distance) between the outputs of the neural network and the target fuzzy sets can be expressed. While the Euclidean distance is common, we confine ourselves to the equality index. There are two reasons for studying this option. First, the equality index retains a clear logic- and set-oriented interpretation that could be of particular interest in the setting of fuzzy neural networks. Second, with a suitable selection of the underlying t-norms, it may assure some robustness features. Let x and y denote two membership values, x, y ∈ [0, 1]. Let us recall that the equality index computes the level of similarity between these two elements in the following way:

x ≡ y = (x → y) ∧ (y → x)

where → denotes an implication and ∧ stands for the minimum. Owing to the robustness aspects we are interested in, it is instructive to concentrate on the Lukasiewicz implication of the form

x → y = min(1, 1 - x + y).

After simple calculations we derive the following expression:

x ≡ y = 1 - x + y,  if x > y
x ≡ y = 1,          if x = y
x ≡ y = 1 - y + x,  if x < y

The equality index becomes a piecewise linear function of its arguments. Evidently, x ≡ y = 1 if and only if x = y. The equality index can be rewritten in a slightly different manner by observing that both 1 - x + y and 1 - y + x can be combined into a single expression 1 - |x - y|, which is nothing but a complement of the Hamming distance between x and y. Thus

x ≡ y = 1 - |x - y|.

It is also apparent that the Lukasiewicz-based equality index, being based on the L1 metric, promotes significant robustness properties of the ensuing fuzzy neural network.
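As an illustration, the sketch below implements the equality index in the two-way-implication form used above; the function names and the averaged performance measure at the end are hypothetical conveniences, not definitions from the text, and they assume the Lukasiewicz implication with min as the AND operation.

```python
# Sketch of the Lukasiewicz-based equality index (illustrative names).

def lukasiewicz_implication(a, b):
    """Lukasiewicz implication a -> b = min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

def equality_index(x, y):
    """Equality index (x -> y) AND (y -> x), with AND taken as min.
       With the Lukasiewicz implication this reduces to 1 - |x - y|,
       the complement of the Hamming (L1) distance between x and y."""
    return min(lukasiewicz_implication(x, y), lukasiewicz_implication(y, x))

# The piecewise-linear form and the closed form 1 - |x - y| agree:
for x, y in [(0.2, 0.2), (0.8, 0.5), (0.5, 0.8), (1.0, 0.0)]:
    assert abs(equality_index(x, y) - (1.0 - abs(x - y))) < 1e-12
    print(x, y, equality_index(x, y))

# A hypothetical way of using the index as a performance measure:
# average the pointwise equality between network outputs and targets.
def mean_equality(outputs, targets):
    """Mean equality index over corresponding membership values."""
    pairs = list(zip(outputs, targets))
    return sum(equality_index(o, t) for o, t in pairs) / len(pairs)
```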