

All rules were kept in a separate rule base. The inference engine would load rules into a rule-interpreter that would determine a match between the existing input data or intermediate conclusion and the clauses of the rules. Given a match, the rule would "fire," producing a new conclusion with a particular uncertainty.
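A minimal sketch of such a match-fire cycle is given below, in Python. The rule format, the fact names, and the particular way certainties are combined are illustrative assumptions, not a description of PROSPECTOR's actual implementation.

# Hypothetical rules: a set of premise clauses, a conclusion, and a rule strength.
rules = [
    {"if": {"porous rock", "sulfide traces"}, "then": "mineral deposit", "cf": 0.7},
    {"if": {"mineral deposit", "fault zone"}, "then": "drilling prospect", "cf": 0.6},
]

def run(facts):
    """Repeatedly fire every rule whose clauses all match until nothing changes."""
    facts = dict(facts)          # maps each fact to its current certainty
    changed = True
    while changed:
        changed = False
        for rule in rules:
            if rule["if"] <= facts.keys() and rule["then"] not in facts:
                # conclusion certainty: weakest matching clause scaled by rule strength
                facts[rule["then"]] = min(facts[f] for f in rule["if"]) * rule["cf"]
                changed = True
    return facts

print(run({"porous rock": 0.9, "sulfide traces": 0.8, "fault zone": 1.0}))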

PROSPECTOR explored plausible reasoning to a much greater extent than its successors. It introduced odds as the basic element. Odds can be expressed by means of Bayesian statistics:

O(H|E) = LS * O(H), where O(H) = P(H)/(1 - P(H))

with the nomenclature given as follows:

O = odds
H = a given hypothesis
E = a given piece of evidence
LS = measure of sufficiency
P(H) = probability that a hypothesis is true

The numbers given in the hypothesis part of the rule above represent the measure of sufficiency and the measure of necessity, respectively. Both are likelihood ratios.

LS = P(E|H)/P(E|~H) is the measure of sufficiency: it prescribes the factor by which the odds on the hypothesis H, and thus its probability, are updated when the evidence E is observed. If LS is large for a particular rule, observing E strongly encourages H; if LS is small, observing E discourages the hypothesis.

LN = P(~E|H)/P(~E|~H) expresses the measure of necessity. A very low value for LN means that the hypothesis is strongly weakened if the prescribed evidence is missing, i.e., the evidence is necessary for the hypothesis, while a value of LN close to one means that the absence of that particular evidence carries little weight.
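The two updates can be illustrated with a short Python sketch; the prior probability and the LS/LN values below are illustrative numbers, not taken from the text.

def to_odds(p):
    return p / (1.0 - p)

def to_prob(o):
    return o / (1.0 + o)

def update(prior_p, ls, ln, evidence_present):
    """Posterior probability of H: odds scaled by LS if E is observed, by LN if E is absent."""
    factor = ls if evidence_present else ln
    return to_prob(factor * to_odds(prior_p))

# P(H) = 0.1, LS = 20 (observing E strongly encourages H), LN = 0.2 (missing E weakens H)
print(update(0.1, 20, 0.2, True))    # about 0.69
print(update(0.1, 20, 0.2, False))   # about 0.02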

The measures given provide a metric for the necessary availability and importance of a set of evidence in reaching a particular conclusion. One rule may overturn a conclusion despite a vast amount of evidence in its favor if, among the existing data, there is a single piece of evidence that strongly opposes it. The different rules were organized in an inference network.

The developers behind PROSPECTOR pioneered the use of fuzzy inferencing by applying fuzzy logical relations to the suite of clauses supporting a hypothesis. The propagated evidential strength for a set of conjunctive clauses would be the minimum. For a set of disjunctive clauses, the propagated value would be the maximum value among them.
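In code, the propagation rule is simply a minimum or a maximum over the clause strengths, as in this small sketch (the strengths are illustrative):

def conjunction(strengths):
    # all clauses are required, so the weakest one limits the hypothesis
    return min(strengths)

def disjunction(strengths):
    # any clause suffices, so the strongest one carries the hypothesis
    return max(strengths)

clauses = [0.8, 0.4, 0.6]
print(conjunction(clauses))   # 0.4
print(disjunction(clauses))   # 0.8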

The major architectural features characterizing the system can be summarized as follows:

  1. Rule networks for expressing judgmental knowledge
  2. Semantic networks for expressing the meaning of the propositions employed in the rules
  3. Taxonomic networks for representing static knowledge about the relations among the terms in the domain

These features were exploited when Reboh built the knowledge acquisition system KAS (Reboh 79) and added it to PROSPECTOR to reduce the burden on the knowledge engineers. KAS is a network editor that allows the user to add nodes to expand and modify the knowledge structures in PROSPECTOR in a fairly straightforward manner. A network of rules is illustrated in Figure 1.
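As a rough sketch of how several rules in such a network can bear on the same hypothesis, the snippet below multiplies the odds on a hypothesis by the LS of each observed piece of evidence, under the assumption that the pieces of evidence are independent; the numbers are illustrative.

def combined_posterior(prior_p, ls_values):
    """Combine independent pieces of observed evidence on one hypothesis node."""
    odds = prior_p / (1.0 - prior_p)
    for ls in ls_values:
        odds *= ls            # each observed piece of evidence scales the odds by its LS
    return odds / (1.0 + odds)

# prior P(H) = 0.05, two observed pieces of evidence with LS = 5 and LS = 3
print(combined_posterior(0.05, [5, 3]))   # about 0.44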

2.3. DIPMETER ADVISOR

Davis et al.
MIT/Schlumberger-Doll Research Centre/Fairchild Labs for AI Research, 1981

A detailed description of DIPMETER ADVISOR can be found in the account of Davis et al. (Davis 81) and that of Baker a few years later (Baker 84). DIPMETER ADVISOR was patterned after the first expert systems, such as MYCIN and PROSPECTOR; beyond the technical issues it addressed, the most interesting aspect was that it was brought to full commercialization. In 1984, it had established its position in the Schlumberger organization on a basis similar to that of other computer programs in routine use. Few expert systems had achieved recognition at that level before 1985.


FIGURE 1 Combining rules into networks.

DIPMETER ADVISOR takes data from dipmeters that are lowered into bore holes. The data are plotted on logs. These are long sheets of paper with records of how different kinds of energy such as sonic, nuclear, and electrical energy interact with the geological formation surrounding the bore hole. The expert system attempts to emulate a special type of interpretation knowledge used to determine the nature of the formation.

The original version of the system was purely rule based, like MYCIN and PROSPECTOR. However, it underwent significant development. Many of the changes were driven by a new understanding of the knowledge engineering work, but the system was also influenced by the rapid changes in the log domain itself. This called for greater flexibility and posed a problem that remains partly unresolved today: how to cope with a knowledge engineering effort in which the domain knowledge undergoes continued evolution.

Eventually, DIPMETER ADVISOR emerged as a hybrid system. It incorporated knowledge of the qualitative aspects of the log interpretation task as an interacting set of knowledge bodies, including expertise on mineralogy, petrophysics, geology, and the physics of the behavior of the individual logging and drilling tools themselves. This qualitative knowledge was in turn coupled with numerical approximation techniques, algebra, and calculus.

In order to handle the diversity of this knowledge, several means of knowledge representation were called upon. One paramount aspect that DIPMETER ADVISOR pioneered was the use of "deeper" knowledge to augment the heuristics emphasized in the beginning. Deep knowledge was represented in terms of causal and aggregate structures. As static knowledge, this was supported by frame structures.
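A small sketch of a frame with taxonomic, aggregate, and causal slots is shown below; the slot names and the geological example are hypothetical, chosen only to illustrate the style of representation, not taken from DIPMETER ADVISOR itself.

from dataclasses import dataclass, field

@dataclass
class Frame:
    name: str
    is_a: str = ""                                     # taxonomic link
    parts: list = field(default_factory=list)          # aggregate structure
    causes: list = field(default_factory=list)         # causal structure
    slots: dict = field(default_factory=dict)          # other static attributes

delta = Frame(
    name="river-dominated delta",
    is_a="delta",
    parts=["distributary channel", "crevasse splay", "delta front"],
    causes=["upward-coarsening grain-size pattern"],
    slots={"typical thickness": "tens of meters"},
)
print(delta.is_a, delta.parts)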

The control of the system was flexible. Meta-programming was introduced in the form of a goal-directed module. This module incorporated a model of the problem-solving process itself, making it possible for DIPMETER ADVISOR to reason about its own actions and to pursue and shift objectives in accordance with the state of the problem-solving process. This contrasted with the early versions, whose control looked much the same as PROSPECTOR's, with goals fixed and blended into the core knowledge. The meta-module would rationalize the efforts of DIPMETER ADVISOR and reduce the chance of pursuing dead leads. A simple illustration of this concept is given in Figure 2.


FIGURE 2 Conceptual description of the goal-directed meta-operation.
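A much-simplified sketch of such goal-directed control is given below: a meta-level function inspects the state of the problem-solving process and returns the next objective, rather than following a fixed goal order. The goal and state names are hypothetical.

def select_goal(state):
    """Meta-level control: pick the objective that the current state calls for."""
    if not state["structural_dip_estimated"]:
        return "estimate structural dip"
    if state["conflicting_evidence"]:
        return "resolve conflicting interpretations"
    if not state["stratigraphic_features_detected"]:
        return "detect stratigraphic features"
    return "report conclusions"

state = {"structural_dip_estimated": True,
         "conflicting_evidence": False,
         "stratigraphic_features_detected": False}
print(select_goal(state))   # -> "detect stratigraphic features"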

