2. BACKGROUND

2.1. THE DEVELOPMENT OF LOGIC

Linguistic representation of relations among objects, actions, patterns, situations, and events was a very early development of human thinking, and computer representation continued this line in a not very different way. This syllogistic-linguistic method was formalized by Aristotle nearly 24 centuries ago and has been continuously refined to express ever more delicate relations of situation patterns. Linguistics, philosophy, mathematics, and recently computer science have joined these efforts. Those who devote effort to reading the original works of the Greek philosophers (Aristotle, the thinkers of the Megara school, and the Stoics) and later the Arabic, French, and English authors of the Middle Ages will be surprised at how many ideas can be found there that are usually considered achievements of computer science and of recent decades only. A new age, starting with Leibniz in the 17th century and continued by the greatest mathematicians and philosophers of the 18th and 19th centuries (such as Boole, Frege, and Babbage), led directly to the modern developments of logic, to Gödel and Tarski, and to the immediate predecessors of computer science, such as Turing, Church, and von Neumann. The Leibnizian line prepared a more precise mathematical notation and a representation of different conceptual relations that could be converted easily into machine representations. The Boolean line prepared the techniques for manipulating conceptual formulae, used in practically every basic computer calculation. The latest progress, now bound up with computer science, can be characterized by efforts to bridge the gap between the closed world of classical logic and the open world of reality. The beautiful game of logic could be perfect only until it was used for real-life decision support: every concept was well-defined, unambiguous, and suited to the celebrated law of the excluded middle (tertium non datur).
Real life is not like this; it is full of contradictions, ambiguities, and changing definitions and interpretations. After the ancient restriction of logic to fixed conditions was relaxed by modal logic, intensional logic addressed interpretations, nonmonotonic logic the contradictions, and situation and discourse logic the personal and social conditions of communication. Dependency on time was formulated by temporal logic, a kind of modality. The concept of different possible worlds, originating in the 17th-18th centuries, received a practical computer-representational form. A discussion of these developments follows. Here we mention, among many others, the names of Wittgenstein, Montague, Kripke, and McCarthy. The inward-looking problems of logical closure, provability, and computability, and the philosophical mysteries of the infinite and infinitely complex world and worlds, are intrinsically related but not solved, and possibly never will be. The builder of an expert system designs the knowledge representation with conscious criticism of the related compromises. Chains of concepts and events are represented by the frame idea of Minsky and the semantic nets of Schank, which describe scripts of events. The object-oriented direction is the most important computer-programming representation of these developments.

2.2. UNCERTAINTY

The science of uncertainty is much younger. Aristotle remarked that he did not deal with it because it admits no science. Indeed, it was only in the 17th century that the science of uncertainty started, at first and for a long time in connection with dice and card games, and this origin defined the ways of thinking about uncertainty until the present: the model of well-defined single events, with only their multiplicity being uncertain, dominated classical probability and its related statistics entirely. Pascal and Leibniz should be mentioned when speaking about origins. Bayes was one of the first to draw relations between uncertainty and logical reasoning.
At the turn of the 19th and 20th centuries, statistical thermodynamics gave a new impulse to the theory with new models of the motion and energy relations of elementary particles. Though Boltzmann is recognized more for his physics, he can be considered the originator here. The story continued with quantum theory. The edifice of classical probability was nearly completed in the first part of this century by Mises, Kolmogorov, and others. Nearly, because relevant additions are still ongoing in both statistics and probability, especially in relation to the theory of information; the key name here is Shannon. The other stimulating application area was economics. In all these later applications, the behavior of the singular events of the model deviated from the game model; the events were uncertain themselves. Applications where human judgement played a relevant role, such as economics, medical diagnosis, and social choice, opened eyes to the contradictions of these judgements when contrasted with simple event-algebra models. The idea of evidence estimation also originates in the past century; the paper of Venn is considered the first deep presentation of the problem. It was followed by Black, Ramsey, de Finetti, Savage, and Wald in the 1930s and 1940s, and later by the school of Tversky in psychology. Subjective probability tried, and still tries, to integrate these facts of subjective judgement into the framework of classical probability. Others go further: Dempster and Shafer created a somewhat different model for calculating the combination and propagation of uncertain events, and the fuzzy concept of Zadeh formulates a new model of uncertain set membership and a direct transfer of verbal estimates into numerical computer representations. Several other attempts try to bridge classical concepts and all the kinds of uncertainty in reality and in our knowledge and expertise. Though many of them try to break away from frequency views, this relates only to reliance on formal statistics.
Every human judgement is based on some kind of frequency experience, partly that of others and partly one's own. For that reason, this remains a hot problem, probably never to be solved in a final procedural way. Knowledge about the uncertainty of the uncertainty methods themselves is a basic professional and ethical ingredient for everyone who designs or applies expert systems. This variety of disciplines and viewpoints also reflects the slowly emerging perception of the very different natures of uncertainty: temporary lack of knowledge, discovery not yet achieved, noncomputable complexity, subjectivity, conflicting values and interests, imprecision of definitions and measurements, etc. The best sourcebook on the history of logic up to recent developments is Kneale and Kneale (1960); the equivalent for uncertainty is Hacking (1975); some later advances are covered in Vámos (1991).