Dear John, (01)
I believe you should be shown all due esteem for your energy, devotion, and
tolerance of the many ideas and views pressed upon you. (02)
Certainly, it would be a pity if your priceless effort were to pass away
without a summary of conclusions on the core problems of ontology. Perhaps
you could sort all the judgments by attestation criteria: precise (certain,
definite, explicit, and determinate); vague (implicit, equivocal,
intentionally ambiguous, open to several interpretations); and confusing
(unreasonable, perplexing, or puzzling), as in the following sample. (03)
EXPLICIT STATEMENTS (04)
JS > The categories of an ontology are not words.
JS > Alignment of ontologies is not alignment of their terminologies.
PC> If any two ontology-driven systems want to work together, that upper
ontology... has to be the same. If a system decides to change its upper
ontology on the fly, it should be prepared for serious chaos. (05)
MW > Upper ontologies are more general, and so will have more general
axioms. Lower-level ontologies are more specific, and so will have more
specific and detailed axioms. (06)
EP > A good gloss entails the necessary and sufficient conditions of the
concept that it describes. (07)
EP > We can certainly map general things to other general things with
precision. (08)
KE > ...to *distinguish* significance (or meaning) from a glut of vagueness
and mostly useless noise was the real goal or objective of an ontology. (09)
VAGUE STATEMENTS (010)
JS> Words must be related to ontologies, but that mapping is a complex
many-to-many [or one?] relationship between the words of any natural
language and the categories of an ontology. (011)
JS > ...any upper level should be as *neutral* as possible. The upper
levels should have very few axioms. (012)
JS > A truly neutral upper level should avoid any commitment to what is
considered essential vs. what is considered accidental. (013)
CONFUSING STATEMENTS (014)
JS > the upper level is much less important than the mid and lower levels.
Don't waste more time and money on things that don't matter. (015)
LO > We explicitly choose to represent ontologies because we want a formal
representation, a logical theory, about the things in the world (or even
possible or fictional or impossible things). Why? Because logic is our best
tool for such. (016)
LO > Ontology engineering does not focus on terminology, nor on the formal
semantics of natural language, though of course those assist in the effort.
It does not focus on epistemology, though that too is important. It focuses
on ontology. (017)
LO > ...a series of increasingly expressive semantic models ranging from
flat terminologies to taxonomies to thesauri to conceptual models to
logical theories. (018)
LO > If you need precise semantics for your applications or services, then
you need logical theories, i.e., high-end or strong ontologies. (019)
With all due respect,
Azamat (020)
----- Original Message -----
From: "John F. Sowa" <sowa@xxxxxxxxxxx>
To: "ONTAC-WG General Discussion" <ontac-forum@xxxxxxxxxxxxxx>
Cc: "Lenat, Doug" <doug@xxxxxxx>
Sent: Sunday, May 21, 2006 8:55 PM
Subject: [ontac-forum] What should be in an upper-level ontology (021)
> Folks,
>
> I have been making some strong criticisms about upper-level
> ontologies, even though I have been working on and writing
> about such things for many years.
>
> I haven't completely given up on the idea that an upper level
> is useful, but over the years, I have come to the conclusion
> that any upper level should be as *neutral* as possible --
> i.e., it should not be biased for or against anybody's pet
> theories, and it should not contain any axioms that contradict
> any of the specialized axioms that anybody might need in any
> application at any lower level.
>
> These criteria imply that the upper levels should have very
> few axioms. Following are some kinds of axioms that should
> *not* be in the upper levels:
>
> 1. Any axioms that make empirical claims that might be
> falsified by future experiments or any claims that are
> known to be false in detail, but which may be useful
> approximations for many purposes. For example, the upper
> levels should be neutral with respect to a Newtonian view
> vs. any more modern theory of physics because for many
> practical purposes a Newtonian description is accurate
> within the granularity of the usual measuring instruments.
>
> 2. Any axioms that require, prefer, or rule out one kind of
> representation over another, such as a four dimensional
> vs. a (3+1) dimensional description of space and time.
>
> 3. Any axioms that rule out exceptional cases that may be rare,
> but possible. For example, it should not say that a tiger
> has four legs, because some tigers might be born with more
> than four and some might lose a leg. In fact, there might
> be quadriplegic tigers that get around in some prosthetic
> device.
>
> 4. Any axioms that imply a vase and the clay it consists of
> are or are not identical, because many respectable theories
> make different claims in that regard. They should also
> avoid all claims about whether a child is identical or not
> identical to the adult at some later stage of life -- because
> some theories say yes, others say no, and others treat the
> question as context-dependent (i.e., identical for inheritance
> issues, but not identical for employment). In fact, the
> entire issue of identity claims is so full of conflicting
> philosophical positions that the upper levels should *not*
> make any identity claims of any kind.
>
> 5. Any axioms that imply physical objects and processes are
> disjoint. Some theories say they must be disjoint, others
> say they may overlap, and others say that object and process
> descriptions are complementary ways of describing the same
> phenomena.
>
> 6. Any axioms about artifacts that may be falsified by developments
> in technology. For example, the attached phone.gif example is
> taken from a dictionary published in 1969, but very few of the
> features depicted are common in the telephones manufactured
> today. However, the definition in that dictionary is still
> true: "an instrument for reproducing sounds at a distance."
>
> 7. Any axioms that distinguish essential properties from accidental
> properties. This issue has been debated since the time of Plato
> and Aristotle. The traditional definition of Human is Rational
> Animal, and the ability to laugh was considered an accidental
> property. However, many philosophers have claimed that the
> ability to laugh is just as characteristic of humans and more
> easily defined than the ability to reason. Today, genes are
> considered more fundamental to what is "essential", but that
> makes it harder to distinguish humans from chimps and bonobos.
> A truly neutral upper level should avoid any commitment to what
> is considered essential vs. what is considered accidental.
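
To make the list above concrete, here is a minimal sketch in first-order
notation, with purely hypothetical predicate names: point 3 warns against
the brittle universal claim on the first line, the weaker commitment on the
second line is the sort an upper level could more safely keep (leg counts
being left to lower levels), and point 5 warns against the disjointness
axiom on the last line.

    % Brittle universal claim of the kind point 3 says to leave out:
    \forall x\, (\mathit{Tiger}(x) \rightarrow \mathit{legCount}(x) = 4)
    % A weaker, more neutral commitment that tolerates exceptional cases:
    \forall x\, (\mathit{Tiger}(x) \rightarrow \mathit{Animal}(x))
    % The object/process disjointness axiom that point 5 says to avoid:
    \forall x\, \neg(\mathit{PhysicalObject}(x) \wedge \mathit{Process}(x))
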
>
> When you start to analyze the issue, the number of possible conflicts
> becomes so large that the safest position with regard to any axiom
> in the upper levels is very short: When in doubt, leave it out.
>
> That definition of telephone from 1969, which is still true today,
> suggests the kind of information that might be included in an upper
> level: a definition or axiom that is true because of the intended
> function of an artifact. Unfortunately, it is very difficult to
> state such definitions in full generality and even more difficult
> to find formally defined relations that can be used to state them.
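
One hedged way to sketch such a function-based definition in first-order
notation might be the following, where Instrument, hasFunction, and
reproduceSoundAtDistance are hypothetical placeholders; as the paragraph
above notes, defining those relations precisely and generally is exactly
the hard part.

    % A hypothetical function-based definition of "telephone":
    \forall x\, (\mathit{Telephone}(x) \leftrightarrow
        \mathit{Instrument}(x) \wedge
        \mathit{hasFunction}(x, \mathit{reproduceSoundAtDistance}))
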
>
> For example, how can the words "instrument", "reproduce", "sound",
> and "distance" be defined in a precise, but general way? And how
> could they be defined in a way that would distinguish a telephone
> from a radio? And would you want to distinguish telephones from
> radios? If you did, that would rule out cell phones, which are
> in fact radios and can be used to reproduce radio programs and
> even television programs. In fact, cell phones are now being used
> as cameras and even TV cameras that broadcast live events over the
> Internet.
>
> If you want the upper levels to be sufficiently general, you can't
> put many axioms into them. They become, in fact, just what I said
> in my previous notes: a cleaned-up version of WordNet. It's not
> an accident that WordNet is so widely used, because it performs a
> needed function. Anything that is more detailed than WordNet would
> be too constrained and too inflexible to be useful for relating
> one general-purpose ontology to another.
>
> For deduction, however, inflexibility and constraints are necessary
> to support detailed proofs. But those constraints will inevitably
> create contradictions with very important developments in technology.
> If you define the word "telephone" in such a way that makes it disjoint
> with radios, TVs, and cameras, then you rule out cell phones -- or
> perhaps you may permit very simple cell phones, but you rule out any
> kind of new technology without making a major category shift.
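
Spelled out in description-logic notation with hypothetical class names,
the contradiction is a one-step consequence: if Telephone and Radio are
declared disjoint, anything classified under both, such as a cell phone,
is forced into the empty class.

    % Assumed (hypothetical) axioms:
    \mathit{Telephone} \sqcap \mathit{Radio} \sqsubseteq \bot
    \mathit{CellPhone} \sqsubseteq \mathit{Telephone}
    \mathit{CellPhone} \sqsubseteq \mathit{Radio}
    % Consequence: \mathit{CellPhone} \sqsubseteq \bot,
    % i.e., no cell phone can exist under these axioms.
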
>
> Summary: When you examine all the conditions and applications that
> an upper-level ontology must serve, you discover that it is very hard
> to distinguish it from a cleaned-up terminology. It is clear that we
> need detailed axioms in the microtheories, but it is not at all clear
> whether we need an upper level that is distinct from a terminology.
>
> John Sowa
>
> (022)
_________________________________________________________________
Message Archives: http://colab.cim3.net/forum/ontac-forum/
To Post: mailto:ontac-forum@xxxxxxxxxxxxxx
Subscribe/Unsubscribe/Config:
http://colab.cim3.net/mailman/listinfo/ontac-forum/
Shared Files: http://colab.cim3.net/file/work/SICoP/ontac/
Community Wiki:
http://colab.cim3.net/cgi-bin/wiki.pl?SICoP/OntologyTaxonomyCoordinatingWG (025)