I thought it was time I tried to offend you both ;), with my usual
definition of ontology: (01)
An ontology defines the terms used to describe and represent an area of
knowledge (subject matter). An ontology is also the model (set of
concepts) for the meaning of those terms. An ontology thus defines the
vocabulary and the meaning of that vocabulary. (02)
And here is the rest of my usual blurb introducing ontologies: (03)
Ontologies are used by people, databases, and applications that need to
share domain information (Domain: a specific subject area or area of
knowledge, like medicine, tool manufacturing, real estate, automobile
repair, financial management, etc.) (04)
Ontologies include computer-usable definitions of basic concepts in the
domain and the relationships among them. They encode domain knowledge
(modular knowledge), knowledge that spans domains (composable
knowledge), and they make knowledge available (reusable knowledge). (05)
The term ontology has been used to describe models with different
degrees of structure (Ontology Spectrum):
* Less structure: Taxonomies (Convera taxonomies, Yahoo
hierarchy, biological taxonomy, UNSPSC), Database Schemas (many) and
metadata schemes (ICMWG, ebXML, WSDL)
* More Structure: Thesauri (WordNet, CALL, DTIC), Conceptual
Models (OO models, UML)
* Most Structure: Logical Theories (Ontolingua, TOVE, CYC, DOLCE,
SUMO, Semantic Web) (06)
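The "less structure" end of the spectrum can be made concrete with a toy
sketch (all names here are illustrative, not from any real ontology tool): a
bare taxonomy is just a set of subclass pairs, and the only "reasoning" it
supports is transitive closure over the is-a relation.

```python
def transitive_closure(pairs):
    """Compute the transitive closure of a set of (subclass, superclass) pairs."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                # If a is-a b and b is-a d, infer a is-a d.
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# A three-node taxonomy: dog is-a mammal, mammal is-a animal.
taxonomy = {("dog", "mammal"), ("mammal", "animal")}
print(("dog", "animal") in transitive_closure(taxonomy))  # True
```

A logical theory at the "most structure" end would add arbitrary axioms and
rules on top of (or instead of) this bare hierarchy; the point of the
spectrum is that the taxonomy above is the degenerate case.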
Ontologies are usually expressed in a logic-based language enabling
detailed, sound, meaningful distinctions to be made among the classes,
properties, and relations. One goal is to ensure expressive meaning but
maintain "computability", i.e., tractability of the reasoning using the
ontology. Using ontologies, tomorrow's applications can be more
"intelligent" by enabling increased interaction at the human conceptual
level. (07)
Ontologies are usually developed using special tools that can model
rich semantics. Ontologies are typically developed by a team with
individuals of two types: Domain Experts, who have the knowledge of a
specific domain; and Ontologists, who know how to formally model
domains, ontologies that span domains, and possibly upper ontologies. (08)
On-going research investigates the semi-automation of ontology
development, but realistically the state of the art for the foreseeable
future will remain semi-automated. Humans have rich semantic models and
understanding, whereas the models and understanding of machines have so
far been impoverished; because we want our machines to interact more
closely at the human conceptual level, we develop ontologies. This is
still a very human-intensive effort, especially if logical precision is
required. (09)
The more and richer the knowledge sources developed and used, the
easier automation gets (via bootstrapping, learning). In addition,
rigorous and principled ontology development methodologies are evolving
(e.g., Methontology/OntoClean) and being incorporated in tools that
apply formal ontology analysis techniques to assist KR-naïve domain
experts in building ontologies. (010)
_____________________________________________
Dr. Leo Obrst The MITRE Corporation, Information Semantics
lobrst@xxxxxxxxx Center for Innovative Computing & Informatics
Voice: 703-983-6770 7515 Colshire Drive, M/S H305
Fax: 703-983-1379 McLean, VA 22102-7508, USA (011)
-----Original Message-----
From: ontac-dev-bounces@xxxxxxxxxxxxxx
[mailto:ontac-dev-bounces@xxxxxxxxxxxxxx] On Behalf Of Chris Menzel
Sent: Thursday, January 19, 2006 10:24 PM
To: John F. Sowa
Cc: ONTAC Taxonomy-Ontology Development Discussion
Subject: Re: [ontac-dev] What is "An Ontology"? (012)
On Thu, Jan 19, 2006 at 07:08:20PM -0500, John Sowa wrote:
> Some points that are still debatable or clarifiable:
>
> CM>>> I prefer a nice, simple definition:
> >>>
> >>> An ontology is a set of sentences in a formal language.
>
> JS>> Yes, but. That says what it is, but it doesn't explain why
> >> anybody would want one or what they'd do with it. I would
> >> therefore add the following clause:
> >>
> >> "that is designed to characterize the entities of interest in some
> >> domain for the purpose of representing, storing, and communicating
> >> information about them and performing deductions and computations
> >> with that information."
>
> CM> I'm afraid I have to disagree if you want to include this in the
> > *definition* of an ontology, as it turns a notion that is clear and
> > precise into one that is fuzzy and indeterminate. On my proposed
> > definition, there is always a definite answer, at least in
> > principle, to the question: Is this an ontology?
>
> The answer may be definite, but it is very far from what people
> intend when they talk about ontology. Since arithmetic is formal,
> your definition would imply that "2+2=4" would be an ontology. (013)
Correct. A trivial and largely useless one. So what? Lots of useful
notions -- theory, in particular -- have tons of useless/trivial/silly
instances. (014)
> I admit that my suggested clause is too long. Therefore, I'd
> shorten it to just one line:
>
> "that is designed to characterize the entities of some domain."
>
> As soon as we mention "design", we bring in purpose. But that is
> precisely what distinguishes an ontology from an arbitrary set of
> axioms: somebody has chosen those axioms for that purpose. (015)
Well, you say tomAto, John, I say tomAHto. But you *should* say
tomAHto. ;-) I guess one reason I'm pushing the simple definition is
that there will then be no divide between the theoretical and the
practical notions. When we are working on the formal foundations of
ontology, an ontology can't be anything more than a set of sentences.
Why throw messy intentional stuff into the very definition in applied
contexts? You can still say everything you want without doing that:
ontologies are just sets of sentences, but the ontologies that are
*useful* are those that were designed with some purpose in mind. Why
take the useful but *essentially* informal notion of purpose and force
it into the nice, clean, functional notion of ontology I'm proposing? (016)
> I realize that logicians shrink with horror when somebody mentions
> the word "purpose", but for an engineer, that is the whole point: (017)
Oh, nonsense -- intensional logic (with an "s") is in large measure
exactly the logic of intentionality (with a "t"). That hardly came
about by logicians shrinking from notions of purpose and the like. The
only thing logicians shrink from is the idea of using inherently
informal intentional notions in *definitions* -- and that is no less
true for engineers. Tell me one useful engineering TOOL that involves
intentionality. (018)
> "Engineering is an application of science for the purpose
> of solving a problem within the limits of budgets, resources,
> and deadlines." (019)
And that goes for ontological engineering as much as anything. But,
once again, that has nothing to do with the presence of intentional
notions in the mathematical foundations of the discipline. (020)
> The ONTAC group does not want a treatise on ontology. They want
> an engineering product: *an* ontology they can actually use. (021)
But good engineering products rest on sound foundations. All I'm
proposing is that we develop the same sort of clean, clear formal
foundation for ontological engineering that Newtonian mechanics
provides for mechanical engineering. (022)
> CM> ... but theories are deductively closed on this approach,
> > and I don't think we should identify ontologies with theories
> > in that sense, as, for one thing, you can't distinguish between
> > equivalent ontologies that use different axioms.
>
> I agree that what people are asking for is a set of axioms, not
> the deductive closure. And I admit that proving equivalence
> of two different axiomatizations can be a nontrivial task. (023)
Indeed, undecidable. (024)
> But for most purposes, two different axiomatizations of the same
> theory should be considered *the same* ontology. (025)
I beg to differ. Your proposal is a recipe for confusion. Sometimes,
according to you, ontologies are identified by their axioms, sometimes
by their consequences. When? How often? Come on, John, this is a
no-brainer; there is a precise answer here: Ontologies are sets of
sentences. Period. Ontologies are *equivalent* if their deductive
closures are identical. Clean, clear, simple. (026)
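The identity-vs-equivalence distinction here can be sketched in a toy
propositional setting (all names and the Horn-rule encoding are hypothetical
illustrations, not a real ontology formalism): two ontologies are distinct
sets of sentences, yet equivalent because forward chaining derives the same
closure from each.

```python
def closure(axioms, facts):
    """Forward-chain Horn rules (premises, conclusion) to a deductive closure."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in axioms:
            if set(premises) <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

onto_a = [((), "p"), (("p",), "q")]   # axioms: p; p -> q
onto_b = [((), "p"), ((), "q")]       # different axioms: p; q

# Different sets of sentences (not identical), but the same closure:
print(onto_a == onto_b)                          # False
print(closure(onto_a, []) == closure(onto_b, []))  # True
```

The sketch deliberately equates "deductive closure" with the atoms derivable
by forward chaining, which is only an approximation of full logical
consequence, but it makes the clean/clear/simple criterion concrete.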
> If you translate an ontology from one CL dialect to another
> and then back to the first, you're likely to get a different,
> but logically equivalent set of statements. (027)
That absolutely should *not* happen (unless you are talking about
structurally distinct dialects), but even if it did, so what? (028)
> Furthermore, people will often transform a set of axioms in order
> to optimize them for their particular theorem prover, and they want
> to claim that they're using "the same" ontology. (029)
Sure, *now*, when there is still precious little clarity and agreement
about the foundations of ontology -- I mean, you guys can't even get
"type" and "class" figured out! :-) But when ontological engineers
start receiving training as rigorous as that of mechanical engineers,
especially in logic, the distinction between identity and equivalence
will be as natural as breathing. (030)
Re lattices: (031)
> But that is a topic for another note. (032)
Right -- I was just sort of thinking out loud. The idea of a lattice
of theories is certainly not confused or pernicious. (033)
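For what the lattice-of-theories idea amounts to, a minimal sketch (reusing
the toy Horn-rule encoding above is not assumed; this is self-contained, and
all names are hypothetical): theories are partially ordered by deductive
strength, i.e. by containment of their closures.

```python
def derivable(axioms):
    """Atoms derivable by forward chaining from Horn rules (premises, conclusion)."""
    known = set()
    changed = True
    while changed:
        changed = False
        for premises, conclusion in axioms:
            if set(premises) <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

t1 = [((), "p")]                 # a weaker theory
t2 = [((), "p"), ((), "q")]      # a strictly stronger theory

# t2 sits above t1 in the lattice: its closure contains t1's.
print(derivable(t1) <= derivable(t2))  # True
```

In the full picture, meets and joins of theories (intersections and unions
of what they entail) give the lattice structure; the ordering above is the
essential ingredient.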
-chris (034)
_________________________________________________________________
Message Archives: http://colab.cim3.net/forum/ontac-dev/
To Post: mailto:ontac-dev@xxxxxxxxxxxxxx
Subscribe/Unsubscribe/Config:
http://colab.cim3.net/mailman/listinfo/ontac-dev/
Shared Files: http://colab.cim3.net/file/work/SICoP/ontac/
Community Wiki:
http://colab.cim3.net/cgi-bin/wiki.pl?SICoP/OntologyTaxonomyCoordinatingWG (035)