
Re: [ontac-forum] Follow up question on Ontology, knowledge, language confusion

To: ONTAC-WG General Discussion <ontac-forum@xxxxxxxxxxxxxx>, "Peter P. Yim" <peter.yim@xxxxxxxx>
Cc:
From: Nicolas F Rouquette <nicolas.rouquette@xxxxxxxxxxxx>
Date: Fri, 07 Oct 2005 08:31:07 -0700
Message-id: <434694BB.1000908@xxxxxxxxxxxx>
Pat, Peter:

I apologize for my no-show yesterday. The fiscal year transition turned 
out to be far more difficult than in years past due to budget cuts that 
have hit us pretty hard, and it is going to take me some time to recover 
from these wounds.
I wholeheartedly agree with Adam's and Mike G.'s statements about adding 
axioms to an ontology; in fact, I regard that as the biggest practical 
challenge, not because it is something that amateurs can or should do, 
but because it is something that conscientious engineers ought to do 
anyway to avoid being trumped or overwhelmed by amateurish zeal and 
agility at producing semantic nonsense.
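To make the point about axioms concrete: each axiom added to an ontology rules out unintended interpretations. A brute-force sketch (entirely illustrative; the three-element domain, the part-of-like relation, and the axioms are invented for the example, not taken from anyone's ontology) counts how many binary relations on the domain survive as axioms are added one at a time:

```python
from itertools import product

DOMAIN = [0, 1, 2]
PAIRS = list(product(DOMAIN, repeat=2))

def all_relations():
    """Every binary relation on the domain is a candidate model of a
    single relation symbol, say partOf: 2**9 = 512 candidates."""
    for bits in product([False, True], repeat=len(PAIRS)):
        yield {p for p, b in zip(PAIRS, bits) if b}

def irreflexive(r):
    return all((x, x) not in r for x in DOMAIN)

def antisymmetric(r):
    return all(not ((x, y) in r and (y, x) in r)
               for x in DOMAIN for y in DOMAIN if x != y)

def transitive(r):
    return all((x, z) in r
               for (x, y) in r for (y2, z) in r if y == y2)

# Add one axiom at a time and count the surviving models.
axioms = []
for name, ax in [("irreflexive", irreflexive),
                 ("antisymmetric", antisymmetric),
                 ("transitive", transitive)]:
    axioms.append(ax)
    count = sum(1 for r in all_relations() if all(a(r) for a in axioms))
    print(name, count)
# prints:
# irreflexive 64
# antisymmetric 27
# transitive 19
```

Each axiom strictly shrinks the space of admissible models, which is exactly the sense in which axiomatization reduces ambiguity: the 19 survivors are precisely the strict partial orders on a three-element set.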

-- Nicolas.

Cassidy, Patrick J. wrote:

> Gary,
>    Concerning your question:
> > The thrust of my issue is that while a completed ontology might 
> > avoid the issues of language and knowledge, the process of developing 
> > an ontology runs into both of these.  Our expertise has knowledge 
> > problems, and our discussion of the different concepts to merge uses 
> > language to communicate our ideas, so the resulting ontology product 
> > may reflect these problems.  Of course we may have methods to resolve 
> > these, but we should not think of that as an automatic process.  
> > Ontologies aren't built by tools, but by us using tools.
>  
> Yes, in developing ontologies, whether alone, or collaboratively, we 
> do rely on our language to describe the meanings that we want to 
> associate with concepts.  But when we are trying to create unambiguous 
> definitions, we tend to use only that portion of language that is 
> itself fairly unambiguous, and in which the words are understood as 
> labels for the corresponding concepts that are widely shared and 
> agreed on -- i.e. we use a fairly precise and common defining 
> vocabulary, when we want to be precise.  The example I provided was 
> the Longman's Dictionary of Contemporary English, in which the 
> lexicographers consciously restricted themselves to a fairly small set 
> of about 2000 basic words (pointers to concepts) to define all of the 
> 56,000 "words and phrases" in that dictionary.  These basic concepts 
> would have to substitute for the knowledge people must acquire over 
> years of living in the physical world, bumping into things, getting 
> hungry and eating, interacting with other people, and learning how to 
> create abstractions and to use the structures of common physical 
> things and events ("head", "drop") to label more abstract 
> non-observable derivative concepts . . .  they would have to 
> substitute, *if* they were not also defined and constrained by axioms 
> that restrict the ways they can relate to each other.  As Adam pointed 
> out, when we add axioms to the concepts in an ontology, the ambiguity 
> is reduced, and the concepts are no longer merely arbitrary symbols, 
> since they are constrained to interact with the other symbols in 
> well-defined ways.  And as Michael Gruninger pointed out, the ideal 
> toward which we should try to develop our ontologies is the case where 
> the axiomatization is sufficiently rich that the set of models that 
> the definitions are consistent with is precisely the set of models 
> that we intend to describe.  In this way those seemingly abstract
> mathematical structures in the computer will have a structure and 
> behavior closely matching the structure and behavior of those 
> real-world objects that they are intended to represent, and the 
> computer should be able to do reliable inferencing about the real 
> world by manipulating those symbols. We do start with language, but by 
> using the most fundamental and precise and broadly agreed-on defining 
> words, we can create a comparably precise conceptual defining 
> vocabulary for the computer, and build up more complex concepts in the 
> computer that do not have the ambiguity of natural-language words 
> which may be used in different contexts to label multiple complex 
> aggregates of concepts. 
>  
> I would in fact recommend for the ONTACWG that we adopt an 
> English-language "defining vocabulary" similar to that used in LDOCE, 
> but for which each word labels one concept in the existing ontology -- 
> and use only that defining vocabulary to create the English-language 
> definitions of the concepts we include in the ontology.  Then, when we 
> find that the existing English-language defining vocabulary does not 
> have the words we need to describe some new concept that we want to 
> add, we will have a hint that there is some fundamental concept 
> missing which should be added to the conceptual defining vocabulary.
>  
> There is another related issue that arises in considering the 
> epistemology of computer knowledge: people wonder how a computer can 
> get a firm "grounding" in reality.  In the simplest case, a computer 
> may be restricted to doing in-memory processing of data structures, 
> and the meanings of the data structures would rely totally on what the 
> programmer intends them to mean; the computer would have no 
> independent way to check.  But the computer is not totally without
> connections to the real world.  It has disks, keyboards, and other 
> input/output devices, with which it could "experiment" and get 
> feedback to verify that there really is some "real world" out there.  
> And when we reach the stage where the knowledge representation is 
> sufficiently reliable for basic conceptual computational issues, we 
> could fit the computer with more elaborate interactive devices to get 
> a more "human" feeling for the nature of physical reality.  To some 
> extent, robotic systems have to do that right now.  But the issues we 
> are dealing with in this working group don't require that level of 
> "direct physical knowledge" in the computer.  Doing the research to 
> create more elaborate representations will be, I expect, a lot more 
> efficient after some Common Semantic Model ("COSMO") has been widely 
> adopted, and multiple groups can efficiently share and reuse the 
> results of their research because it references a common paradigm for 
> representing knowledge.  At that point the efficiency of research may 
> reach the point where the epistemological issues can be investigated 
> in a meaningful way.  I think the COSMO has to come first.
>  
> Pat
>  
>  
>
> Patrick Cassidy
> MITRE Corporation
> 260 Industrial Way
> Eatontown, NJ 07724
> Mail Stop: MNJE
> Phone: 732-578-6340
> Cell: 908-565-4053
> Fax: 732-578-6012
> Email: pcassidy@xxxxxxxxx
>
>  
>
>     ------------------------------------------------------------------------
>     *From:* ontac-forum-bounces@xxxxxxxxxxxxxx
>     [mailto:ontac-forum-bounces@xxxxxxxxxxxxxx] *On Behalf Of *Gary
>     Berg-Cross
>     *Sent:* Thursday, October 06, 2005 11:38 AM
>     *To:* ontac-forum@xxxxxxxxxxxxxx
>     *Subject:* [ontac-forum] Follow up question on Ontology,
>     knowledge, language confusion
>
>
>      
>     Adam Pease pointed out that we often confuse the ambiguity in
>     language and the locality of knowledge with ontology.  I wanted
>     to ask about this distinction, but there wasn't time for it; it
>     might be worth discussing here so our concepts and methods are
>     clear.
>      
>     The thrust of my issue is that while a completed ontology might
>     avoid the issues of language and knowledge, the process of
>     developing an ontology runs into both of these.  Our expertise has
>     knowledge problems, and our discussion of the different concepts
>     to merge uses language to communicate our ideas, so the resulting
>     ontology product may reflect these problems.  Of course we may
>     have methods to resolve these, but we should not think of that as
>     an automatic process.  Ontologies aren't built by tools, but by us
>     using tools.
>      
>     Thoughts?
>      
>     Gary Berg-Cross
>     EM&I
>     Potomac, MD
>
>------------------------------------------------------------------------
>
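Pat's LDOCE-style proposal above is mechanically checkable: restrict every English definition to a fixed defining vocabulary, and flag any word that falls outside it as a hint of a missing fundamental concept. A minimal sketch (the vocabulary and example definitions below are invented for illustration, not ONTACWG content):

```python
import re

# Hypothetical defining vocabulary: in practice this would be the
# ~2000-word LDOCE-style list, each word tied to one ontology concept.
DEFINING_VOCABULARY = {
    "a", "an", "the", "of", "that", "is", "part", "thing",
    "person", "place", "move", "to", "from", "one",
}

def check_definition(term, definition):
    """Return the words in `definition` that fall outside the defining
    vocabulary -- each one is a hint that a fundamental concept may be
    missing from the ontology."""
    words = re.findall(r"[a-z]+", definition.lower())
    return sorted(set(w for w in words if w not in DEFINING_VOCABULARY))

print(check_definition("journey", "to move from one place to a place"))
# prints [] -- the definition stays inside the defining vocabulary
print(check_definition("vehicle", "a thing that transports a person"))
# prints ['transports'] -- a hint of a missing fundamental concept
```

In Pat's terms, each flagged word is either rewritten using the existing vocabulary or promoted into the conceptual defining vocabulary itself.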


_________________________________________________________________
Message Archives: http://colab.cim3.net/forum/ontac-forum/
To Post: mailto:ontac-forum@xxxxxxxxxxxxxx
Subscribe/Unsubscribe/Config: 
http://colab.cim3.net/mailman/listinfo/ontac-forum/
Shared Files: http://colab.cim3.net/file/work/SICoP/ontac/
Community Wiki: 
http://colab.cim3.net/cgi-bin/wiki.pl?SICoP/OntologyTaxonomyCoordinatingWG