
RE: [ontac-forum] Follow up question on Ontology, knowledge, language confusion

To: ONTAC-WG General Discussion <ontac-forum@xxxxxxxxxxxxxx>
Cc: "ONTAC-WG General Discussion" <ontac-forum@xxxxxxxxxxxxxx>, ontac-forum-bounces@xxxxxxxxxxxxxx
From: dbedford@xxxxxxxxxxxxx
Date: Fri, 7 Oct 2005 09:23:11 -0400
Message-id: <OF7210B052.26AF946F-ON85257093.0047E7CE-85257093.004988BF@xxxxxxxxxxxxx>
All,    (01)

I was not able to participate in your session on Wednesday because I was giving
an all-day workshop on search systems.  I must say, though, that I'm a bit
concerned that the group seems to be starting the discussion at what I would
call level 0, rather than sharing practical experience from organizations which
might be at level 2 or 3.  I would like to suggest that this group:    (02)

(1) challenge/test the basic assumption that the current ontology model is
sufficiently well developed and tested to use as a baseline for any real
practical development -- I think there are several flaws in the model;
(2) test that ontology model for scalability and extensibility (I don't think it
will pass this test);
(3) clearly state what the end game of an ontology is (there are of course
multiple end games, but we need to begin to at least define what some of them
are - otherwise they are motherhood, apple pie and everything in between);
(4) distinguish between tools and techniques that can be used to build an
ontology, and begin to identify where these tools and techniques are best used
in the development of a robust ontology model;
and (5) review what work has already been done that is not labelled 'ontology'
per se but actually does move towards the end game.    (03)

For several years now we have been talking at the word level -- too much
progress has been made beyond this level for us still to be "struggling" with
these issues.  We cannot afford to keep talking at this level.  And, frankly,
the structures for dealing with these issues are already available to us, and
in some cases are already in place.    (04)

Please forgive my frankness but there's a lot of practical experience around in
solving these kinds of problems -- we need to leverage that experience, not try
to reinvent it.    (05)

Best regards,
Denise    (06)




From: "Cassidy, Patrick J." <pcassidy@mitre.org>
Sent by: ontac-forum-bounces@xxxxxxxxxx.net
To: "ONTAC-WG General Discussion" <ontac-forum@xxxxxxxxxxxxxx>
Date: 10/07/2005 08:15 AM
Subject: RE: [ontac-forum] Follow up question on Ontology, knowledge, language confusion    (07)

Please respond to: "ONTAC-WG General Discussion" <ontac-forum@colab.cim3.net>    (08)






Gary,
   Concerning your question:
>>>  The thrust of my issue is that while a completed ontology might avoid the
issues of language and knowledge, the process of developing an ontology runs
into both of these.  Our expertise has knowledge problems and our discussion of
different concepts to merge uses language to communicate our ideas on this and
so the resulting ontology product may reflect these problems.... Of course we may
have methods to resolve these, but we shouldn't think that is an automatic process.
Ontologies aren't built by tools, but by us using tools.    (09)

Yes, in developing ontologies, whether alone, or collaboratively, we do rely on
our language to describe the meanings that we want to associate with concepts.
But when we are trying to create unambiguous definitions, we tend to use only
that portion of language that is itself fairly unambiguous, and in which the
words are understood as labels for the corresponding concepts that are widely
shared and agreed on -- i.e. we use a fairly precise and common defining
vocabulary, when we want to be precise.  The example I provided was the
Longman's Dictionary of Contemporary English, in which the lexicographers
consciously restricted themselves to a fairly small set of about 2000 basic
words (pointers to concepts) to define all of the 56,000 "words and phrases" in
that dictionary.  These basic concepts would have to substitute for the
knowledge people must acquire over years of living in the physical world,
bumping into things, getting hungry and eating, interacting with other people,
and learning how to create abstractions and to use the structures of common
physical things and events ("head", "drop") to label more abstract
non-observable derivative concepts . . .  they would have to substitute, *if*
they were not also defined and constrained by axioms that restrict the ways they
can relate to each other.  As Adam pointed out, when we add axioms to the
concepts in an ontology, the ambiguity is reduced, and the concepts are no
longer merely arbitrary symbols, since they are constrained to interact with the
other symbols in well-defined ways.  And as Michael Gruninger pointed out, the
ideal toward which we should try to develop our ontologies is the case where the
axiomatization is sufficiently rich that the number of models that the
definitions are consistent with is only and precisely those models that we
intend to describe.  In this way those seemingly abstract mathematical
structures in the computer will have a structure and behavior closely matching
the structure and behavior of those real-world objects that they are intended to
represent, and the computer should be able to do reliable inferencing about the
real world by manipulating those symbols. We do start with language, but by
using the most fundamental and precise and broadly agreed-on defining words, we
can create a comparably precise conceptual defining vocabulary for the computer,
and build up more complex concepts in the computer that do not have the
ambiguity of natural-language words which may be used in different contexts to
label multiple complex aggregates of concepts.    (010)
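The model-theoretic point above can be made concrete with a small sketch (my own illustration, not drawn from the thread or any actual ONTAC ontology): enumerate every possible interpretation of a single binary relation over three individuals, then apply axioms one at a time and watch the space of satisfying models shrink toward the intended structure.

```python
# Illustrative sketch: axioms prune unintended models.  The relation
# ("partOf"-style partial order) and the three individuals are
# hypothetical examples for demonstration only.
from itertools import combinations, product

individuals = ["a", "b", "c"]
pairs = list(product(individuals, repeat=2))  # 9 possible tuples

def all_models():
    """Every possible extension of one binary relation: 2**9 = 512 models."""
    for r in range(len(pairs) + 1):
        for subset in combinations(pairs, r):
            yield frozenset(subset)

# Three axioms, each stated as a check against a candidate model.
def reflexive(m):
    return all((x, x) in m for x in individuals)

def antisymmetric(m):
    return all(not ((x, y) in m and (y, x) in m)
               for x, y in pairs if x != y)

def transitive(m):
    return all((x, z) in m
               for (x, y1) in m for (y2, z) in m if y1 == y2)

models = list(all_models())                       # 512 arbitrary interpretations
models = [m for m in models if reflexive(m)]      # 64 remain
models = [m for m in models if antisymmetric(m)]  # 27 remain
models = [m for m in models if transitive(m)]     # 19 remain: the partial orders
print(len(models))
```

Each added axiom excludes more of the unintended interpretations (512 to 64 to 27 to 19 here); a sufficiently rich axiomatization would continue narrowing the set until only the intended models remain, which is exactly the ideal described above.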

I would in fact recommend for the ONTACWG that we adopt an English-language
"defining vocabulary" similar to that used in LDOCE, but for which each word
labels one concept in the existing ontology -- and use only that defining
vocabulary to create the English-language definitions of the concepts we include
in the ontology.  Then, when we find that the existing English-language defining
vocabulary does not have the words we need to describe some new concept that we
want to add, we will have a hint that there is some fundamental concept missing
which should be added to the conceptual defining vocabulary.    (011)
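As a sketch of how the proposed check might be mechanized (the vocabulary and function below are hypothetical illustrations, not an actual LDOCE or ONTACWG artifact), a candidate definition could be scanned for words outside the agreed defining vocabulary; any hits would be the "hint" mentioned above that a fundamental concept may be missing:

```python
# Hypothetical sketch of a defining-vocabulary check; the word list is a
# tiny stand-in for a real ~2000-word defining vocabulary like LDOCE's.
import re

DEFINING_VOCABULARY = {
    "a", "an", "the", "of", "that", "is", "not", "yet",
    "young", "adult", "human", "animal", "person",
}

def out_of_vocabulary(definition):
    """Return the words in a definition not covered by the defining vocabulary."""
    words = re.findall(r"[a-z]+", definition.lower())
    return {w for w in words if w not in DEFINING_VOCABULARY}

# Definable with the existing vocabulary -> prints set()
print(out_of_vocabulary("a young human that is not yet an adult"))
# Words like "unit" and "compound" are flagged -> hint that fundamental
# concepts are missing from the defining vocabulary
print(out_of_vocabulary("the smallest unit of a chemical compound"))
```

An empty result means the definition is fully expressible in the agreed vocabulary; a non-empty result identifies exactly which candidate concepts should be considered for addition to the conceptual defining vocabulary.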

There is another related issue that arises in considering the epistemology of
computer knowledge; people wonder how a computer can get a firm "grounding" in
reality.  In the simplest case, a computer may be restricted to doing in-memory
processing of data structures, and the meanings of the data structures would
rely totally on what the programmer intends them to mean; the computer would
have no independent way to check.  But the computer is not totally without
connections to the real world.  It has disks, keyboards, and other input/output
devices, with which it could "experiment" and get feedback to verify that there
really is some "real world" out there.  And when we reach the stage where the
knowledge representation is sufficiently reliable for basic conceptual
computational issues, we could fit the computer with more elaborate interactive
devices to get a more "human" feeling for the nature of physical reality.  To
some extent, robotic systems have to do that right now.  But the issues we are
dealing with in this working group don't require that level of "direct physical
knowledge" in the computer.  Doing the research to create more elaborate
representations will be, I expect, a lot more efficient after some Common
Semantic Model ("COSMO") has been widely adopted, and multiple groups can
efficiently share and reuse the results of their research because it references
a common paradigm for representing knowledge.  At that point the efficiency of
research may reach the point where the epistemological issues can be
investigated in a meaningful way.  I think the COSMO has to come first.    (012)

Pat    (013)




Patrick Cassidy
MITRE Corporation
260 Industrial Way
Eatontown, NJ 07724
Mail Stop: MNJE
Phone: 732-578-6340
Cell: 908-565-4053
Fax: 732-578-6012
Email: pcassidy@xxxxxxxxx    (014)





      From: ontac-forum-bounces@xxxxxxxxxxxxxx
      [mailto:ontac-forum-bounces@xxxxxxxxxxxxxx] On Behalf Of Gary Berg-Cross
      Sent: Thursday, October 06, 2005 11:38 AM
      To: ontac-forum@xxxxxxxxxxxxxx
      Subject: [ontac-forum] Follow up question on Ontology, knowledge, language
      confusion    (015)



      Adam Pease pointed out that we often confuse the ambiguity in language and
      the locality of knowledge with ontology.  I wanted to ask about this
      distinction but there wasn't time for it; it might be worth
      discussing here so our concepts and methods are clear.    (016)

      The thrust of my issue is that while a completed ontology might avoid the
      issues of language and knowledge, the process of developing an ontology
      runs into both of these.  Our expertise has knowledge problems and our
      discussion of different concepts to merge uses language to communicate our
      ideas on this and so the resulting ontology product may reflect these
      problems.... Of course we may have methods to resolve these, but we
      shouldn't think that is an automatic process.  Ontologies aren't built by
      tools, but by
      us using tools.    (017)

      Thoughts?    (018)

      Gary Berg-Cross
      EM&I
      Potomac, MD
      _________________________________________________________________
      Message Archives: http://colab.cim3.net/forum/ontac-forum/
      To Post: mailto:ontac-forum@xxxxxxxxxxxxxx
      Subscribe/Unsubscribe/Config:
      http://colab.cim3.net/mailman/listinfo/ontac-forum/
      Shared Files: http://colab.cim3.net/file/work/SICoP/ontac/
      Community Wiki:
      http://colab.cim3.net/cgi-bin/wiki.pl?SICoP/OntologyTaxonomyCoordinatingWG    (019)



