
Re: [ontac-forum] Re: owl:Class and owl:Thing

To: "ONTAC-WG General Discussion" <ontac-forum@xxxxxxxxxxxxxx>
Cc: semantic-web@xxxxxx, Hans Teijgeler <hans.teijgeler@xxxxxxxxxxx>, Frank Manola <fmanola@xxxxxxx>, seanb@xxxxxxxxxxxx, "Peter F. Patel-Schneider" <pfps@xxxxxxxxxxxxxxxxxxxxxx>, "Paul Prueitt (ontologystream)" <psp@xxxxxxxxxxxxxxxxxx>
From: "Danny Ayers" <danny.ayers@xxxxxxxxx>
Date: Thu, 6 Apr 2006 23:49:41 +0200
Message-id: <1f2ed5cd0604061449i1979c118h73ad8f31e0b3a06f@xxxxxxxxxxxxxx>
Hey Azamat,    (01)

I have every intention of reading your post in its entirety, but in
lieu of that, what Hans said: you must be a hell of a typist!    (02)

There is a question I'm curious about. I believe your main point is
that without a firm foundation in reality modelling, ontological
activity will be lacking. The reality modelling you describe includes
physics and other conceptual frameworks that are, er, grounded in
reality. But what of reasoning about things which are not conveniently
addressed through models of reality?    (03)

I have a concept of a poem, and on reading a given poem I may have
(highly subjective) interpretations of what the poet intends when he
talks of a "host of golden daffodils". Me, I wonder whether he means
the big cultured things, or the little wild ones. I suspect the
latter, and as I've visited the geographic areas which inspired said
poet I assume I should remember whether there were little wild ones
there or not. But I honestly can't remember anything much but a leaky
tent. No angels.    (04)

My point is this: is knowledge representation just about things that
can be related to reality, or is it about what the humble human
considers as knowledge? (I bet there's 3000 years of literature that
could be pointed to here, but I'd rather ask here & now ;-)    (05)

In pragmatic terms it certainly does seem to be most immediately
productive to use software to deal with reality-based problems, and in
general we seem fairly well equipped to express these in a
mathematical form that fits with the machines. But isn't it implicit
in the upper ontology approach that it will exclude conceptual
structures that may not fit with any consensus view of reality?    (06)

The domain-independence of languages like RDF/OWL means they could be
applied to many realities, without any need to commit to any particular
model. Isn't that an advantage?    (07)
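
For instance, a minimal sketch with rdflib (assuming that library; every
URI and name below is made up for illustration) uses the same triple
machinery for a concrete device and for a purely subjective reading of a
poem, with no model of reality presupposed:

# Sketch only: rdflib assumed; all URIs and names are invented.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/stuff#")
g = Graph()
g.bind("ex", EX)

# A "reality-based" thing: a concrete physical device.
g.add((EX.pump42, RDF.type, EX.Pump))
g.add((EX.pump42, RDFS.label, Literal("the pump in my cellar")))

# A thoroughly subjective thing: one reader's interpretation of a poem.
g.add((EX.myReading, RDF.type, EX.Interpretation))
g.add((EX.myReading, EX.interprets, EX.Daffodils))
g.add((EX.myReading, RDFS.comment,
       Literal("Probably the little wild daffodils, not the big cultivated ones.")))

# Same vocabulary machinery, no commitment to any particular model of reality.
print(g.serialize(format="turtle"))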

Ok, capture this post as "whim" instantiation-of "fleeting thought"...    (08)

Cheers,
Danny.    (09)




On 3/31/06, Azamat <abdoul@xxxxxxxxxxxxxx> wrote:
> Hans inquired:
> ''Is it possible that owl:Individual, that once existed [1], was meant to be
> the class of REAL individuals in a REAL world?''
>  Hans decided:
> ''I have thrown out the owl:Thing. Much easier to read for humans.''
>
>
> The class/thing distinction makes all the difference here, and you will
> hardly get any explicit account of it from the OWL languages' authors. For
> it is a central issue in all current activities of building top ontologies
> (SUO, USECS, ONTAC, etc.) and SW languages (RDFS, OWL, OWL 1.1, etc.), and
> it touches the sorest spot in the whole logical enterprise of OWL ontology
> passing as an ontological undertaking 'breaking all implicit and explicit
> assumptions of computing science'.
>
> The status, validity, and expressivity of any general representational
> languages and technologies are chiefly determined by the way they treat
> the things in the world. There are usually three main choices in wide
> practice: one can define 'Thing' as an individual, a class of individuals,
> or the universal class, i.e., the class of all classes. Or, in terms of
> quantities, as a fixed value (constant), an individual variable, and a
> class variable.
> The narrow view of thing as [an entity with a specific identity] has a
> long history (as 'a primary substance', 'a bare individual', etc.) and was
> supported by such modern logicians and ontologists as Quine, for whom 'to
> be is to be the value of a bound variable'.
> In the OWL domain, the extension of the construct owl:Thing has only
> individual things, being void of other essential meaningful dimensions. In
> the biological classificatory system, this corresponds to the level of
> species whose members share a set of essential features and are bound by a
> membership relationship between an individual and its class. Note that you can
> subject a collection of individuals, say, the totality of human beings, to
> further divisions and subdivisions, such as man and woman, White or Black or
> Yellow or Red, the aged or the young, the poor or the rich, the working
> class or the professional class; underworld, lower class, middle class or
> higher class, etc. Yet they are not (genetically) essential classifications,
> and you are still in the domain of individuals, for even infinitely
> increasing the number of individuals doesn't allow you to create a new
> class or species or kind. Therefore we speak of two types of difference:
> in kind or in degree.
> But a fundamental position is to consider Thing (or Entity) as the class
> of classes (the set of subsets) at least; at best, as the class of all
> classes (the universal set of all sets), hierarchically ordered by
> inclusion (containment) relationships (or whole-part relationships). For,
> as a class variable, Thing will have as its values lower classes and
> subclasses as well; it is the type of variable whose values are themselves
> variables (like the metasyntactic variable 'foobar', where "the value of
> f(foo, bar) is the sum of foo and bar").
>
> Returning to our sheep, the OWL semantic language. To be blunt, without
> diplomatic evasion or subtlety: as a general ontological language it is
> fundamentally defective, and it would be a technological catastrophe to use
> it as the 'Ontology Infrastructure for the Semantic Web' [1], for several
> evident reasons.
>
> First, the polar terms of the OWL vocabulary are individuals, classes, and
> properties, which are, above all, mathematical and logical abstract terms
> without real content and substance, i.e., without reference to reality. To
> be an ontology, its basic construct should be the class of Thing equal to
> the class of all entity classes, of which the most fundamental are the class
> of Substance (Object), the class of State (Quantity and Quality), the class
> of Process (Change or Action) and the class of Relationship. Each one of
> these Entity classes is organized as a hierarchy of subordinate classes
> (kinds and types), where particular levels are occupied by such individual
> things (or instances, particulars, and concrete entities) as objects,
> specific states, unique events and specific connections. Crucially,
> 'definition', 'class', 'property' and 'statement' (see Topics) should be
> filled with real content and meaning. Even if you have an
> idiosyncratic set of ontological commitments as pivotal environmental and
> cognitive universals, they must still be ontological classes, rather than
> logical entities.
>
> Second, consider the construct owl:Property, with its two basic types:
> owl:ObjectProperty (mapping individuals to individuals) and
> owl:DatatypeProperty (mapping individuals to datatype values). In fact,
> there are monadic and dyadic properties; essential and accidental; atomic,
> transient, complex, or emergent; particular and general, etc. But most
> important is to tell the formal properties (attributes) apart from the
> ontological properties, which are generally classified as:
> 1. the property of being a substance (object), substantial properties;
> 2. the property of being a state (quantity or quality), quantitative and
> qualitative properties;
> 3. the property of being a process (change, action, operation), dynamic,
> functional, operational properties;
> 4. the property of being a relationship; relational properties per se.
>
> Thus, in the OWL domain, owl:Property is badly narrowed to the property of
> being a formal (functional) relationship, direct and inverse, without
> explicitly identifying the nature of the relations between the connected
> components (spatial, temporal, causal, whole/part, syntactic, semantic,
> pragmatic, etc.). Moreover, dealing with only two main types of property,
> owl:ObjectProperty and owl:DatatypeProperty, existing as disjoint
> constructions, discards any hope of commensurability between magnitudes
> (entity variables) and multitudes (numbers), and forgets measurement, the
> assignment of numbers to things.
> There are other defects and contradictions, particularly in its
> (subsumption) logic, which may take more time and patience, so I had better
> stop for now.
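
For concreteness, a minimal rdflib sketch of the two property constructs
named above (assuming rdflib; the property and class names are invented
for the example):

# Sketch only: rdflib assumed; the property and class names are invented.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS, XSD

EX = Namespace("http://example.org/plant#")
g = Graph()
g.bind("ex", EX)

# An object property: maps individuals to individuals.
g.add((EX.moves, RDF.type, OWL.ObjectProperty))
g.add((EX.moves, RDFS.domain, EX.Pump))
g.add((EX.moves, RDFS.range, EX.Fluid))

# A datatype property: maps individuals to literal values.
g.add((EX.ratedFlowRate, RDF.type, OWL.DatatypeProperty))
g.add((EX.ratedFlowRate, RDFS.domain, EX.Pump))
g.add((EX.ratedFlowRate, RDFS.range, XSD.double))

# Nothing above records whether ex:moves is causal, spatial, whole/part, etc.;
# that classification has to live in comments or in further modelling.
print(g.serialize(format="turtle"))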
>
> Moral. Contrary to the OWL people's feelings and hopes, it is not an
> ontology but a sort of formal language involving a functional, formal
> logic, and it should properly be renamed FoLWL or LWL, Logical Web
> Language. Accordingly, the semantic web becomes the formal semantic web,
> which is a poor abstraction of the real (semiotic) Web [as it has recently
> turned out], one that asks for a firm conceptual foundation: an
> n-relational ontology of things and its complement, ontological semiotics.
> Or, put away for a long time your lofty hopes about real-life knowledge
> applications and web-based intelligent systems capable of representing and
> reasoning about the world, and have instead a 'wonderweb' that has blown
> billions and billions in public funds. It seems something must be done to
> stop this fast-moving and widely spreading pandemic of nescience.
>
> Hans, about your specific problem, you are on the right track. On the
> ontological abstract level, a pump is a specific class (species) of Thing [>
> substance > physical substance > artefact > device > mechanism > mechanical
> device] marked by a specific [functional property] of moving fluid and gas
> [substance] by suction or pressure [process]. This is all about its
> intensional meaning, its primary definition, while its extension is made up
> of all types of pumps, differentiated by the type of working substance
> used, way of operation, construction, etc.: gas pump, oil pump, water pump,
> lift pump, hydraulic pump, hand pump, foot pump; you may continue such a
> division ad infinitum. In the actual world of particular things, a pump is
> an individual existing as a concrete physical object, a unique instance of a
> class of physical devices.
> All the confusion comes from the replacement of the fundamental
> ontological category of Thing or Entity with an empty logical category,
> owl:Class. And please don't throw the 'things' out like the baby with the
> bathwater; rather discard the empty 'classes', the bathwater itself.
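
As a rough sketch, the pump chain above can be written down with rdflib
(assuming that library; every URI, class name, and instance name below is
made up to follow the text, not drawn from any existing ontology):

# Sketch only: rdflib assumed; names invented to follow the text above.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/upper#")
g = Graph()
g.bind("ex", EX)

# The chain Thing > Substance > PhysicalSubstance > Artefact > Device
# > Mechanism > MechanicalDevice > Pump, as rdfs:subClassOf links.
chain = ["Thing", "Substance", "PhysicalSubstance", "Artefact",
         "Device", "Mechanism", "MechanicalDevice", "Pump"]
for upper, lower in zip(chain, chain[1:]):
    g.add((EX[lower], RDFS.subClassOf, EX[upper]))

# Intension: the defining functional property, recorded here only as a comment.
g.add((EX.Pump, RDFS.comment,
       Literal("Moves fluid or gas by suction or pressure.")))

# Extension: kinds of pump, and one concrete individual pump.
for kind in ["GasPump", "OilPump", "WaterPump", "LiftPump", "HandPump"]:
    g.add((EX[kind], RDFS.subClassOf, EX.Pump))
g.add((EX.pump_P101, RDF.type, EX.WaterPump))

print(g.serialize(format="turtle"))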
>
> with all respects,
>
> Azamat Abdoullaev
>
> ----- Original Message -----
> From: "Hans Teijgeler" <hans.teijgeler@xxxxxxxxxxx>
> To: "'Dave Reynolds'" <der@xxxxxxxxxxxxxxx>
> Cc: "'SW-forum'" <semantic-web@xxxxxx>
> Sent: Friday, March 31, 2006 10:27 AM
> Subject: RE: owl:Class and owl:Thing
>
>
> > The class Pump is such a case where it is both an owl:Class and an
> > individual, as a member of the class ClassOfInanimatePhysicalObject. Yet
> > it has not been declared as owl:Thing. I understand from you that that is
> > OK.
> >
> > Is it possible that owl:Individual, that once existed [1], was meant to be
> > the class of REAL individuals in a REAL world?
> >
> > Regards,
> > Hans
> >
> > [1] http://wonderweb.semanticweb.org/deliverables/documents/D1.pdf
> >
> > =========================================================================
> >
> > -----Original Message-----
> > From: Dave Reynolds [mailto:der@xxxxxxxxxxxxxxx]
> > Sent: Thursday, March 30, 2006 23:58
> > To: Hans Teijgeler
> > Cc: SW-forum
> > Subject: Re: owl:Class and owl:Thing
> >
> > Hans Teijgeler wrote:
> >
> >> In OWL-Full it is possible to have a class that also is an individual
> >> in the context of a class-of-class. We have that a lot. Now my
> >> question is whether or not I shall call the same object an owl:Class
> >> when it is in the role of class, and call it an owl:Thing when it is
> >> in the role of individual. If not, what shall prevail? Or must I declare
> >> it twice?
> >
> > You don't *need* to declare it at all in OWL/full.
> >
> > If you use a resource in the role of a class then it can be inferred to
> > be a class. For example, if you use it as the object of an rdf:type
> > statement or in an rdfs:subClassOf statement then it can be inferred to
> > be an rdfs:Class. In OWL/full rdfs:Class and owl:Class have the same
> > extension.
> >
> > Similarly it can be inferred to be an owl:Thing (for trivial reasons in
> > OWL/full) and probably some subclass of owl:Thing based on the
> > domain/range of whatever properties you apply to it.
> >
> > However, it may be useful for human readers of your ontology if you
> > document its dual nature by declaring both its types explicitly along
> > with appropriate rdfs:comments.
> >
> > Dave
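
Following Dave's suggestion, a small rdflib sketch of the explicit dual
declaration for Hans's Pump case (assuming rdflib; the namespace URI is
invented, while Pump and ClassOfInanimatePhysicalObject are the names
used in the thread):

# Sketch only: rdflib assumed; the namespace URI is invented for the example.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/pumps#")
g = Graph()
g.bind("ex", EX)

# Pump in its role as a class ...
g.add((EX.Pump, RDF.type, OWL.Class))
# ... and in its role as an individual, a member of a class-of-classes.
g.add((EX.ClassOfInanimatePhysicalObject, RDF.type, OWL.Class))
g.add((EX.Pump, RDF.type, EX.ClassOfInanimatePhysicalObject))

# Documenting the dual nature for human readers, as Dave recommends.
g.add((EX.Pump, RDFS.comment, Literal(
    "Both an owl:Class (of individual pumps) and an individual member of "
    "ClassOfInanimatePhysicalObject; legal in OWL Full.")))

print(g.serialize(format="turtle"))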
> >


--    (011)

http://dannyayers.com    (012)

_________________________________________________________________
Message Archives: http://colab.cim3.net/forum/ontac-forum/
To Post: mailto:ontac-forum@xxxxxxxxxxxxxx
Subscribe/Unsubscribe/Config: 
http://colab.cim3.net/mailman/listinfo/ontac-forum/
Shared Files: http://colab.cim3.net/file/work/SICoP/ontac/
Community Wiki: 
http://colab.cim3.net/cgi-bin/wiki.pl?SICoP/OntologyTaxonomyCoordinatingWG    (013)