
[ontac-forum] Knowledge Sharing Foundation suggestion

To: "ONTAC-WG General Discussion" <ontac-forum@xxxxxxxxxxxxxx>
Cc: Robert O'Harrow <oharrowr@xxxxxxxxxxxx>
From: "psp" <psp@xxxxxxxxxxxxxxxxxx>
Date: Sat, 19 Nov 2005 10:01:03 -0700
Message-id: <CBEELNOPAHIKDGBGICBGEEPNGNAA.psp@xxxxxxxxxxxxxxxxxx>




Looking at the diagram "Progression of Written Information" attached to your last communication, I was impressed.  The progression of elements in your diagram suggests the development of an ontology hub.  The progression starts with the development of human language, so ontological modeling can be seen as a natural extension of the phenomenon of human language.  The computer, computation, and the Internet are where this extension is occurring. 


I have formed the opinion that Tim Berners-Lee's notion of the Semantic Web is a polemic that hides from us the possibility of a naturally occurring "ontology hub".  As some evidence for this, I found that www.ontologyhub.com and www.ontologyhub.net were both unregistered - and I have just registered them for use later on.  www.ontologyhub.org is registered but goes nowhere. 


As in other polemics, the products of the Semantic Web standards processes produce memetic mechanisms that divert attention toward something that has hidden consequences.  The concept of "memetics" (as developed by Blackmore and by Dawkins in "The Selfish Gene" and other works) is itself a polemic, or a polemic complex.  The essential insight, that genes are not the only replicator mechanism, is lost because polemics are created to avoid this insight.  These creations need not be "conscious", so I am not talking about individual intent. 


The recent 302-to-3 vote by the House of Representatives to oppose the "immediate withdrawal of US troops from Iraq" is a classic example of an intentional polemic.  Those who voted on this resolution offered complex arguments as to why they felt insulted by the requirement to go along with it (a polemic). 


My sense is that the last hub-like element captures the essence of what will eventually be used with an ontology core.   Such a core would likely be developed by one smart individual, not a W3C standards committee - someone who has internalized many other attempts at ontology development and "accidentally" creates a synthesis so useful that the core ontology is adopted by everyone.  Not as something forced by an ontology Tsar, but simply as if it were new natural language.


As Matthew said to John (in this e-forum),


MW: I quite agree, but the death of legacy systems comes when the cumulative addition of functionality makes the system unstable, and a rewrite is easier than continuing maintenance.


John (Sowa) has communicated to me, and to others, pessimism about an impossibility imposed by the existing legacy.  There are two parts to John's observation, which I feel he and we should take apart and deal with separately. 


The first part is factual: this software legacy exists.  The actual functionality was designed to reflect IT interests more than social interests.  (One has to pause and think about the representation I have made here.  I am making this merely as an observation, with no criticism attached.)  Starting with the operating system that has dominated the market, we have a computer science that is properly constructed IF one wishes to endow IT service providers with increasing control over wealth. 


The second part is judgmental.  Has the legacy infrastructure provided society (not merely the business part of social activity, but society in the broadest sense) with optimal value?  By optimal value, I mean a value to society that is now only a distant potential that MIGHT be developed from the reality of computer-based functions.  Sandy Klausner's work, for example, would be able to deliver maximum value to society IF it were deployed as federally funded infrastructure, like the highway system. 


Imagine if individuals or corporations owned all paved roads.  Imagine that the owners charged a fee and imposed a social agreement on anyone wanting to travel on a particular road.  Suppose that the social agreement was to not use roads that were free to use.  Some free roads would exist, but the kind of open use of transportation that we have now would not. 


This situation is what we have with corporate ownership and individual IT-professional control over the development of code ... and especially over the development of early ontology specifications.  The problem is larger than IT professionalism, of course. 


The BCNGroup's call for a White House-led national project is based on the realization that IT and business interests have gained absolute control over something that could be much more valuable to society if standards were generated outside this control.  Here I am talking specifically about the shallowness of the W3C standards processes.


OASIS does have deeper voices.  But the concept of a standards committee has fundamental flaws, starting with the empowerment of a few individuals to create and enforce polemics.  A White House-led national project could deploy something like CoreSystem, or something that certain individuals in this e-forum could produce if allowed to, and if supported with reasonable salaries.  The concept of a committee would shift to the concept of observing natural communication between real individuals and entities.  (Thus the importance of semantic extraction as a measurement of ontological reality.) 


My notes on a National Project are at




The specific plan to create a new infrastructure is at:



Message Archives: http://colab.cim3.net/forum/ontac-forum/
To Post: mailto:ontac-forum@xxxxxxxxxxxxxx
Shared Files: http://colab.cim3.net/file/work/SICoP/ontac/
Community Wiki: 
http://colab.cim3.net/cgi-bin/wiki.pl?SICoP/OntologyTaxonomyCoordinatingWG