The private letter regarding the Cubicon architecture makes reference to transactional memory: http://en.wikipedia.org/wiki/Transactional_memory The wiki article develops the issues well. I will not edit that page, because the second-school position would not be appreciated there for very long and the work would be taken off.

The second-school position is not merely about the inappropriateness of the core assertions made by most schools of artificial intelligence. There is another issue that is deeper, and it goes to the difference between a particular instance of "something" and sets of universals. The particular is some type of localized coherence, where the concept of coherence is not so easy to nail down. I look to the work of Maturana (social biology) to help, and also to the work of Pribram (cognitive neuroscience).

The second school's position is that there is no single set of universals that can express any specific particular completely. There is perhaps a process that creates sets of universals able to describe functional specifications, which can then be used to support complicated computational transactions such as those that occur in a service-oriented computing environment. Klausner envisions this process in his work (http://www.coretalk.net/). Additional logical constraints can be imposed within the CoreTalk "back plate" using a derivative of the Soviet-era applied semiotic theory called Quasi Axiomatic Theory.

The core of the effort in SOA specifications has focused on this task, but without fully understanding the relationship that must exist between real-world phenomena and the well-specified functional relationships created in SOA standards activities. The use of web ontologies and data specification models is part of the large-scale activity related to SOA transaction specification. The induction of structure occurs outside the specifics of particular events, and works in the cases where it works.
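To make the SOA side of this concrete, a well-specified functional relationship can be sketched as a transaction schema that a service enforces: particular events either fit the schema's universals or they do not. This is only an illustrative sketch in Python; the schema fields and the validate function are hypothetical, not part of CoreTalk or any SOA standard.

```python
# Hypothetical sketch: a "set of universals" expressed as a transaction schema,
# and particular events checked against it. Induction of structure works only
# "in the cases where it works": events that fit the schema validate, and a
# novel particular that falls outside the universals is simply rejected.

TRANSFER_SCHEMA = {            # the well-specified functional relationship
    "account_from": str,
    "account_to": str,
    "amount": float,
}

def validate(event, schema):
    """Return True if the particular event fits the schema's universals."""
    return (set(event) == set(schema)
            and all(isinstance(event[k], t) for k, t in schema.items()))

well_formed = {"account_from": "A-1", "account_to": "B-2", "amount": 10.0}
novel_event = {"account_from": "A-1", "barter_goods": "3 sheep"}  # novelty

print(validate(well_formed, TRANSFER_SCHEMA))  # True
print(validate(novel_event, TRANSFER_SCHEMA))  # False: rejected, with no account of why
```

Note that the rejection carries no information about the mismatch itself; the model has no vocabulary for the particular that falls outside it.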
But there is a risk, and an inevitability: an unanticipated failure of transactions will occur when there is a misalignment between the particular event structure and the well-specified SOA transaction model. When such a model is used in event management, there is the risk that the model will diverge from the intentions of its users without any notice that there is a problem. This risk is higher when there is significant novelty or when there is a crisis.

The induction of universals from particulars is hypothesized in various advanced traditions of thought, such as one sees in Tibetan Buddhism. Various Western traditions also hypothesize about induction and abduction, but there is deep confusion created by the philosophical position that induction and abduction are the same as what a computing device does when it takes a deductive step in logic. This position is widely held, even by individuals who do not know the classical meanings of terms like "induction" and "deduction" and have not thought through the underlying issues. This is due to very strong social viewpoints, which we call the "first school".

The work of Robert Rosen goes directly to this issue, in the most advanced and most correct fashion. The term "Rosen complexity" is used to differentiate any natural system from any conceptual system. The computing paradigms are all (so far) based on the finite state machine concept. The finite state machine is specifically a kind of inducted abstraction, and thus not capable of being Rosen-complex. According to the second-school viewpoint, no computing system is Rosen-complex. In theory, Rosen complexity can occur in hybrid computing/natural systems if the architecture asks for human choices at specific times. The CoreTalk architecture does this through a sophisticated interaction within delineated communities, using agreements and contracts.
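A finite state machine, as a fixed inducted abstraction, can be sketched in a few lines, and the hybrid move described above amounts to routing any input the abstraction cannot classify to a human choice. The states, events, and ask_human hook below are hypothetical illustrations, not a description of the CoreTalk architecture.

```python
# Minimal sketch of a finite state machine as an inducted abstraction:
# the entire "conceptual system" is this fixed transition table. Anything
# not anticipated by the table lies outside the abstraction. The hybrid
# computing/natural-system move is the fallback: ask a human at that point.

TRANSITIONS = {                          # (state, event) -> next state
    ("idle", "offer"):         "negotiating",
    ("negotiating", "agree"):  "contracted",
    ("negotiating", "reject"): "idle",
}

def step(state, event, ask_human):
    """Advance the machine; defer to a human choice on unanticipated input."""
    if (state, event) in TRANSITIONS:
        return TRANSITIONS[(state, event)]
    return ask_human(state, event)       # the natural system supplies the choice

# Usage sketch: a human policy that sends anything novel back to "idle".
state = step("idle", "offer", ask_human=lambda s, e: "idle")
state = step(state, "agree", ask_human=lambda s, e: "idle")
print(state)  # "contracted"
state = step(state, "audit", ask_human=lambda s, e: "idle")  # novel event
print(state)  # "idle" -- chosen by the human policy, not by the table
```

The table itself never grows during operation; whatever openness the hybrid system has comes entirely from the human side of the fallback.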
The induction/abduction of a computing architecture, like those discussed at the wiki page given above, may produce various kinds of finite state machines. The specific problem that transactional memory specification addresses is the problem of concurrency. The second school regards this concurrency problem as being handled by computing devices in a way that is unnatural when compared with natural processes involving particulars. One core reason is that the natural process that produces the particular instance of "something" involves both locality and non-locality.

The back-plate concept addresses this issue at several levels. First, the concept of generative seeds that are morphed, sent, and grown by a substructural framework is revealed as a compression and encryption paradigm. Here the "particulars" are video, audio, and text files that disappear into the back plate and reappear somewhere else, with a direct analog to Bell's inequality in quantum mechanics. No one is trying to be fancy here; we are just pointing out that non-locality is handled in the real physical world in a way that would eliminate the need to transmit the data now being transmitted in the communication grids (all forms of digital data transfer).

An additional economic reality is also revealed. The instrumentation of the seeds (called generative encapsulated digital objects, or gEDOs), consistent with Brad Cox's notion of superdistribution, creates 100% secure protection for intellectual property released as gEDOs. Many more features not expected by the markets are possible with a system like CoreTalk. I am happy to discuss this with anyone. psp@xxxxxxxxxxxxxxxxxx

Prueitt, P. (1997b). Grounding Applied Semiotics in Neuropsychology and Open Logic. IEEE Systems, Man and Cybernetics, Oct. 1997.
Prueitt, P. (1998). An Interpretation of the Logic of J. S. Mill. IEEE Joint Conference on the Science and Technology of Intelligent Systems, Sept. 1998.