
[soa-forum] Computing and Social Architecture

To: Service-Oriented Architecture CoP <soa-forum@xxxxxxxxxxxxxx>
From: Paul Prueitt <psp@xxxxxxxxxxxxxxxxxx>
Date: Tue, 22 Apr 2008 13:19:40 -0500
Message-id: <0259D41F-E9A1-4003-A465-8310B3005382@xxxxxxxxxxxxxxxxxx>

Computing and Social Architecture

For the Knowledge Age:

A white paper on new computing architecture

Paul Prueitt

 

The linear processing model is reaching a limit imposed by physics.  This physical limitation is most acutely felt in processor design.  Our work provides, for the first time, an architecture that allows computing to jump clear of the multi-core and parallel-processing versus single-processor issue.  As the paradigm is adopted, it will be of no great concern where processing occurs, how many processors are involved, or what the underlying architecture is.  The reason lies in the use of regularity at an atomic level, a regularity that is aggregated into molecular patterns and then expressed as process behavior.
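
As an illustration of this atomic regularity, consider the following minimal sketch in Python.  Cubicon's actual atom set and composition rules are not published, so the atom names, the molecule encoding, and the behavior table below are all hypothetical, chosen only to show how a closed, finite set of atoms can be aggregated into molecular patterns that are then expressed as process behavior.

    # Hypothetical sketch: atoms -> molecules -> process behavior.
    from enum import Enum
    from typing import Callable, Dict, List, Tuple

    class Atom(Enum):
        # A closed, finite set of "data footprints" (names are assumptions).
        SOURCE = "source"
        FILTER = "filter"
        EMIT = "emit"
        STORE = "store"

    # A molecule is an ordered aggregation of atoms.
    Molecule = Tuple[Atom, ...]

    # Behaviors are expressed from molecular patterns, so the same molecule
    # yields the same behavior wherever, and on whatever processor, it runs.
    BEHAVIORS: Dict[Molecule, Callable[[List[int]], List[int]]] = {
        (Atom.SOURCE, Atom.FILTER, Atom.EMIT): lambda xs: [x for x in xs if x > 0],
        (Atom.SOURCE, Atom.STORE): lambda xs: list(xs),
    }

    def express(molecule: Molecule, data: List[int]) -> List[int]:
        # Express a molecular pattern as a process behavior on some data.
        return BEHAVIORS[molecule](data)

    print(express((Atom.SOURCE, Atom.FILTER, Atom.EMIT), [-2, 5, 0, 7]))  # [5, 7]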

 

Physics and chemistry have this exact relationship, a relationship that in computing theory simplifies processing, reduces the actual loads on processes, and opens the door to new computing and communication paradigms.  Physical chemistry is expressed as an aggregation of a small set of atomic types.  Biology, in turn, developed because of predictability over a class of chemical reactions, as complex functions became part of the mechanics of life.

 

We see in this new computing architecture the nature of all those physical processes supporting living systems.  Specifically, the sciences of physics, genetics and cognitive neuroscience reveal how the human mind works.  Our view is that the human mind does not work like a von Neumann computer, and yet the current school of information science is based on the von Neumann theory of computing.  A new school of information science is needed.  New work in social science, economics and political science reframes why society needs computing and in what ways computing and distributed communications might aid in increasing overall quality of life.  Surprisingly, the new school of thought is simpler than the current school of thought.  A corresponding simplification of computing architecture is re-expressed in a moderately complicated infrastructure, called Cubicon, having the layered composition of process behaviors from molecular structures, which are in turn compositions from a closed, finite set of data footprints.

 

In the new school of information science, called the “second school”, design specification uses the combinatorial span of molecular units.  The parallel to physical chemistry is clear.  Specification must also conform to standards selected by community-controlled genealogy.  Again, the parallel to biochemistry and genetics is clear: the living system controls the evolutionary drift of the system.  This computing genealogy is complex in the way that biological genealogy is complex.  The actual structure of the computing genealogy itself has the form of a taxonomy developed through peer-reviewed study of best practices in software design.
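
The paper gives no concrete data model for this community-controlled genealogy, so the following sketch is entirely an assumption; it only illustrates the idea that each design specification carries its lineage and enters the taxonomy through community review.

    # Hypothetical genealogy record; field names are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SpecNode:
        name: str
        version: str
        parents: List["SpecNode"] = field(default_factory=list)
        community_approved: bool = False  # set only by a peer-review process

        def lineage(self) -> List[str]:
            # Walk the genealogy back to its roots, depth-first.
            names = [f"{self.name}-{self.version}"]
            for parent in self.parents:
                names.extend(parent.lineage())
            return names

    root = SpecNode("message-molecule", "1.0", community_approved=True)
    child = SpecNode("message-molecule", "1.1", parents=[root],
                     community_approved=True)
    print(child.lineage())  # ['message-molecule-1.1', 'message-molecule-1.0']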

 

Peer review can also be found in the natural sciences.  A parallel can be made directly to the science of genetics, and the recent history in which genetic taxonomy has been refined into the new mathematics of biology: ontological models.  Ontological models extend Hilbert mathematics to address certain non-deterministic aspects of living systems.  Many examples exist, but perhaps the best is the work on gene and cell signal pathway ontology by the BioPAX organization. [1]  A parallel can also be found in a comparison of the economic theories of John Adams and John Nash.

 

Distributed self-optimized computing, by name alone, points to solutions that bypass the entrenched problems that together make up the complexity wall facing the information technology industry.  Many autonomous, heterogeneous Turing machines may be connected through substructural channels and may speak the common language and protocol of molecular patterns expressed from a framework, that framework specifying a defined set of atoms.  When this is done in a transparent and dependable way, evolutionary processes can be governed by community involvement.  The outcome is a computing infrastructure capable of providing real-time cognitive aids for interacting communities.
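
No wire encoding is specified in the paper; the sketch below assumes JSON and invented field names purely to illustrate heterogeneous machines agreeing on a shared, finite atom set rather than on one another's internals.

    # Hypothetical common protocol: molecules are exchanged as atom names.
    import json

    ATOM_SET = {"SOURCE", "FILTER", "STORE", "EMIT"}  # the framework's atoms

    def encode_molecule(atoms):
        # Serialize a molecular pattern for transmission between machines.
        unknown = set(atoms) - ATOM_SET
        if unknown:
            raise ValueError("atoms outside the agreed framework: %s" % unknown)
        return json.dumps({"molecule": list(atoms)})

    def decode_molecule(message):
        # Any conforming receiver reconstructs the pattern identically.
        atoms = json.loads(message)["molecule"]
        if not set(atoms) <= ATOM_SET:
            raise ValueError("message uses atoms outside the framework")
        return atoms

    wire = encode_molecule(["SOURCE", "FILTER", "EMIT"])
    print(decode_molecule(wire))  # ['SOURCE', 'FILTER', 'EMIT']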

 

The specification of substructural channels, and the use of generative mechanisms, allow complete transparency within spheres of processing.  Many positive benefits accrue, including intellectual property management and managed compensation for design work.  The generative mechanism, called a “back-plate”, is similar both to computational fractal compression and to gene or cell signal pathway expression.  The network becomes the virtual computer, independent of the number of cores concentrated on a particular piece of silicon.  How things are transmitted is altered so that only “generative seeds” are sent, not the complete digital object.  Transmission is simplified and router processing loads are reduced radically.  The combined computer and human system supports collective communication, and thus social intelligence.  Everything about information technology changes.
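
The back-plate mechanism itself is not published, so the following is only a minimal sketch of the bandwidth idea, assuming both ends share a deterministic generator; a seeded pseudo-random generator stands in here for the real generative mechanism.

    # Seed-only transmission: ship the generative seed, not the object.
    import random

    def generate(seed, size):
        # Both sender and receiver hold this generator (the shared back-plate).
        rng = random.Random(seed)
        return [rng.randrange(256) for _ in range(size)]

    # Sender: transmit only the generative seed, a few bytes on the wire.
    seed, size = 42, 1_000_000
    message = (seed, size)

    # Receiver: regenerate the identical digital object locally.
    reconstructed = generate(*message)
    assert reconstructed == generate(seed, size)
    print("sent 2 integers, regenerated %d values" % len(reconstructed))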

 

The back-plate generative mechanism is consistent with how natural language is produced by the kinematics of the mouth and neural structures [2] and with how cognitive context is supported by phase coherence [3] arising from interactions of neural ensembles. [4]  The mechanism exchanges non-localized information in a fashion similar to mechanisms observed in quantum theory. [5]

 

Cubicon is the first known manifestation of these design principles.  As in individual neural processing, Cubicon-supported computing structures come and go autonomously at rates and scales dictated by the real-time demands of the problems being solved.  The network itself is ethereal and ever-changing.  It evolves independently of the data-footprint-to-molecular-pattern association selected by community genealogy agreements.  The individual evolution of a non-localized processing system occurs in the finite-state-machine span of the atomic layer, in a way similar to how theorems are explored in the span of a small set of axioms.  This non-localization is a key part of a quantum-neurodynamical justification of the claims about potential productivity gains.

 

The computing architecture is robust enough to bring small-pipe distributed computing into existence and elegant enough to ensure constant optimization of efficiency.  Ad hoc computing structures come into and go out of existence as need alone dictates.  Small-pipe requirements for video and knowledge-representation exchanges will shift market demands and enable a new market sector based on direct human-to-human knowledge sharing.  This possibility arises from the nature of physics in the same way that alternating-current power arises from an understanding of electricity.

 

Emergent processing structures are in fact similar to the cognitive content experienced by a single human.  There are non-local and local aspects of the processing span.  As such, the collective computing, with humans in the loop, can be provided a memory and a set of anticipatory mechanisms using system architecture lifted directly from cognitive neuroscience.  As in neural structures, memory mechanisms are not fully localized and rely on computations to jointly identify points of complexity seen as system degeneracy. [6]  The degeneracy requires non-deterministic influence to make a decision.  This influence is necessary to collapse the non-locality in precisely the fashion predicted by field theory, thermodynamics and quantum field theories. [7]  Von Neumann architecture sees these points of system degeneracy as “halting conditions”.

 

The Cubicon architecture facilitates distributed processing.  The architecture does not care where one core ends and another begins.  The cores can be on the same chip, on separate chips on a board, on separate boards across a bus in a server, or on separate computers across any grid or network, communicating over any network protocol.  The “system” is a virtual phenomenon that arises from the commonalities that provide data interoperability.  The “system” is driven by the mass intention of users or, when isolated, by the intention of a single user.
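
A minimal sketch of that location transparency, under stated assumptions: Python's standard executors stand in for cores on one chip, on separate chips, or across a network; the point is only that the calling code is identical in every case.

    # Location-transparent dispatch: the caller never asks where work runs.
    from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

    def behavior(x):
        return x * x  # any molecularly expressed process behavior

    def run_anywhere(executor, data):
        # Identical call whether the cores are local threads, local
        # processes, or (with a suitable executor) remote machines.
        return list(executor.map(behavior, data))

    if __name__ == "__main__":
        for pool in (ThreadPoolExecutor(4), ProcessPoolExecutor(2)):
            with pool:
                print(run_anywhere(pool, [1, 2, 3, 4]))  # [1, 4, 9, 16]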

 

Knowledge operating systems arise that are individually unique and dependent only on the layered structures where consistency and completeness are enforced using principles from physics and cognitive neuroscience.  Shifts in viewpoint are mediated using control theory and applied semiotics, defined using concepts from quantum cognitive neuroscience.  These control structures act on topic-map representations of human knowledge.  The system evolves to meet the demands user communities have set and the restrictions or opportunities dictated by community resources.  If your cell phone needs to be a supercomputer for a few seconds, so be it: the architecturally specified computing process goes out and negotiates the computational resources it needs to make it so.
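
The negotiation step is likewise unspecified; this sketch assumes a simple capacity-leasing model, with all names hypothetical, to show a small device briefly assembling supercomputer-scale resources from community peers.

    # Hypothetical resource negotiation among community peers.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class Peer:
        name: str
        spare_cores: int

    def negotiate(peers: List[Peer], cores_needed: int) -> Dict[str, int]:
        # Greedily lease spare cores from peers until demand is met.
        lease: Dict[str, int] = {}
        for peer in sorted(peers, key=lambda p: -p.spare_cores):
            if cores_needed <= 0:
                break
            grant = min(peer.spare_cores, cores_needed)
            lease[peer.name] = grant
            cores_needed -= grant
        if cores_needed > 0:
            raise RuntimeError("community resources cannot meet the demand")
        return lease

    peers = [Peer("laptop", 4), Peer("server-rack", 512), Peer("desktop", 8)]
    print(negotiate(peers, 500))  # {'server-rack': 500}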

 

This offloading of complexity, this architecture, embodies a new paradigm of complexity handling and will herald a new era of human productivity.  Because society is pushing an evolution that is only partially expressed, complexity-handling capacity will increase in steps.  Some of these steps are social and some technical.  The new architecture allows for the automation of levels of complexity handling only hinted at by the previous step.  These steps will be made as social reality shifts to use the new architecture.


This work is by Paul Prueitt.



Other works by Paul Prueitt:


1. Eisenfeld, J. & Prueitt, P.S. (1988). Systemic Approach to Modeling Immune Response. Proc. Santa Fe Institute on Theoretical Immunology (A. Perelson, ed.), Addison-Wesley, Reading, Massachusetts.

 

2. Kowalski, J.; Ansari, A.; Prueitt, P.; Dawes, R. & Gross, G. (1988). On Synchronization and Phase Locking in Strongly Coupled Systems of Planar Rotators. Complex Systems 2, 441-462.

 

3. Levine, D. & Prueitt, P.S. (1989). Modeling Some Effects of Frontal Lobe Damage: Novelty and Perseveration. Neural Networks, 2, 103-116.

 

4. Prueitt, Paul S. & Craig, Robert M. (1991). The Object Oriented Paradigm and Neurocomputing. Analysis of Neural Net Applications Conference, ACM Proceedings, IEEE Press.

 

5. Prueitt, Paul S. (1993). Network Models in Behavioral and Computational Neuroscience. Invited chapter in Non-animal Models in Biomedical & Psychological Research, Testing and Education, New Gloucester: PsyETA.

 

6. Prueitt, P.S. & Erwin, H. (1993). 1st Appalachian Conference on Behavioral Neurodynamics: Processing in Biological Neural Networks (Conference Report). Neurocomputing (Elsevier) 5, 1-7.

 

7. Levine, D.; Parks, R. & Prueitt, P.S. (1993). Methodological and Theoretical Issues in Neural Network Models of Frontal Cognitive Functions. International Journal of Neuroscience 72, 209-233.

 

8. Prueitt, Paul S. (1994). System Needs, Chaos and Choice in Machine Intelligence. In Chaos Theory in Psychology (A. Gilgen and F. Abrams, eds.), Contributions in Psychology Series, Westport, Conn.

 

9. Prueitt, Paul S. (1995a) A Theory of Process Compartments in Biological and Ecological Systems. In the Proceedings of IEEE Workshop on Architectures for Semiotic Modeling and Situation Analysis in Large Complex Systems; August 27-29, Monterey, Ca, USA; Organizers: J. Albus, A. Meystel, D. Pospelov, T. Reader

 

10. Prueitt, Paul S. (1995b) An Implementing Methodology for Computational Intelligence. In the Proceedings of First International Conference on Computational Intelligence and Neuroscience. IEEE

 

11. Prueitt, P.S.; Erwin, H. & MacLennan, B. (1995c). 3rd Appalachian Conference on Behavioral Neurodynamics (Conference Report). Neurocomputing (Elsevier) 11, 323-328.

 

12. Prueitt, Paul S. (1996a) Optimality and Options in the Context of Behavioral Choice, in D. S. Levine & W. R. Elsberry, Eds. Optimality in Biological and Artificial Networks?, Erlbaum, 1996.

 

13. Prueitt, Paul S. (1996b). Is Computation Something New?, published in the Proceedings of NIST Conference on Intelligent Systems: A Semiotic Perspective. Session: Memory, Complexity and Control in Biological and Artificial Systems. IEEE October 20-23.

 

14. Prueitt, Paul S. (1996c). Semiotic Design for Document Understanding, in the proceedings of the Workshop on Control Mechanisms for Complex Systems: Issues of Measurement and Semiotic Analysis: 8-12 Dec. 1996.

 

15. Prueitt, Paul S. (1996d). Structural Activity Relationship analysis with application to Artificial Life Systems, presented at the QAT Teleconference, New Mexico State University and the Army Research Office, December 13, 1996.

 

16. Prueitt, P. (1997). Grounding Applied Semiotics in Neuropsychology and Open Logic, in IEEE Systems Man and Cybernetics Oct. 1997.

 

17. Prueitt, P. (1997). Quasi Axiomatic Theory, represented in the simplest form as a Voting Procedure. VINITI, All-Russian Workshop in Applied Semiotics, Moscow, Russia. (Translated into Russian and published in VINITI Conference Proceedings.)

 

18. Prueitt, P. (1998). An Interpretation of the Logic of J. S. Mill, in IEEE Joint Conference on the Science and Technology of Intelligent Systems, Sept. 1998, NIST.

 

19. Prueitt, P. (1999a). The 4 by 4 Duplicate Document Detection (D3) formalism.  In the proceedings of the Symposium on Document Image Understanding Technology, University of Maryland Press.

 

20. Prueitt, P. (1999b). Similarity Analysis and the Mosaic Effect.  In the proceedings of the Symposium on Document Image Understanding Technology, University of Maryland Press.

 

21. Prueitt, P. (2000).  SenseMaking and Knowledge Management.  E-Gov 2000, Washington DC

 

22. Prueitt, P. (2001). New approaches to knowledge representation. Knowledge Technologies 2001 Conference, Austin, Texas.

 

23. Prueitt, P. (2001). Use of In-Memory Referential Information Base (I-RIB) for Data Mining. Presentation at the First Conference of the U.S. Einstein Institute, University of Connecticut, June 23, 2001.

 

24. Prueitt, P. (2001). Foundational Paper on the Transformation of Knowledge Ecology to a Knowledge Economy. Knowledge Management Consortium Institute Journal, Vol. 1, Issue 2.

 

25. Prueitt, P. (2001). Shallow Link analysis, Iterated scatter-gather and Parcelation (SLIP) and data visualization. Army Research Office Invitational Workshop on Information Assurance, George Mason University, October 2001.

 

Publications in 2002 are under non-disclosure agreements.

 

26. Prueitt, P. (2003). Knowledge Technologies and the Asymmetric Threat. KMPro Society Journal.

 

Publications in 2004 are under non-disclosure agreements.

 

27. Prueitt, P. (2005). Global Information Framework and Knowledge Management, Part 1. Published November 8, 2005 by Datawarehouse.com.

 

28. Prueitt, Paul and Peter Stephenson. "Towards a Theory of Cyber Attack Mechanics." First IFIP 11.9 Digital Forensics Conference, Orlando, FL, 2005.



[1] The BioPAX web site is www.biopax.org
[2] See specifically work in linguistics on double articulation.
[3] See specifically the work by Karl Pribram on phase coherence arising in brain systems.
[4] See specifically the work by Gerald Edelman on neural Darwinism and selectionist theories.
[5] See specifically the work on Bell’s inequalities.
[6] See the work of Tulving and Schacter, Memory Systems 1994.
[7] See the work by Hameroff and Penrose on orchestrated objective reduction (self-collapse) and neural processing.
 _________________________________________________________________
Subscribe/Unsubscribe/Config: http://colab.cim3.net/mailman/listinfo/soa-forum/
Shared Files: http://colab.cim3.net/file/work/soa/
Community Portal: http://colab.cim3.net/
Community Wiki: http://colab.cim3.net/cgi-bin/wiki.pl?AnnouncementofSOACoP