
[soa-forum] on coresystem, event driven architecture and cyber security

To: Sam Chance <sgchance@xxxxxxxxx>, terl@xxxxxxxxxxxxxx, Service-Oriented Architecture CoP <soa-forum@xxxxxxxxxxxxxx>
From: Paul Prueitt <psp@xxxxxxxxxxxxxxxxxx>
Date: Sat, 6 Oct 2007 11:15:55 -0500
Message-id: <E3CD5A65-5021-478C-882B-72CC37D1B9E6@xxxxxxxxxxxxxxxxxx>
Sam, and others in the SOA CoP (a primary e-forum for e-gov activities),




at the blog..



Tim Bass talks about the role of security concerns.. 

From his discussion it is clear that a full conceptual foundation for SOA, in a system of many Virtual Private Networks, requires deep packet inspection, and thus that the architecture have an implicit compression or encryption layer in which the engines providing service discovery and orchestration know the compression / encryption dictionaries, so that privacy is preserved.   One way to do this is to provide generative mechanisms, as suggested in:



The (necessary?) role of generative technology in dual provision of transparency and informational security

A service architecture having generative mechanisms, internal instrumentation (for 100% measurement of the use of service objects), and deep packet inspection (as discussed by Klausner, Cox and others) provides measurement at two levels: system-wide, enabling a free market; and encapsulated point to point, providing a type of internal security of information context.  The content privacy acquired is specifically of the type one sees in the NSA surveillance of messages worldwide using link analysis.  One knows that link analysis occurs on all electronic transactions worldwide.  One also hopes that court action is required to "inspect" the contents of any particular message.  
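
To make the dictionary idea concrete, here is a rough sketch in Python (the names are purely illustrative, the shared "dictionary" is modeled as a symmetric key from the third-party cryptography package, and none of this comes from Tim's discussion): the discovery / orchestration engine is the only party besides the endpoints that can open a packet, so point-to-point content stays encapsulated while the engine can still meter every use of a service object.

# Hypothetical sketch (illustrative names, not from the post): an orchestration
# engine that holds the per-network compression / encryption "dictionary"
# (modeled here as a symmetric Fernet key), so it alone can inspect and meter
# payloads that remain opaque to every other party on the transport.
from cryptography.fernet import Fernet
import zlib

class OrchestrationEngine:
    def __init__(self):
        # one shared dictionary (here, a key) per Virtual Private Network
        self.network_keys = {}

    def register_network(self, network_id: str) -> bytes:
        key = Fernet.generate_key()
        self.network_keys[network_id] = key
        return key  # distributed out-of-band to the members of that VPN

    def inspect_and_meter(self, network_id: str, packet: bytes) -> dict:
        # "deep packet inspection" is possible only for the engine that knows
        # the dictionary; the system-wide record keeps sizes, not content
        key = self.network_keys[network_id]
        plaintext = zlib.decompress(Fernet(key).decrypt(packet))
        return {"network": network_id, "bytes": len(plaintext)}

def publish(network_key: bytes, payload: bytes) -> bytes:
    # a service endpoint compresses, then encrypts, before putting the
    # message on the shared transport
    return Fernet(network_key).encrypt(zlib.compress(payload))

The design point is simply that the engine's knowledge of the dictionary is what makes both levels of measurement possible at once.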

To protect the rights of American citizens, the contents of transactions are unknown (due to the encryption / compression), but there is a footprint over all transactions.   The footprint is then examined for structural indications of threat.  There should be no mystery as to how the surveillance systems work.  
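
A purely illustrative sketch of that footprint examination (the graph, the threshold and the names are my own, not a description of any actual surveillance system): only the link structure of the metadata is analyzed, never the encrypted contents.

# Illustrative sketch only: link analysis over transaction metadata.  Message
# contents stay encrypted; only the "footprint" (who exchanged with whom) is
# examined for structural indications that might warrant court-authorized review.
from collections import defaultdict

def build_link_graph(transactions):
    """transactions: iterable of (sender, receiver) pairs, metadata only."""
    graph = defaultdict(set)
    for sender, receiver in transactions:
        graph[sender].add(receiver)
    return graph

def structural_flags(graph, fanout_threshold=50):
    """Return nodes whose link structure (not content) looks anomalous."""
    return [node for node, peers in graph.items() if len(peers) >= fanout_threshold]

# Example: the analyst sees only endpoints and link counts, never payloads.
g = build_link_graph([("a", "b"), ("a", "c"), ("b", "c")])
print(structural_flags(g, fanout_threshold=2))   # ['a']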



It is this type of transparency that (would) allow the operation of a free market for service computations.  

Of course, the observed transactions must be economic in nature and devoid of individual privacy concerns.  How can this be without the consent of all participants?   Is this consent not the very nature of a marketplace, such as an open trading room?  A special part of the Internet is required (the safeNet?), is it not?

To keep privacy absolute is to defeat part of the value of service identification and orchestration.   Thus part of the implementation of service oriented computing involves the behavioral practices in a market where competition is structured by the presence of intentional data non-interoperability and secret relationships.  Secret relationships and non-interoperability must be overcome, otherwise the benefits from SOA will not be achieved.

Proposed Service Standard for Education 

The example my group is working on is a to-be-proposed service transaction model for any educational institution, K-14, college or university.  In schools and in the academy we find entrenched use of paper-based transactions that serve to empower deans, registrars, committees, etc. to maintain power over processes; sometimes for the good, but often serving private and dysfunctional intentions.  The virtualization of process maps is beginning to be applied by enlightened college presidents.  
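
As one hypothetical shape for such a transaction model (nothing below is the proposed standard itself, just an illustration of how a paper form might be virtualized and instrumented):

# A hypothetical shape for an educational service transaction: the paper form
# (student, approving office, action, state) virtualized as a service record
# whose every state change is recorded, so use of the service object is
# measurable without exposing the underlying documents.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class State(Enum):
    SUBMITTED = "submitted"
    APPROVED = "approved"
    DENIED = "denied"

@dataclass
class ServiceTransaction:
    student_id: str
    service: str            # e.g. "course-withdrawal", "transcript-request"
    approver_role: str      # registrar, dean, committee, ...
    state: State = State.SUBMITTED
    history: list = field(default_factory=list)

    def transition(self, new_state: State, actor: str) -> None:
        # instrumentation: every change is timestamped and attributed
        self.history.append((datetime.now(timezone.utc), actor, new_state))
        self.state = new_state

tx = ServiceTransaction("S1001", "course-withdrawal", "registrar")
tx.transition(State.APPROVED, actor="registrar-office")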

I feel that there are true solutions to this problem, but the complexity of the problem means that any such solution will itself be non-simple.  Fortunately, colleges and schools have so far resisted top-down IT imposition of service architectures such as (...---...), with strong arguments that the paper-based system at least provides the flexibility required in "their" specific situation.  

Flexibility and bottom up definition of "service"

How can flexibility be supported so that services are not defined top down?  

This means having a stratification of the measurement process, in which invariance is gathered into formal ontology and THEN a blueprint and aggregation process produces real-time models of interactions within the system.  A situational model is created through the use of underlying regularities in data exchanges in context.  Many of us know how this works but have not been able to get a market footprint. 
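
A minimal sketch of that stratification, under assumed data shapes (exchanges as source / message-type / target triples; the recurrence threshold and names are illustrative): invariants are harvested into a small ontology first, and only then are live exchanges aggregated into a situational model, with unrecognized events surfaced as human choice points.

# Two-layer (stratified) measurement, sketched under assumptions.
from collections import Counter

def harvest_invariants(exchanges):
    """Layer 1: exchanges are (source, message_type, target) triples.
    Message types that recur become categories in a small formal ontology."""
    counts = Counter(message_type for _, message_type, _ in exchanges)
    return {t for t, n in counts.items() if n >= 3}   # crude invariance test

def situational_model(ontology, live_events):
    """Layer 2: aggregate current events against the ontology (the blueprint
    step); events with no category are surfaced as human choice points."""
    model = {"recognized": Counter(), "choice_points": []}
    for source, message_type, target in live_events:
        if message_type in ontology:
            model["recognized"][message_type] += 1
        else:
            model["choice_points"].append((source, message_type, target))
    return model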

We need a capitalization process that is not overly predatory.  

The new information science 

Is information science correct?

The paradigm shift is mapped out, but the revolution has not yet begun.  

We may have an impedance mismatch between the types of solutions developed by the large IT vendors and solutions that are transformative of the markets.    Over the past seven years, leadership on this has not come from the government, nor from industry incumbents, even with the very high levels of funding at DARPA, NIST, NSF and OMB (e-gov).  Over the past decades there has been an experiment in allowing vested interests full control over paradigm development, through selective funding mechanisms.  

One of the core issues has to do with false requirements, such as a requirement that category theory and other foundational theories not be discussed in e-gov forums.  Those in the SOA CoP forum all know what the situation is.  There is a widespread culture of extreme forms of anti-intellectualism.  This culture is reinforced in the media and in politics and inhibits a proper exposition of information science.  

I assert that there is no simple solution to the problems caused by this entrenchment, and also that the solution must minimally meet the level of complexity of the actual problem.  To require an overly simplified architecture is to preserve a large IT consulting industry's right to WORK on the problem, but without solving the problem.  The American People certainly have paid for a comprehensive solution, but they have paid the wrong consultants.  I say this thankful for the rights we as individual citizens still have.  

The Service Oriented Computing paradigm must create underlying resources, commonly available to everyone, which "cover" the semantic space and which, when aggregated within a particular situation, create an ability to define a service and service fulfillment.  The particular situation may have unique aspects which may be the basis for a blueprint and for some human choice points.  
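
A small sketch of what such aggregation might look like (the cover elements and the notion of a "gap" are my own illustration, not a fixed vocabulary): commonly available cover elements are intersected with a situation's requirements, and whatever is not covered becomes an explicit human choice point rather than a hidden assumption.

# Sketch under assumptions: "cover elements" modeled as named semantic tags
# available to everyone; a service is definable in a situation only when the
# situation's requirements are spanned by some aggregation of those elements.
COMMON_COVER = {"identity", "enrollment", "payment", "scheduling", "records"}

def define_service(situation_requirements, cover=COMMON_COVER):
    """Return a service blueprint if the shared cover spans the situation,
    otherwise report the gap as explicit human choice points."""
    needed = set(situation_requirements)
    covered = needed & cover
    gaps = needed - cover
    return {
        "blueprint": sorted(covered),
        "choice_points": sorted(gaps),   # unique aspects left to stakeholders
        "fulfillable": not gaps,
    }

print(define_service({"enrollment", "records", "housing"}))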

The alternative is to have no ability to interpret the possible aggregation of semantic cover elements by stakeholders.  In this case, there is no real market, only a mechanical fulfillment of procurement hardwired to go to incumbents.  In the Resilience Project White Paper we make the argument that this hardwiring has occurred through the use of over $368 billion in federal expenditures since the year 2000.  According to various reports, this expenditure has been made to a small group of IT consultants as part of the e-gov activities.  



With the current machinery now in place, there can be no use of the openness of possible service providers and no marketplace having pure and perfect knowledge of services and of the consequences of using services...  hmm, sounds like Adam Smith's notion of a pure free market has been replaced by a highly structured and non-competitive mechanism that preserves the status quo.  

A hard wiring of service provision also reminds us of the government's federal and state response to natural disasters.  

Free markets?

Our current service markets are not pure free markets in the Smith sense.  Furthermore, an argument is made, by myself, that the vast expenditure on IT consulting (almost equal to the money spent on the wars) has created a dangerous concentration of wealth in the hands of a few individuals.  

As a result the market paradigm did not change with SOA.  What we see is merely one more round of IT changes without really getting a non-incremental change in market performance.  A new force meets an immovable object.  The object does not move, yet.   SOA changed to meet the reality of a non-free and non-pure market.  Phrases like "service orchestration" or "service discovery" are redefined to mean nothing of the sort.  This is pointed out by a number of authors.  

We might all agree that a free and pure market will be more efficient and will benefit humanity greatly, providing solutions to what are now intractable problems.

Everything is not dire, however.


A full and conceptual foundation to information science


I feel that a small community has defined a SOA paradigm with a full conceptual foundation.  My sense is that the conceptual foundation is optimal and thus will be seen by more than one of us.

Moreover, the foundation is likely to be revealed in a completely functional open source form.  I look forward to watching the system evolve to President Clinton's original conceptualization of e-governance.  


Comments?  to psp@xxxxxxxxxxxxxxxxxx   

Dr Paul S Prueitt
Associate Professor and Chair of Mathematics
Talladega College














 _________________________________________________________________
Subscribe/Unsubscribe/Config: http://colab.cim3.net/mailman/listinfo/soa-forum/
Shared Files: http://colab.cim3.net/file/work/soa/
Community Portal: http://colab.cim3.net/
Community Wiki: http://colab.cim3.net/cgi-bin/wiki.pl?AnnouncementofSOACoP