Mon 27 Mar 2006
Today, I had the great privilege of attending a conference call in which Keith Harrison-Broninski was presenting the concept of a Work Processor.
He argued that while production and transaction processes are well supported by enterprise software such as ERP, Workflow, and BPM, a new type of software is needed to support collaborative human work processes, which depend on interaction and are dynamically shaped by the participants.
This software tool, the work processor, would be based on a theory of human collaboration known as Human Interaction Management (HIM) that Harrison-Broninski has introduced in his seminal book “Human Interactions: the Heart and Soul of Business Process Management”.
The upcoming humanedj will be a proof-of-concept Human Interaction Management System (HIMS). This free software will be implemented as a peer-to-peer client program designed for use anywhere. Each client is a kind of agent playing a role in the process. The process is not predetermined but unfolds as the work is carried out, with the participants continually negotiating agreements on “how the work is to proceed from now on”.
Harrison-Broninski restated his prediction that SOA and BPM are moving away from orchestration and towards choreography, defining what systems will do in preference to how they actually do it. Since the users of a business application or process often do not share the same management, the participants need to agree on roles, interactions, obligations and responsibilities. Further, such agreements need to be continually redefined as the business environment changes. This is where the work processor steps in.
See also: Orchestration, choreography, and… impromptu.
Fri 24 Mar 2006
BEA held a seminar on the Enterprise Service Bus jointly with Mercury and Systinet (recently acquired by Mercury) this morning. They subscribed to Gartner’s definition of an ESB as the “Web services capable middleware infrastructure that supports communication and mediates application interactions” and made a case for its role and necessity in a Service-Oriented Architecture (SOA).
BEA’s new AquaLogic Suite represents a paradigm shift from application infrastructure (WebLogic) to service infrastructure. According to BEA, this paradigm shift manifests the following trends:
- functionality-oriented -> process-oriented
- designed to last -> designed to change
- long development cycle -> interactive, iterative development
- cost-centered -> business-centered
- application block -> service orchestration
- tightly coupled -> agile and adaptive
- homogeneous technologies -> heterogeneous technologies
- object-oriented -> message-oriented
- known implementation -> abstraction
As the level of abstraction rises to the enterprise level, IT becomes more a matter of configuration than development, and of stateless brokering rather than process-level management.
The most interesting presentation was made by Systinet on SOA Governance and the pivotal role of a corporate-level service registry. It was argued that the benefits of SOA cannot be achieved without appropriate governance. Ungoverned SOA comes with high costs due to:
- lack of reuse caused by compromised trust
- process disruption caused by service outages
- escalations in help desk and field support costs
- IT, business and regulatory noncompliance
- information access and security breaches
- overall SOA failure by allowing chaos to reign
The need for a single authoritative source for all metadata including taxonomies, policies, specifications and capabilities was accentuated. The bottom line was that SOA + governance = flexibility and consistency.
My question about the expected synergies of BEA’s Fuego acquisition went practically unanswered, but they promised to return to it in one month at BEA’s upcoming seminar on BPM and business rules.
Wed 22 Mar 2006
Posted by jjk under Technology
For some time, there has been discussion about the distinction between the concepts of orchestration and choreography in Business Process Management (BPM). I would summarize the consensus of these definitions as follows:
Orchestration is an imperative, formal description of the sequence and conditions in which an executable process invokes services and interacts with other processes in order to achieve its design objectives. It specifies how a composite service is hierarchically built from context-independent services by binding them to the context of the orchestration. Orchestration describes how things are done.
Choreography is a declarative formal description of the coordination between multiple participants, specifying their roles and observable message exchange. It is a contract governing the public (i.e., externally observable) behavior across multiple, independent parties as a set of messages and the sequencing of messages to accomplish a holistic goal, typically completion of a business transaction. Choreography describes what is to be done (and by whom).
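The two definitions can be sketched in code. Below is an illustrative Python sketch (all service and message names are invented for the example): orchestration as a single controller that imperatively drives subordinate services, choreography as a declarative contract on observable messages that each independent party can check for itself.

```python
# Illustrative sketch; all names are hypothetical stand-ins for real services.

# --- Orchestration: one controller imperatively owns the control flow. ---
def check_inventory(order):   # stub services standing in for real endpoints
    return True

def charge_customer(order):
    return True

def ship(order):
    return "shipped"

def orchestrate_order(order):
    """The composite service decides *how* things are done, step by step."""
    if not check_inventory(order):
        return "rejected"
    charge_customer(order)
    ship(order)
    return "completed"

# --- Choreography: no controller, only a contract on observable messages. ---
ORDER_CHOREOGRAPHY = [
    ("buyer", "seller", "PurchaseOrder"),
    ("seller", "buyer", "OrderConfirmation"),
    ("seller", "shipper", "ShippingRequest"),
    ("shipper", "buyer", "DeliveryNotice"),
]

def conforms(observed):
    """Each party can verify *what* must happen, and by whom, against the
    public contract, without any party controlling the others."""
    return observed == ORDER_CHOREOGRAPHY
```

Note that `orchestrate_order` could be rewritten freely without affecting the outside world, whereas changing `ORDER_CHOREOGRAPHY` would require renegotiation among all the parties.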
Putting my two cents in, I would view the distinction between orchestration and choreography as reflecting a more fundamental demarcation between the notions of control and coordination, respectively. Orchestration occurs within the scope of a single control; the orchestrated services are subordinate to the composite service. This control cannot be extended to “lateral” services outside the control scope of the orchestration, so a coordination mechanism is needed. Choreography is a contract, or protocol, governing this coordination.
Choreography is applicable to relatively stable collaborative processes such as B2B order management in which the parties agree upon the public process specifying the externally observable message interchange while managing their private processes individually. However, it is not flexible enough for adaptive and dynamic collaborations in which the interaction patterns cannot be anticipated.
Such collaborative processes include creative, innovative human activities such as research, design and project management. The activity sequencing of these processes cannot be prescriptively imposed; rather, the contract of interactions, deliverables and business rules is continually renegotiated during the life of the process. Human Interaction Management (HIM) has challenged the traditional BPM approach to managing this kind of “impromptu” process.
Tue 21 Mar 2006
Professor Heikki Hyötyniemi gave an interesting, interdisciplinary lecture on Neocybernetics at the annual meeting of The Finnish Society for Natural Philosophy tonight.
First he reviewed the basics of Hebbian learning and argued for “semantics through substance”: symbol grounding is hermeneutic and emphasizes relevance over truth. He then used the Hebbian model as a starting point for holistic considerations, generalizing to other domains such as ecology, economy and cognition. “Evolution is equally cruel in all environments,” as he put it. The basic tenets were that:
- The details are abstracted away — a holistic view.
- There exist local actions only, no structures of centralized control.
- Underlying interactions and feedbacks are consistent.
- One can assume stationarity and dynamic balance in the system in varying environmental conditions.
- Linearity is pursued as long as it is reasonable.
One of the important points was that as cybernetic systems perform “pattern matching”, the process can be substituted with the final pattern. Adaptation processes are very different but the end states are unique and generally characterizable.
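The point about end states can be made concrete with a small numerical sketch. Here I use Oja’s normalized Hebbian rule as the concrete model (my choice for illustration; the lecture’s exact formulation may differ): whatever the adaptation trajectory, the weight vector converges to the principal eigenvector of the input covariance, so the final “pattern” is characterizable without tracing the process.

```python
# Sketch: Hebbian adaptation whose end state is characterizable in closed form.
# Oja's rule is assumed here as a concrete normalized-Hebbian model.
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D inputs: the "environment" the system adapts to.
C = np.array([[3.0, 1.0], [1.0, 1.0]])          # input covariance
x_samples = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = rng.normal(size=2)                           # initial synaptic weights
eta = 0.01                                       # learning rate
for x in x_samples:
    y = w @ x                                    # neuron output
    w += eta * y * (x - y * w)                   # Hebbian term + decay (Oja)

# The process can be "substituted with the final pattern": w ends up
# aligned with the principal eigenvector of C, regardless of the path taken.
principal = np.linalg.eigh(C)[1][:, -1]
```

Different random initializations and sample orderings give very different adaptation processes, but the same (up to sign) unique end state.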
Unfortunately, Hyötyniemi skipped the mental system in his presentation due to time constraints, but the exemplary ecological simulations were also of interest. He provided some mathematical evidence of the robustness and biodiversity of ecological systems. Control in these systems is neither centralized nor distributed (in the traditional sense); rather, coordination occurs through the environment.
Natural philosophy was addressed in the true spirit of Heraclitus. I somewhat missed Hyötyniemi’s account of the relationship between information and material flows in a cybernetic system, but understood that information and matter are tightly intertwined and that “the natural system is a model” as much as “the model is a natural system”. He also suggested that nature is more like the work of a “hardworking idiot” than of an “intelligent designer”.
Fri 17 Mar 2006
Posted by jjk under Technology
The Semantic Web stumbles on the age-old symbol grounding problem put forth by Harnad: “How can the semantic interpretation of a formal symbol system be made intrinsic to the system, rather than just parasitic on the meanings in our heads?” In other words, how can the meaning of a term be grounded in anything but other terms that are themselves inherently meaningless? The meaning of symbols is designated by humans, external to the symbol system. Any attempt to make such a formal symbol system useful in a broader context results in rigorously unambiguous and exhaustive ontologies with little practical use in an ambiguous reality.
It is unrealistic, if not naïve, to assume that the organically grown Web could be transformed into a neat, well-defined, well-behaving Semantic Web, normatively imposing the semantics of web resources.
The Web 2.0 approach is more pragmatic. In contrast to formal classification methods, folksonomy refers to collaborative efforts of human communities to organize information, typically by tagging concepts with associative terms, social bookmarking being one of the most popular applications. This is more in line with the idea of society of mind, postulating that a naturally evolved cognitive system consists of processes known as agents.
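The mechanics of folksonomy are simple enough to sketch in a few lines of Python (the users, resources, and tags below are invented examples): each user independently attaches free-form terms to a resource, and the resource’s “meaning” emerges as the aggregate of those terms, with no imposed ontology.

```python
# Toy sketch of folksonomic tagging; all data here is an invented example.
from collections import Counter, defaultdict

# Each (user, resource, tag) triple is one independent act of tagging.
taggings = [
    ("alice", "http://example.org/soa-intro", "soa"),
    ("bob",   "http://example.org/soa-intro", "web-services"),
    ("carol", "http://example.org/soa-intro", "soa"),
    ("alice", "http://example.org/him-paper", "bpm"),
    ("bob",   "http://example.org/him-paper", "collaboration"),
]

# No formal classification scheme: the community's aggregate associations
# are the only "definition" a resource gets.
tag_counts = defaultdict(Counter)
for user, resource, tag in taggings:
    tag_counts[resource][tag] += 1

def dominant_tags(resource, n=2):
    """The emergent, consensus description of a resource."""
    return [t for t, _ in tag_counts[resource].most_common(n)]
```

The design choice is telling: the system stores only raw associations and defers all interpretation to the humans producing and reading the tags.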
In Web 2.0, concepts are defined in social interaction, and natural functional associations to other concepts are formed. There is no formal, machine-interpretable representation of concepts and their relationships, no automated logical deduction of propositions. But then again: is it really needed? Human collaboration in collectively classifying and defining concepts has proven to be very powerful. The wiki is a paradigmatic example of the architecture of participation: the users contribute to the system as an epiphenomenon of using it. As people ultimately define and interpret the semantics of a symbol system, why not regard them as parts of the system in the first place? Web 2.0 is inherently grounded in external reality through “human transduction”.
That said, the Semantic Web could redeem its promise if it took the organically defined and naturally grounded semantics of Web 2.0 as its computational primitives. The missing piece seems to be a mechanism to translate folksonomic semantics into a machine-interpretable formalism.
Thu 16 Mar 2006
Moblogging on my Nokia 6610i with Opera Mini browser from Oracle/Telelogic seminar on Business Process Modeling.
Talk about “Closed Loop BPM”. Recognition of business process as different from orchestration. Distinction between high-level BPMN process modeling and lower-level BPEL process implementation. Choreography and collaborative processes not addressed.
Sun 12 Mar 2006
Posted by jjk under Technology
Back in 1990, I was visiting an exhibition on Artificial Intelligence at Heureka. I was astonished at a “thinking machine” that responded more or less intelligently to my typing on a terminal. For a brief, exhilarating moment I believed I was dealing with real artificial intelligence. The machine passed my Turing test.
Too soon, however, I realized how this “artificial intelligence” was achieved. On the other side of the room, there was another terminal, identical to the one I was using, and someone gigglingly typing sophisticated answers to my questions. I was disillusioned. Deluded. Artificial intelligence was not real, it was… artificial.
I was reminded of this traumatic scam just recently when I came across Amazon’s Mechanical Turk. It is a web service providing a programmable interface to a network of humans who solve problems requiring human intelligence. Such Human Intelligence Tasks (HITs) include, but are not limited to, finding specific objects in pictures, evaluating beauty, or translating text. All software developers need to do is write normal code that calls the Mechanical Turk web service API. Behind the scenes, a network of humans completes the tasks and receives payment for the work.
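The programming model can be sketched roughly as follows. This is a hypothetical illustration, not the actual Mechanical Turk API: the client class, method names, and fields below are all invented, with a fake client standing in for the real service.

```python
# Hypothetical sketch of the programming model; client, method and field
# names are invented for illustration, not the real MTurk API.

class FakeHumanClient:
    """Stand-in that simulates a human worker answering instantly."""
    def create_task(self, question, reward):
        self._question = question
        return "task-1"

    def wait_for_answer(self, task_id):
        # A real client would poll the service; here a "human" replies at once.
        return "yes" if "dog" in self._question else "unknown"

def submit_hit(client, question, reward_cents):
    """Post a Human Intelligence Task and block until a human answers.

    To the calling code this looks like any other service invocation;
    that the backend is a person is just an implementation detail.
    """
    task_id = client.create_task(question=question, reward=reward_cents)
    return client.wait_for_answer(task_id)

# e.g. submit_hit(FakeHumanClient(), "Is there a dog in this photo?", 5)
```

From the requester’s side, nothing distinguishes this from calling any other web service, which is precisely the point of the post.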
In a way, this kind of “artificial artificial intelligence” passes the Turing test: you cannot tell whether the AI is artificial or not. In the wake of Service-Oriented Architecture (SOA), the distinction between automated and manual tasks is blurring. It is irrelevant whether a task is accomplished by computer automation or by human intellect. The requester makes no assumptions about the internal implementation of the service as long as the service behaves as specified in the service description.
As the software rises above the level of a single device, not only do humans use computer systems. The computer systems, increasingly, use humans.
Fri 10 Mar 2006
Today, at last, I obtained a virtual home, my own personal web page.
I started furnishing it by installing this weblog, or blog. Why? I have never felt compelled to publish a static home page, but plugging into the rapidly growing blogosphere of 30.2 million blogs is something essentially different.
A blog, it seems to me, instigates dialogue, both with others and within oneself. It disciplines thinking and self-directed learning. It allows me to express trains of thought that would otherwise fade away, to give them an organized, rhetorical form. It makes me think. It makes me communicate.
It makes me a cog in the machine, yet another synapse in the massive computation of this glass bead game.