8 Feb 2009
Basic abstractions of interactive computation
The EBNF notation used to describe the syntax of programming languages is based on simple recursive substitution rules. There are different kinds of recursive substitution rules; the Chomsky hierarchy describes the four basic types of grammars which can be used to generate languages: unrestricted, context-sensitive, context-free, and regular. Yet all of these languages, up to the recursively enumerable ones, can be recognized by a primitive serial Turing Machine. John Backus said “Conventional languages are basically high-level, complex versions of the von Neumann computer.” All programming languages are ways to express primitive serial computations for serial von Neumann computers.
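To make the idea of recursive substitution rules concrete, here is a minimal sketch in Python of a recursive-descent recognizer for a tiny context-free grammar; the grammar and all names are my own illustration, not taken from any of the cited works:

```python
# A minimal sketch: a recursive-descent recognizer for the toy
# context-free grammar  expr ::= term { "+" term } ; term ::= "x" | "(" expr ")"
# Each grammar rule becomes one (recursive) function.

def parse_expr(tokens, i=0):
    i = parse_term(tokens, i)
    while i < len(tokens) and tokens[i] == "+":
        i = parse_term(tokens, i + 1)
    return i

def parse_term(tokens, i):
    if i < len(tokens) and tokens[i] == "x":
        return i + 1
    if i < len(tokens) and tokens[i] == "(":
        i = parse_expr(tokens, i + 1)
        if i < len(tokens) and tokens[i] == ")":
            return i + 1
    raise SyntaxError(f"unexpected token at position {i}")

def recognize(s):
    tokens = list(s)
    return parse_expr(tokens) == len(tokens)

print(recognize("x+(x+x)"))  # True: the string is in the language
```

One recursive function per rule is exactly the sense in which context-free syntax is nothing but a simple recursive pattern.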
If we deal with parallel, distributed, or agent-oriented computing systems, the situation becomes more interesting, and we must go beyond primitive serial machines. To describe this case we would need not a single Turing Machine, but a number of coupled Turing Machines (type two Turing Machines), similar to Wegner’s interaction machines. Would it be possible to describe a language or pattern for this system by a kind of fractal? Is there a network analog of a recursive function? Is it possible to describe the interplay of liveness and safety properties in distributed algorithms as a strange attractor? Or would it be possible to distinguish between different forms of organization in coupled Turing Machines by using the concept of emergence?
Eberbach, Goldin and Wegner say that interaction with the environment is the key difference: “Interactive computation involves interaction with an external world, or the environment of the computation, during the computation — rather than before and after it, as in algorithmic computation.” (in “Turing’s ideas and models of computation”, 2003). The interesting thing about the Turing Machine is that it is as simple as possible, and yet can describe nearly all computations and languages we currently use. Therefore it comes close to a “Theory of Everything” in computer science: it describes everything with nearly nothing. Is it already a theory of everything in computer science? Or can we find fundamental models for other forms of computation as well, something which captures the essence of cellular automata, neural networks and multi-agent systems, similar to the interaction machines of Wegner, Eberbach and Goldin? Copeland talks about hypercomputation. Stepney calls it non-classical or non-standard computation. Robin Milner argues that a structural theory of interaction may burst the von Neumann bottleneck, and says that “we must find an elementary model which does for interaction what Turing’s logical machines do for computation” (see “Turing, Computing and Communication”).
Then he contrasts “Old Computing”…
Finite Computation – Sequential – Prescription – Instruction
…with “New Computing”:
Continuing Interaction – Parallel – Description – Recognition
Is he right? To find an elementary model it is useful to identify the basic abstractions. The basic abstractions of sequential computing are code and data, which are manipulated by read & write (or load & store) operations, the essential operations of the Turing Machine and the von Neumann architecture (a small sketch follows the list):
– data structures: number, set, record, array, list
– control structures: if-then-else, loops (for, while, repeat)
– operations: read & write (or load & store, destroy & create)
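As a small illustration that read & write (plus a head move) really suffice as primitive operations, here is a sketch of a one-tape Turing Machine in Python; the transition table is a toy example of my own, not from the post:

```python
# A minimal sketch of a one-tape Turing Machine: every step is just a
# read, a write, and a head move. The rule table below is a toy example
# (it appends a "1" to a unary number) and is purely illustrative.

def run(tape, rules, state="start", head=0):
    tape = dict(enumerate(tape))           # sparse tape, blank = " "
    while state != "halt":
        symbol = tape.get(head, " ")       # read
        state, write, move = rules[(state, symbol)]
        tape[head] = write                 # write
        head += {"L": -1, "R": 1}[move]    # move the head
    return "".join(tape[i] for i in sorted(tape)).strip()

# Increment a unary number: scan right, write "1" on the first blank.
rules = {
    ("start", "1"): ("start", "1", "R"),
    ("start", " "): ("halt",  "1", "R"),
}
print(run("111", rules))  # "1111"
```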
Other operations, such as the classic move operation, are combinations of read and write functions. Arithmetic operations (inc, dec, add, sub, etc.) can in principle be implemented by a suitable combination, too. Whether a high-level program has valid syntax can be checked with a suitable context-free grammar for the programming language. High-level abstractions for sequential computation are procedures and algorithms. According to Abelson and Sussman’s book SICP, “a procedure is a pattern for the local evolution of a computational process” (“Structure and Interpretation of Computer Programs”, chapter 1.2). Sequential computation means following a precise computational recipe: a procedure, a program, an algorithm or other step-by-step description of what to do.
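SICP’s point can be illustrated with a short sketch (in Python rather than SICP’s Scheme; the function names are mine): two procedures compute the same function, yet as patterns they generate different local evolutions of the process:

```python
# Two procedures for the same function, after SICP chapter 1.2: each is
# a pattern for a different local evolution of the computational process.

def factorial_recursive(n):
    # Generates a linearly growing chain of deferred multiplications.
    return 1 if n == 0 else n * factorial_recursive(n - 1)

def factorial_iterative(n, acc=1):
    # Generates an iterative process: all state is kept in (n, acc).
    while n > 0:
        n, acc = n - 1, acc * n
    return acc

print(factorial_recursive(5), factorial_iterative(5))  # 120 120
```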
Interactive, real-time computation is different. It must keep pace with its environment, which requires parallel or distributed architectures. The basic abstractions of distributed computing are processes and messages, which are manipulated by receive & send operations (or perceive & act), the essential operations of agents and networks (a small sketch follows the list):
– data structures: messages
– control structures: events, threads (for consumers and producers)
– operations: receive & send (or perceive & act)
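Here is a minimal sketch of these abstractions in Python, assuming nothing beyond the standard library: two threads as producer and consumer, communicating only through messages on a channel:

```python
# A minimal sketch of the receive & send abstraction: two threads
# (a producer and a consumer) communicating only through messages.
import queue
import threading

channel = queue.Queue()  # the only shared state: a message channel

def producer():
    for i in range(3):
        channel.put(f"message {i}")   # send
    channel.put(None)                 # sentinel: no more messages

def consumer():
    while True:
        msg = channel.get()           # receive (blocks until a message arrives)
        if msg is None:
            break
        print("received:", msg)

threading.Thread(target=producer).start()
consumer()
```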
Instead of perceive and act one can also say classify & signal, reduce & map, collect & emit, gather & update, or propose & decide. The repeated application of the first step (perception, classification, reduction) leads to abstraction. The repeated application of the second step (action, signaling, mapping) leads to dissemination and broadcasting. The broadcast operation is a combination of send operations. In a hierarchy of layers, the effect of mapping, signaling, and notification can also be considered as the opposite of abstraction, which is implementation or instantiation.
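As a toy illustration of this duality (all names are mine): classification reduces many perceptions to one abstract label, and broadcast disseminates that label as a combination of send operations:

```python
# Abstraction and dissemination as duals: reduce many inputs to one
# label, then send the one label to many receivers.

def send(subscriber, message):
    print(f"to {subscriber}: {message}")

def broadcast(subscribers, message):
    for s in subscribers:          # dissemination: repeated send
        send(s, message)

def classify(perceptions):
    # abstraction: reduce many perceptions to the most frequent label
    return max(set(perceptions), key=perceptions.count)

label = classify(["red", "red", "green"])
broadcast(["node-1", "node-2", "node-3"], label)
```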
Receive and send functions can also be combined to implement all kinds of reactive behavior and to achieve broker, dispatcher or transmitter functionality: certain perceptions are mapped to particular actions. An essential abstraction in distributed computing is consensus, where the processes try to make a consistent decision and agree on a common data value. Special cases (a toy sketch of one follows the list):
* synchrony: agreement on a certain time
* election: agreement on an elected leader
* atomic commit: agreement on the outcome of a transaction
* total order broadcast: agreement on a sequence of messages
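Here is the promised toy sketch of one special case, election: each process proposes its own id and all decide by the same deterministic rule, which is the core idea of a bully-style election (failure handling and real message passing are omitted, and the rule itself is my simplification):

```python
# A toy sketch of leader election as consensus: every process proposes
# its id, and all agree on the highest one. Failures and actual message
# passing are deliberately omitted.

def elect_leader(process_ids):
    # propose & decide: each process proposes itself,
    # and all apply the same deterministic decision rule
    return max(process_ids)

processes = [17, 4, 42, 8]
leader = elect_leader(processes)
# agreement: every process, deciding independently, picks the same leader
assert all(leader == elect_leader(processes) for _ in processes)
print("elected leader:", leader)  # 42
```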
Now, how can we check whether a high-level distributed program has a valid syntax or correct behavior? It would be fascinating if it were possible to define a fractal pattern for coupled Turing Machines that describes the strands and braids of information spreading through the system. Or is it perhaps possible to use liveness and safety properties to define suitable attractors of the system?
Goldin and Wegner say that the classical view of computing treats “computation as a closed-box transformation of inputs (rational numbers or finite strings) to output” (see also Dina Goldin and Peter Wegner: “The Interactive Nature of Computing – Refuting the Strong Church-Turing Thesis”). The new, interactive view of computing describes computation as an ongoing interactive process. In sequential computing, the syntax of the program is checked against a suitable context-free grammar to ensure the application does not stop unexpectedly: if the syntax is violated, an exception is raised and the execution of the program comes to a full stop. An object is called once, and should instantly deliver a result to any function call.

An agent, in contrast, is called continuously, because it is embedded in its environment through a perceive-reason-act cycle. It does not need to react to every request, but it should not stop working completely. In distributed computing, even if a single node crashes, the rest of the nodes continue to work. Rather than guaranteeing that a single node will never stop, it is important for an embedded system to ensure that the whole system continues to work. The interaction must go on, and the flow of information from sensors to motors (or actuators) must not stop. This can be achieved by a suitable activation or modulation of the whole system. An agent or animal which gets into a situation where absolutely no action or output is generated would be trapped forever and would probably die. Therefore it is necessary to always guarantee an optimal level of activity. In the brain, there are a number of diffuse modulatory neurotransmitter systems, which work with Dopamine, Noradrenaline, Serotonin, and Acetylcholine. They modulate and regulate the activity of large populations of neurons; the brainstem and the thalamocortical system always guarantee an optimal level of activity.
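A hedged sketch of such a perceive-reason-act cycle in Python (the percepts, actions and the fallback rule are my own illustration): a default action guarantees a minimal level of activity, so the flow from sensors to actuators never stops:

```python
# A minimal perceive-reason-act loop: the agent is called continuously,
# and a default action guarantees that the flow from sensors to
# actuators never stops, even when reasoning produces nothing.
import random

def perceive():
    return random.choice(["food", "danger", None])  # None: nothing sensed

def reason(percept):
    return {"food": "approach", "danger": "flee"}.get(percept)

def act(action):
    print("acting:", action)

for _ in range(5):                  # the interaction must go on
    action = reason(perceive())
    act(action or "explore")        # fallback keeps activity above zero
```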
Furthermore it is important to ensure the consistency of the system: contradictory and inconsistent processes must be avoided, because they could lead to permanent deadlocks. One way to achieve consistency is to make cognitive dissonance unpleasant. Cognitive dissonance was first investigated in psychology by Leon Festinger in 1957. It is an uncomfortable feeling caused by holding two contradictory or inconsistent ideas simultaneously; cognitive consonance, in contrast, is a comfortable feeling. There is a tendency for individuals to seek consistency among their perceptions, beliefs and opinions: they distort the uncomfortable perception so that it does make sense, for example by reducing the importance of the dissonant beliefs, by adding more consonant beliefs that outweigh the dissonant ones, or by changing the dissonant beliefs to make them consistent. Or they focus on the situation and resolve the incongruity by suddenly understanding it: this resolution of incongruity is the kernel of humor and laughter.
Suitable high-level abstractions for interactive computation can perhaps be found in the area of fluid dynamics: wave, flow, modulation, sink, source, channel or conduit. Maybe it is useful to compare the computation of a distributed system to a liquid: waves of activation can spread through the system, stress bubbles can prepare the system for action, flows of information can pass through certain channels from sources to sinks, and the degree of activation and modulation can be compared to the surface level of the liquid.