There are many different models of computation. Here's an uncommon one which could have relevance in the future.
I love programming languages. I don't know why, but whenever I hear of a new one I think, well, here's a chance to think about problems in a different way. Some languages push you toward new insights and others are just crude variations on the last popular language. Yet others are just fun.
If you have a chance, try googling esoteric programming languages, befunge, brainf**k, unlambda, or thue. There is a thriving subculture of people who enjoy designing and reading about these small, quirky languages (here I'm using the word "thriving" as a euphemism for sporadic postings by people with interests like mine on an extremely low-volume mailing list: firstname.lastname@example.org). But, hey, not all interests can be popular.
Most esoteric programming languages are designed to be weird for weirdness' sake, as a sort of reclusive form of performance art, but a few weeks ago I was reminded of an extremely interesting programming language. I don't know if anyone has branded it esoteric, but I know that I've had more fun thinking about it than any one person should have. The name of the language is Lucid. I was reminded of it by a mention in a recent interview with Alan Kay (the father of Smalltalk). It brought back the weeks I'd spent with a checked-out library copy of a book called Lucid, the Dataflow Programming Language back when I was in school. It was great fun trying to decipher it, but most of it was beyond me back then.
So, here's Lucid...
Lucid is a non-imperative dataflow language, designed over 20 years ago by William W. Wadge and Edward A. Ashcroft. The idea is that nearly all programming languages are based on the idea of flow of control. What do we get when we try to orient around flow of data instead? One immediate reaction would be, "well, I bet it's like programming with pipes and filters in UNIX", and sure enough it is, but Lucid goes a little beyond that. Let's build up a simple example.
Here is a simple program in Lucid (as I understand it; I haven't found an interpreter yet):

x
That was simple enough. We typed in a variable name, x. The program just sits there until we give it a value for x. Suppose we type in 3. If we do, the program will reply with a 3. The value 3 is accepted by x, so that is what we get when the program is evaluated. Let's look at another program:
x + y
We type that in, and the program just sits there waiting, so we type in 31. More silence as the program waits... then we type in another number: 11. The program responds with 42. What's going on here? Well, Lucid doesn't evaluate expressions in a conventional way. It evaluates an expression until it gets to a variable that it doesn't know, then it patiently waits for a value (a daton) on the input stream. Once it has all the values it needs, it advances to the next stage of computation. Sometimes that involves outputting the next value.
Interestingly, this means that expressions have state in the language. When the expression x + y sees 31, the 31 is held in memory until another value is seen. Then the two values are added together and flushed out to the output stream. I'll show a larger Lucid program a little later, but it's kind of interesting to notice right off the bat that Lucid programs have persistent state by default, and they don't ever have to halt. You can pause them by cutting off the flow of data into them, but then they will just sit there, with data in them, in much the same way that plumbing without pressure holds residual water.
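One way to picture this outside Lucid is with generators, which also sit idle until the next value arrives. This is a sketch of my own in Python, not anything from the book, and the function name is invented:

```python
def add_streams(xs, ys):
    """Mimic the Lucid expression x + y: consume one daton from each
    input stream, emit their sum, and wait for the next pair."""
    for x, y in zip(xs, ys):
        yield x + y
```

Feeding it 31 on one stream and 11 on the other produces 42, just like the example above.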
Here is a slightly more involved Lucid program:
total = 0 fby total + x
This program computes a running total of its inputs. To understand it, you have to know that fby is a binary operator named "followed by." The statement total = 0 fby total + x should be read as "the value of total starts at 0; at each subsequent step it becomes total + x." Because we have not given a value to x, x will pull the next value off the input stream when it is evaluated.
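In generator terms, fby just says "emit this value first, then follow the rest of the stream." Here is my own Python sketch of the idea (the names are invented; running_total simply unrolls the recursive total = 0 fby total + x definition into an accumulator):

```python
def fby(head, rest):
    # Lucid's "followed by": the stream starts with head, then follows rest.
    yield head
    yield from rest

def running_total(xs):
    # total = 0 fby total + x: emit 0, then the total so far after each input.
    total = 0
    yield total
    for x in xs:
        total += x
        yield total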
So far these examples have been pretty tame, but there are some very interesting/odd features in the language. One of them is an operator named next. The next operator returns the next value in a stream. So, for instance, the expression:
next a
won't produce anything when we give it its first value, say 0, for instance. But when we give it another value, say, 1, it will produce 1. If we then give it 4, we will get 4. With next, we can have programs like this one:
sum = first(input) fby sum + next(input);
n = 1 fby n + 1;
running_avg = sum / n;
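Without an interpreter in hand, here is how I picture next and the running-average program in Python generator terms. This is a sketch under my own assumptions; the function names are invented, and the recursive Lucid definitions are unrolled into plain accumulators:

```python
def next_stream(xs):
    # Lucid's "next": the input stream with its first daton dropped.
    it = iter(xs)
    next(it)              # discard the first value
    yield from it

def running_avg(xs):
    # sum = first(input) fby sum + next(input);  n = 1 fby n + 1;
    # running_avg = sum / n -- the mean of everything seen so far.
    total, n = 0, 0
    for x in xs:
        total += x
        n += 1
        yield total / n
```

Feeding 2, 4, 6 into running_avg yields 2.0, then 3.0, then 4.0.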
And that isn't all. Lucid has a series of other operators, with names like asa (read: as soon as), upon (advance upon), and whenever. They are used to pause, condition, and filter flows within programs.
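To give a flavor of one of these, whenever passes along values of one stream at the points where a second, boolean stream is true. Again, this is my own generator sketch of the idea, not code from the book:

```python
def whenever(xs, bs):
    # Lucid's "whenever": keep each x whose matching boolean daton is true.
    for x, b in zip(xs, bs):
        if b:
            yield x
```

So filtering 1, 2, 3, 4 against true, false, true, false lets through 1 and 3.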
Lucid isn't a very active language, to say the least, but from what I've read it influenced a series of other dataflow languages in the mid-'80s. Last week, I purchased a copy of the one and only book about Lucid, Lucid, the Dataflow Programming Language (Academic Press, 1985, ISBN 0-12-72965-1), from a used book seller, just to satisfy my curiosity. Sadly, the book is out of print, but there is other information on Bill Wadge's website.
If you ask me why I'm reading about Lucid, I'd have to say that I don't know. Just curiosity, I guess. But I think there is something deeper. I really like how Lucid seems to make computation very tangible. Object orientation does this with message passing; Lucid does it with the metaphor of infinite streams of values flowing through a program. Both seem to encourage the same sort of operational thinking. The fact that Lucid is completely different makes it very intriguing to me. From what I've read, functional (Lisp-ish) thinking is as much of a drawback in dataflow programming as imperative thinking.
Practical value? I don't know, but recently there has been a lot of talk about how the pace of processor speed improvements is slowing. One of the motivations behind Lucid was to create a language that could easily take advantage of parallelism. Think of an architecture where a program is a vast network of pipes and each stage in the pipeline can be dedicated to a processor. The data flows from stage to stage with much of the computation happening in parallel. That was the dream, but this language is 20 years old. I'm curious about what happened to this line of research.
Note: most of the examples in this blog were gleaned from Wadge and Ashcroft's book. Without having an interpreter in hand, I wasn't comfortable publishing new ones.
Of potential interest is work at Concordia University on an intensional programming environment called GIPSY. See http://newton.cs.concordia.ca/%7egipsy/. They also provide a nice definition of Lucid:
Lucid is a multidimensional intensional programming language whose semantics is based on the possible world semantics of intensional logic [1,10]. It is a functional language in which expressions and their valuations are allowed to vary in an arbitrary number of dimensions. Intensional programming (in the sense of Lucid) has been successfully applied to resolve problems with a new perspective that enables a more natural understanding of problems of intensional nature. Such problems include topics as diverse as reactive programming, software configuration, tensor programming, and distributed operating systems. However, these projects have all been developed in isolation. GIPSY aims at the integration of all these different perspectives of intensional programming.
I tried programming in a graphical data-flow language called "ProGraph" (Mac only, I believe). It was actually a mixture of text and drawing lines to connect things, and the drawing lines part was very tedious. Check out:
> If you have a chance, try googling esoteric programming languages, befunge, brainf**k, unlambda, or thue. There is a thriving subculture of people who enjoy designing and reading about these small, quirky languages. [...] But, hey, not all interests can be popular.
INTERCAL cannot be missed when someone talks about this subject...
Described by a manual that circulated for years after the short life of the first implementation, reducing strong men to tears (of laughter). Revived in 1990 by the C-INTERCAL compiler, and now the center of an international community of technomasochists.
is what ESR says on his website: http://www.catb.org/~esr/intercal/
> One of the motivations behind Lucid was to create a language that could easily take advantage of parallelism. Think of an architecture where a program is a vast network of pipes and each stage in the pipeline can be dedicated to a processor. The data flows from stage to stage with much of the computation happening in parallel. That was the dream, but this language is 20 years old. I'm curious about what happened to this line of research.
This idea of dataflow programming is very much alive in the HDL (Hardware Description Language) world. In fact, HDL descriptions are often referred to as dataflow models, depending on how the model is written (there are also behavioral and structural levels of modeling). The two most prominent HDLs are VHDL and Verilog. HDLs are used to model and simulate hardware. In the 'old days', circuits used to be described using schematics, which visually show the connection of wires between gates; for the last 10+ years or so, HDLs have come to replace schematics for digital logic design. As you can imagine, gates in a chip are always working in parallel on their inputs, so HDLs provide a way to model something which is highly parallel: digital logic in a chip (ASICs, FPGAs, etc.).
In VHDL, for example, you can have several process statements. Each process has a sensitivity list (a list of signals which will trigger execution of the process when they change). So for example, you could have two processes like so:
process(clk)
begin
  if (clk'event and clk = '1') then
    if (rst = '1') then
      count <= 0;            -- reset count
    else
      count <= count + 1;    -- increment count
    end if;
  end if;
end process;
process(count)
begin
  if (count = 100) then
    signal_100 <= '1';
  else
    signal_100 <= '0';
  end if;
end process;
Conceptually, both of these processes are considered to be executing in parallel, waiting for signals in their sensitivity lists to change. The first process would be triggered by a change in the signal 'clk' and would either reset or increment the 'count' signal depending on the condition of the 'rst' signal (a synchronous reset). The second process will be triggered every time 'count' changes value. There could be any number of these concurrent processes operating in parallel. This is of course a very simple example; you could have much more complex interactions, considering that VHDL is used for designing processors and even very complex communicating state machines.
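As a rough illustration of the sensitivity-list idea, here is a toy event-driven kernel in Python. This is my own sketch, not any real simulator's API; the class and signal names are invented, and real VHDL simulation involves delta cycles and timing that this ignores:

```python
class Sim:
    """Toy simulation kernel: a process runs whenever a signal in its
    sensitivity list changes value."""
    def __init__(self):
        self.signals = {}
        self.procs = []                     # (sensitivity set, function)

    def on(self, names, fn):
        self.procs.append((set(names), fn))

    def set(self, name, value):
        if self.signals.get(name) == value:
            return                          # no event, nothing fires
        self.signals[name] = value
        for names, fn in self.procs:
            if name in names:
                fn(self)

sim = Sim()
sim.signals.update(clk=0, rst=0, count=0, signal_100=0)

def counter(s):                             # ~ process(clk)
    if s.signals["clk"] == 1:               # rising-edge guard
        if s.signals["rst"] == 1:
            s.set("count", 0)               # reset count
        else:
            s.set("count", s.signals["count"] + 1)

def watcher(s):                             # ~ process(count)
    s.set("signal_100", 1 if s.signals["count"] == 100 else 0)

sim.on(["clk"], counter)
sim.on(["count"], watcher)

for _ in range(100):                        # drive 100 clock cycles
    sim.set("clk", 1)
    sim.set("clk", 0)
```

After 100 rising edges, count reaches 100 and the watcher raises signal_100, with each change in one signal cascading into the processes sensitive to it.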
If you're interested in playing with an HDL implemented on top of Ruby you can check out my RHDL package here: http://www.aracnet.com/~ptkwt/ruby_stuff/RHDL/ It uses continuations to implement VHDL-like processes as described above (including sensitivity lists). While I originally created RHDL to model and simulate hardware, I actually think that some of these concepts would be valuable in software development as well (especially as our computers begin to have many processors as in the Cell architecture).
Dataflow is perhaps one of the least appreciated and acknowledged forms of program structure. The recursive, self-referential nature of dataflow definitions can complicate the understanding of program flow.
I think that I would prefer to write the program linearly, and then have a dataflow reduction be performed on that program for execution in a dataflow OS/environment.
The Lucid-based examples shown here and in some of the references show simple concepts. More complex examples show how difficult it can be to maintain coherent program structures. This is not just a Lucid issue, but it seems that the complexity curve can be much steeper per line of code.