
Composition, temporal logic and CSP.



Notes on "Composition: A Way to Make Proofs Harder" by Leslie Lamport
=====================================================================

I have now struggled through the whole of "The Temporal Logic of Actions",
Lamport's ACM TOPLAS paper explaining TLA (Vol. 16, No. 3, p. 872, 1994).
I confess that I skimmed a few sections, so I don't vouch for the accuracy
of these comments. But as no one else has volunteered any expertise on
temporal logic, this will have to do... 

I have to say that I do not find TLA very appealing, but as Lamport is at
pains to emphasize, any unfamiliar notation presents an obstacle and one
needs time to become comfortable. Even so, he does seem to make a virtue
of avoiding well-established mathematical notations, which I found irritating.

TLA is in the same family as Unity, and of course, ordinary temporal
logic. It is really a specification language, but that includes programs.
Lamport makes great play of the fact that it is untyped: he emphasizes
that a type theory adds complexity and introduces some undecidable
cases. There is much in common with Z, except of course that
Z is typed (up to a point), and type checking a Z specification
is in my experience a tremendous advantage: it catches so many of my
errors. He makes no mention of Z. Yet he comments that 
 "The main reason mathematical formulas seem to get very complicated
  when they get large is that mathematicians have not developed
  notations for structuring large formulas."
As I have mentioned before, a major contribution of Z is the use of
schemas as a way of structuring large specifications. But Lamport does
use -- guess what -- *indentation* for writing out predicates!
Whatever next? Folding perhaps? :-)

TLA seems to be a world of expressions evaluated on traces (he uses
this term in some later papers, but calls them behaviours earlier).
And his traces are infinite sequences of states, a state being a
mapping from a complete universe of Names-->Values. He presents an
abstract logic with a fairly small set of inference rules. And a
major aim seems to be automatic verification. The "temporal" aspect is
this inclusion of a sequence of states, and the introduction of the
"always" and "eventually" operators. They are just the obvious
quantifiers over the sequence: \sigma |= []F just means that F is true
of every suffix of \sigma (for a state predicate, of every state).
"Always" is written [], not to be confused with ALT! The "Action"
aspect is an attempt to constrain and simplify temporal logic. An
action is a state transformation --
expressed as a predicate. In order to allow refinement, and because
state covers the whole universe (hiding is included, but hides only
variables within expressions, *not* parts of state), he needs to
permit "irrelevant" state transitions. This is what he calls
"stuttering".  But now there are state transitions/actions which make
no progress, so such specifications now have to include "liveness"
requirements. Since nondeterminism is present because specifications
will almost always omit details, the question of "fairness"
arises. Lamport asserts that TLA allows programs to be specified in a
normal form: Init /\ [][Next]_v /\ Fairness -- initial conditions,
"always either a Next step or a stutter on v", and fairness.
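To make that semantics concrete for myself, here is a toy sketch (my own,
not Lamport's notation, and all the names are my inventions): a behaviour
is modelled as a *finite* prefix of his infinite state sequences, [] and <>
become quantifiers over suffixes, and [Next]_v is "a Next step or a stutter
on v".

```python
# Toy finite-trace model of TLA semantics: a behaviour is a sequence of
# states, each state a mapping Names --> Values (here, a Python dict).
# We truncate to a finite prefix, so []F is only checked on that prefix.

def suffixes(trace):
    """All suffixes of the (finite prefix of a) behaviour."""
    return [trace[i:] for i in range(len(trace))]

def always(pred):          # []F: F holds of every suffix
    return lambda trace: all(pred(s) for s in suffixes(trace))

def eventually(pred):      # <>F: F holds of some suffix
    return lambda trace: any(pred(s) for s in suffixes(trace))

def state_pred(p):         # lift a predicate on states to one on suffixes
    return lambda trace: p(trace[0])

def boxed_action(next_rel, var):
    """[Next]_v: every step is either a Next step or stutters on v."""
    def holds(trace):
        return all(next_rel(a, b) or a[var] == b[var]
                   for a, b in zip(trace, trace[1:]))
    return holds

# A one-variable counter: Init is x = 0, Next increments x.
init = lambda s: s["x"] == 0
nxt  = lambda a, b: b["x"] == a["x"] + 1

behaviour = [{"x": 0}, {"x": 1}, {"x": 1}, {"x": 2}]   # note the stutter

spec_holds = (state_pred(init)(behaviour)
              and boxed_action(nxt, "x")(behaviour))
print(spec_holds)   # True: the stuttering step is permitted by [Next]_x
```

The point of the stuttering step in the example is exactly the one above:
without the `a[var] == b[var]` disjunct, the "irrelevant" transition would
falsify the specification, and refinement would not work.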

[Aside: I wonder whether we can capture all forms of fairness with
priority?]

As I understand it, the essence of TLA is to focus on that normal
form, and express all properties in that way. Then he can provide a
very small logic largely as a set of inference rules that can
potentially be automated.

From that perspective, CSP is more like a PPL: a pseudo programming
language. It goes beyond the "pure mathematical world", by which
Lamport seems to mean just pure set theory.

He points up the problem of translating from the TLA world into that
of "real" programming languages, focusing not only on the lack of
semantics, but also on the "atomicity" of concurrent
operations. Critical sections and such. We, of course, have a
complete answer to all that. occam usage rules mean that we have no
such worries. What I do worry about is proving that compilers
correctly implement the occam/CSP primitives... 

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Now to composition:
==================

As far as I can see, Lamport makes no mention of the algebraic semantics
that are usually used to reason about CSP and occam, nor of the
refinement relation. Rather he seems to be attacking methods like
the Owicki-Gries method, which I have never seen used to discuss an occam
program. Insofar as we might be modelling state changes in occam
rather than communication, TLA might be useful.

Lamport:
 "In pseudo-programming language terminology, a compositional proof of
   refinement (implementation) is one performed by breaking a 
   specification into the parallel composition of processes and 
   separately proving the refinement of each process."

Not usually in CSP.  Rather we would prove (perhaps using FDR) that
P1 \refines P2. And in general there may not even be a decomposition
into a nontrivial set of parallel processes.

We use the mathematical models (denotational and/or operational semantics)
to establish the algebraic laws with which to establish our
results. That is true even of FDR, which internally runs essentially
an operational-semantics model to establish its results (if my
understanding of FDR is correct). 
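For concreteness, here is a toy sketch of what a trace-refinement check
P1 \refines P2 amounts to -- emphatically *not* FDR's algorithm (FDR
normalizes the specification and also handles the failures and divergences
models); this just enumerates bounded traces of two small labelled
transition systems, with all names my own:

```python
# Toy bounded check of CSP trace refinement.  A process is a labelled
# transition system: a dict mapping each state to a list of
# (event, next_state) pairs.

def traces(lts, start, depth):
    """All event traces of length <= depth from the start state."""
    result = {()}
    frontier = [((), start)]
    for _ in range(depth):
        new_frontier = []
        for tr, s in frontier:
            for event, t in lts.get(s, []):
                tr2 = tr + (event,)
                result.add(tr2)
                new_frontier.append((tr2, t))
        frontier = new_frontier
    return result

def refines(spec, spec0, impl, impl0, depth=6):
    """SPEC [= IMPL in the traces model: traces(IMPL) subset of traces(SPEC)."""
    return traces(impl, impl0, depth) <= traces(spec, spec0, depth)

# SPEC = (a -> b -> SPEC): alternate a and b forever.
spec = {0: [("a", 1)], 1: [("b", 0)]}
# IMPL = (a -> b -> STOP): one round, then deadlock.
impl = {0: [("a", 1)], 1: [("b", 2)], 2: []}

print(refines(spec, 0, impl, 0))   # True: every trace of IMPL is in SPEC
```

Note that in the traces model even STOP refines everything, which is
precisely why the failures and divergences models matter -- and why a
real tool like FDR checks more than trace inclusion.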

So my feeling is that his argument is of little if any relevance to the 
CSP family, despite his explicit mention of it. I have not worked through
the later part of the second article, and it may be that at a deeper
level a CSP proof might be making use of the sort of decomposition
that he denigrates.

Later on he says:-

 "For centuries, bridge builders, rocket scientists, nuclear
  physicists, and number theorists have used their own
  abstractions. They  have all expressed those abstractions directly
  in mathematics, and have reasoned "at the semantic level". Only
  computer scientists have felt the need to invent new languages for
  reasoning about the objects they study."

I don't know what a historian of mathematics would say about that, but
from my limited knowledge of mathematical physics, I would have
thought that a very dubious statement. "Rocket scientist" - is there
any such animal outside tabloid land?

It is a pity if this reads like a counter-attack. We are basically on
the same side and would all advocate appropriate mathematical methods.
But he does appear to attack CSP explicitly, and as far as I can
tell he sets up an Aunt Sally with no connection to modern CSP. I think
that I have satisfied myself that there is no substance in
the implied criticism, and I don't have time to delve deeper just now.
Maybe someone else with more expertise can comment?

I hope this is of some use to the rest of you. I at least have gained
a better understanding of temporal logic and TLA.

Adrian 
-- 
Adrian Lawrence.
adrian.lawrence@xxxxxxxxxxxxxx or adrian.lawrence@xxxxxxxxxxxxxxx
  MicroProcessor Unit        | Computing Laboratory,
  13, Banbury Road,          | Wolfson Building,
  Oxford. OX2 6NN.           | Parks Road, Oxford.
  UK.                        | OX1 3QD. UK.
Voice: (+44)-1865-273274,(+44)-1865-283526  Fax: (+44)-1865-273275