August 15, 2009

The Pi Calculus

Posted by John Baez

After years of suffering, I finally feel I understand the lambda calculus — a programming language so simple that it’s ideal for theoretical computer science, with beautiful connections to category theory. Now I’m ready to move on down the Greek alphabet, and ask:

What’s the pi calculus?

It’s a famous example of a process calculus. Process begins with ‘p’. That explains the ‘pi’.

But what’s a ‘process calculus’?

Well, I know that process calculi are a way of trying to formalize computations where a bunch of stuff goes on simultaneously — or as they prefer to say, ‘concurrently’. Concurrency became a big thing in computer science as soon as distributed systems like the internet became important. The all-knowing Wikipedia tells us:

In computer science, the ‘process calculi’ (or ‘process algebras’) are a diverse family of related approaches to formally modelling concurrent systems. Process calculi provide a tool for the high-level description of interactions, communications, and synchronizations between a collection of independent agents or processes. They also provide algebraic laws that allow process descriptions to be manipulated and analyzed, and permit formal reasoning about equivalences between processes (e.g., using bisimulation).

That sounds cool — and I can roughly imagine what ‘bisimulation’ means: something like two processes that are isomorphic, in that each can simulate the other. But then:

Leading examples of process calculi include CSP, CCS, ACP, and LOTOS. More recent additions to the family include the π-calculus, the ambient calculus, PEPA and the fusion calculus.

Mike Stay also likes the blue calculus.

This is a bit disheartening! Imagine you’re about to sign up for a class in calculus, and then they ask you: “So, what brand do you want to learn?” Now I know how people feel when they’re just getting started learning about $n$-categories!

In fact, if you read this review article:

you’ll see the problem is worse, in a sense, for process calculi, because people have really different ideas of precisely what these calculi should achieve. They are not all equivalent.

But I should just knuckle down and learn one: for example, the pi calculus. I can start with the Wikipedia article, or these:

They’re not bad! The pi calculus has a shortish axiomatic description, and the sources above provide intuitive explanations of what the axioms mean.

But I can’t help wanting to ‘cheat’ and find some sort of category that captures the essence of the pi calculus, in the same way that cartesian closed categories are supposed to capture the essence of the lambda calculus.

(I say “supposed to”, because this is an oversimplification! But anyway…)

Mike Stay has given me a list of papers that may help me cheat:

But it seems to me that a really good formalism for dealing with concurrent processes should have a lot in common with the applications of monoidal categories to physics. After all, doing processes in ‘series’ is just like composing morphisms, and doing them in ‘parallel’ is a bit like tensoring them.
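To make that analogy concrete, here is a toy sketch in Python: in the monoidal category of integer matrices, doing processes in ‘series’ is matrix multiplication, doing them in ‘parallel’ is the Kronecker product, and the two fit together via the interchange law. (This toy category has none of the name-passing structure of the pi calculus; it only illustrates the series/parallel pattern.)

```python
# Series composition = matrix product; parallel composition = Kronecker
# (tensor) product. A minimal sketch in the category of integer matrices.

def compose(g, f):
    """Series: g o f (first f, then g), as the matrix product g * f."""
    assert len(g[0]) == len(f), "codomain of f must match domain of g"
    return [[sum(g[i][k] * f[k][j] for k in range(len(f)))
             for j in range(len(f[0]))] for i in range(len(g))]

def tensor(f, g):
    """Parallel: the Kronecker product of f and g."""
    return [[f[i][j] * g[k][l]
             for j in range(len(f[0])) for l in range(len(g[0]))]
            for i in range(len(f)) for k in range(len(g))]

f1, g1 = [[1, 2], [3, 4]], [[0, 1], [1, 0]]
f2, g2 = [[2]], [[3]]

# Interchange law: (g1 o f1) tensor (g2 o f2) == (g1 tensor g2) o (f1 tensor f2)
lhs = tensor(compose(g1, f1), compose(g2, f2))
rhs = compose(tensor(g1, g2), tensor(f1, f2))
assert lhs == rhs
```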

In short, this stuff should be something I can understand!

Maybe someday…

Any help would be appreciated.

Posted at August 15, 2009 11:05 PM UTC


Re: The Pi Calculus

As it happened, I was reading up on concurrent computation and the pi-calculus myself, yesterday. I’ve had “The pi-calculus: A Theory of Mobile Processes” by Sangiorgi and Walker standing on my bookshelf for some years, but never started reading it, as it seemed to require a deeper understanding of concurrent computation than I possessed. In fact, in the introduction they suggest “Communicating and Mobile Systems: the pi-Calculus” by Robin Milner as a good introduction to the field of concurrent computation, and hope their book will form a good complement.

Sangiorgi and Walker also note that neither the pi-calculus, nor currently any other process calculus, forms a complete theory of mobile systems (like lambda-calculus is for sequential computation), but is only a stepping-stone on the road to a more fundamental theory.

They assume something basic underlies the various theories. It would be interesting to see if category theory could help in shedding some light on this currently unknown theory.

Posted by: Ruud Koot on August 16, 2009 4:55 AM | Permalink | Reply to this

Re: The Pi Calculus

Ruud wrote:

As it happened, I was reading up on concurrent computation and the pi-calculus myself, yesterday.

Interesting! How many people world-wide are struggling this very moment to understand this stuff?

The Wikipedia article refers to Sangiorgi and Walker’s The π-Calculus: A Theory of Mobile Processes and Milner’s Communicating and Mobile Systems: the π-Calculus. Indeed Robin Milner is the one who invented the pi-calculus, so his book seems like a natural place to start. But I need to hike down to the library to get these books… maybe I can do it on Monday.

Sangiorgi and Walker also note that neither the pi-calculus, nor currently any other process calculus, forms a complete theory of mobile systems (like lambda-calculus is for sequential computation), but is only a stepping-stone on the road to a more fundamental theory.

Oh, good! It takes an honest writer to admit when a subject hasn’t found its proper foundation. When I first started going to theoretical computer science conferences back in the mid-1990s, everyone was talking about the need for a good theory of concurrency. Sangiorgi and Walker’s book dates to around 2001. I wonder if experts feel they’ve found this fundamental theory by now, or not. It sounds like a fun challenge.

Posted by: John Baez on August 16, 2009 5:44 AM | Permalink | Reply to this

Re: The Pi Calculus

I beg to differ: the lambda calculus is not “a complete theory for sequential computation”. There are many aspects of sequential computation that are not captured by the lambda calculus. In fact, lambda calculus by itself does not provide any notion of sequentiality because it is an equational theory, and as such does not say anything at all about how things get computed (rather it just says which things are equal). Only when combined with a particular evaluation strategy to normal form and observable side effects (non-termination, I/O) can one meaningfully discuss the order of evaluation.

Posted by: Andrej Bauer on August 16, 2009 10:21 AM | Permalink | Reply to this

Re: The Pi Calculus

Andrej wrote:

I beg to differ: the lambda calculus is not “a complete theory for sequential computation”.

I agree with you here; my quoting Ruud Koot was not meant to express agreement with the bold claim that the lambda calculus is a “complete theory for sequential computation”.

In fact, lambda calculus by itself does not provide any notion of sequentiality because it is an equational theory, and as such does not say anything at all about how things get computed (rather it just says which things are equal).

Here I think you’re being a bit unfair to the lambda calculus. This calculus contains rewrite rules, and I don’t think it’s the fault of the lambda calculus that some people have chosen to interpret these rules as equations. Indeed my LICS talk emphasized that to treat the rewrite rules fairly we need, at the very least, to study the lambda calculus using cartesian closed 2-categories instead of cartesian closed categories.

Only when combined with a particular evaluation strategy to normal form and observable side effects (non-termination, I/O) can one meaningfully discuss the order of evaluation.

I would like to study evaluation strategies using the 2-categorical approach. I would also like to see what normalization by evaluation looks like from the 2-categorical viewpoint.

I don’t know much about the formal theory of ‘observable side-effects’…

Posted by: John Baez on August 16, 2009 4:35 PM | Permalink | Reply to this

Re: The Pi Calculus

I beg to differ: the lambda calculus is not “a complete theory for sequential computation”.

I agree with you here; my quoting Ruud Koot was not meant to express agreement with the bold claim that the lambda calculus is a “complete theory for sequential computation”.

One thing the lambda calculus definitely doesn’t handle: conservation of information. As a result, it’s unable to characterize reversible computing, and isn’t particularly well-suited for quantum computation.

One paper I hugely enjoyed (and in fact periodically re-read, as it keeps yielding up insights I’d missed) is Roshan P. James’ PhD thesis under Amr Sabry, “The Computational Content of Isomorphisms”.

In short, it defines a startlingly small point-free combinator calculus, termed $\Pi$, which exactly characterizes reversible circuits of finite size.
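To give a rough flavor of the point-free reversible style (my own illustrative encoding in Python, not the thesis’s actual syntax): a reversible combinator is a bijection bundled with its inverse, and sequential and parallel composition preserve invertibility by construction.

```python
# Every "program" is an Iso: a bijection packaged with its inverse,
# so composites are reversible by construction. Illustrative only.

class Iso:
    def __init__(self, fwd, bwd):
        self.fwd, self.bwd = fwd, bwd
    def __call__(self, x):
        return self.fwd(x)
    @property
    def inv(self):
        return Iso(self.bwd, self.fwd)

identity = Iso(lambda x: x, lambda x: x)
swap = Iso(lambda p: (p[1], p[0]), lambda p: (p[1], p[0]))

def seq(f, g):
    """Sequential composition: run f, then g; invert in reverse order."""
    return Iso(lambda x: g(f(x)), lambda y: f.inv(g.inv(y)))

def par(f, g):
    """Parallel composition on pairs."""
    return Iso(lambda p: (f(p[0]), g(p[1])),
               lambda p: (f.inv(p[0]), g.inv(p[1])))

# A CNOT-style combinator: flip the second bit iff the first is set.
cnot = Iso(lambda p: (p[0], p[0] ^ p[1]), lambda p: (p[0], p[0] ^ p[1]))

assert cnot((True, False)) == (True, True)
assert seq(cnot, cnot)((True, False)) == (True, False)   # self-inverse
```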

It then goes on to show that adding only isorecursive types yields $\Pi^0$, which is Turing-complete.

Finally, it shows that adding fractional and negative algebraic types (with the fractional types following meadow rules, rather than field, to avoid manifesting bottom) yields the (sub-Turing) $\Pi^{\eta\epsilon}$, with traces over the additive fragment yielding finite loops, and traces over the fractional fragment yielding something startlingly like logic programming.

Intriguingly, unless information effects are handled there’s a tendency for fractional and negative types to “collapse” into a dualizing negation operator, in the style of De Morgan’s Law.

Furthermore, one can choose to view fractional types as a one-shot communication channel - this, then, yields a fascinating tool for expressing concurrent, communicating computations. Session types, then, become fractional types with further fractions in the denominator.

One thing that rather tickles me is that if one combines $\Pi^0$ and $\Pi^{\eta\epsilon}$, one can define infinite session types, which bear a startling resemblance to generalized continued fractions. In fact, the type of dynamically-terminating ping-pong benchmarks has the same form as the golden ratio $\phi$.

There have been subsequent papers exploring the topic further, including “Superstructural Reversible Logic” (Sparks and Sabry) (which gives a sequent logic formulation that corresponds to $\Pi$) and “Computing with Semirings and Weak Rig Groupoids” (Carette and Sabry), which finds that a categorical treatment of $\Pi$ that extends it based on the coherence conditions of the underlying categorical structure yields a second-level formalism, representing isomorphisms of $\Pi$ programs.

Another paper currently being worked on in a github repository is on formalizing fractional types more thoroughly.

Posted by: Alex Elsayed on April 14, 2017 8:12 AM | Permalink | Reply to this

Re: The Pi Calculus

Since you’re familiar with some of Paul-André Melliès’s work, you might find his take on concurrency interesting. For example, a relatively recent paper with Samuel Mimram, “Asynchronous games: innocence without alternation”.

Posted by: Noam Zeilberger on August 16, 2009 5:39 AM | Permalink | Reply to this

Re: The Pi Calculus

I’ll check it out. I should also ask his opinion about these matters! Whenever we talk, the conversation tends to focus on our research project.

Posted by: John Baez on August 16, 2009 11:20 PM | Permalink | Reply to this

Re: The Pi Calculus

Hello John,

I think you should have a look at Milner’s most recent work, Bigraphs. Bigraphs are a generalization of Milner’s earlier work on CCS and the Pi-calculus, enriched with categorical ideas and a nice graphical formalism. Milner explicitly states that he hopes to investigate links with natural sciences, including physics.

A recent book offers a gentle introduction to the theory. An electronic version was previously available on Milner’s web-page; copies could still be available somewhere on the internet (Google or e-mail me for more).

Re: The Pi Calculus

Thanks, Adrien! I’ll check out bigraphs. Here’s what Milner says about them:

Bigraphs are a rigorous generic model for systems of autonomous agents that interact and move among each other, or within each other. They can be specialised to very different applications. In particular, they offer a unified platform for ubiquitous systems. They are proposed as a Ubiquitous Abstract Machine, playing the foundational role for ubiquitous computing that the von Neumann machine has played for sequential computing.

Bigraphs have evolved from process calculi, especially the calculus of Mobile Ambients (invented by L Cardelli and A Gordon) and the Pi calculus. Over-simplifying a little, the Ambient calculus models spatial reconfiguration, while the Pi calculus models reconfiguration of connectivity. Ubiquitous systems need both. Bigraphs allow agents to be structured simultaneously in these two ways (hence the ‘bi’ prefix); the two structures are independent, so that where you are does not dictate whom you talk to. This independence has led to an elegant theory.

To specialise Bigraphs to a particular family of systems, we impose a sorting discipline. This declares specific sorts of nodes and edges for the graphs, and may also impose a formation rule that limits the admissible graphs. By this means, Bigraphs have been specialised to represent existing process calculi, recovering their specific theories by means of its generic theory. Bigraphs can also be specialised to such different applications as the movement of agents (e.g. humans) in a built environment and the behaviour of cells and molecules in biology.

Through their graphical presentation, Bigraphs can make it easy for non-experts to visualise their system and assemble them geometrically. Through their algebraic representation they allow the derivation of languages for programming and simulation.

It would be nice to see a formalism that can be specialized to obtain other existing process calculi — it might speed up the process of learning all these process calculi! I also love graphical methods.

Posted by: John Baez on August 16, 2009 4:57 PM | Permalink | Reply to this

Re: The Pi Calculus

Yes, I think I’m going to like these ‘bigraphs’. Here’s a sample from a draft of Milner’s book The Space and Motion of Communicating Agents:

Posted by: John Baez on August 16, 2009 11:15 PM | Permalink | Reply to this

Re: The Pi Calculus

To get a quick feel of bigraphs if you already know symmetric monoidal closed categories (smcc): they are smc theories, almost.

Namely, they form a symmetric monoidal (pre)category which is closely related to the free smcc generated by:

• two sorts: $p$ for processes and $n$ for names,
• plus operations making:
• $p$ into a commutative monoid object,
• $n$ into a commutative comonoid object,
• plus a name generation operation $I \to n$,
plus additional operations, according to the process calculus you want to model.

For example, a signature for the $\pi$-calculus would include a send operation $n \otimes n \otimes p \to p$, and a get operation $n \otimes (n \multimap p) \to p$ (and possibly more). $\mathit{Send} (a, b, P)$, usually written $b\langle a \rangle.P$ is “send name $a$ on channel $b$ and then do $P$”. $\mathit{Get} (a, (b \mapsto P))$, usually written $a(b).P$, is “receive a name $b$ on channel $a$ and then do $P$”.
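To see how those two operations interact, here is a sketch in Python of the communication reduction $b\langle a \rangle.P \mid b(x).Q \to P \mid Q[a/x]$, with the get continuation as a Python function so that substitution is just function application. The term encoding is mine, purely illustrative.

```python
# Pi-calculus terms as tuples; the Get body is a function name -> process
# (higher-order abstract syntax), so substitution is just application.

NIL = ('nil',)

def send(chan, name, cont):     # chan<name>.cont
    return ('send', chan, name, cont)

def get(chan, body):            # chan(x).body, with body : name -> process
    return ('get', chan, body)

def par(p, q):
    return ('par', p, q)

def step(proc):
    """One communication between the two halves of a parallel term."""
    if proc[0] != 'par':
        return None
    for left, right in ((proc[1], proc[2]), (proc[2], proc[1])):
        if left[0] == 'send' and right[0] == 'get' and left[1] == right[1]:
            _, chan, name, cont = left
            return ('par', cont, right[2](name))
    return None

# b<a>.0 | b(x).x<c>.0  reduces to  0 | a<c>.0
prog = par(send('b', 'a', NIL), get('b', lambda x: send(x, 'c', NIL)))
assert step(prog) == ('par', NIL, ('send', 'a', 'c', NIL))
```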

Milner’s category of bigraphs is obtained via an ad hoc construction, which faithfully embeds into this free smcc (and is not closed).

There is a standard description of the free smcc, largely due to Todd Trimble himself, which is based on linear logic proof nets, and comes surprisingly close to Milner’s definition.

If you want to understand more, there is this paper (with Richard Garner and Aurélien Pardon): Variable binding, symmetric monoidal closed theories, and bigraphs.

But that is just bigraphical syntax: then you have to throw in reduction rules, and there you start seeing the less satisfactory aspects of bigraphs.

Posted by: Tom Hirschowitz on August 17, 2009 3:06 PM | Permalink | Reply to this

Re: The Pi Calculus

Dear Tom,

Less satisfactory in which sense? One of the really interesting aspects of the bigraphical treatment of reduction is that it starts to address the gulf between reduction-based bisimulation and labeled bisimulation. Sewell’s insight that contexts can be used as labels is key, but Leifer’s IPO construction is pretty nifty.

Best wishes,

–greg

Posted by: Greg Meredith on December 15, 2010 8:23 AM | Permalink | Reply to this

Re: The Pi Calculus

Dear Greg,

That was a duly pondered answer!

I do not remember exactly what I meant by less satisfactory, but I think it had to do with the reaction relation being only defined globally, i.e., on ground bigraphs. So, the flexibility of bigraphical syntax is only useful (w.r.t. the dynamics) for specifying parametric reaction rules, but not for reasoning locally on reductions.

Posted by: Tom Hirschowitz on December 15, 2010 8:41 AM | Permalink | Reply to this

Re: The Pi Calculus

One way to understand Pi-calculus is to compare and contrast Pi-calculus composition with categorical composition.

Categories are about things (morphisms and their attached domain and codomain) and how to compose them. If we compose

f : A -> B
with
g : B -> C
we get

(g o f) : A -> C
What’s interesting here is that the intermediate thing (B), which enabled the composition, has disappeared.

Pi-calculus is also about composition, this time of processes. A process is a thing with a finite number of connection points (like domain and codomain for morphisms, except that there can be any finite number of connection points, including 0). Let’s write P(x1, …, xn) for a process with n connection points x1, …, xn. Composing say

P(abcx)
with

Q(axyz)
gives a process
P|Q
with connection points abcxyz. In other words, the shared connection points (in this example a and x) have not disappeared and remain available for further composition.

This is also what happens when you connect your computer with Google’s by making a search query: the parallel composition of your machine with Google does not prevent others from also interacting with Google.

Of course sometimes we want to prevent further composition at some connection points. The Pi-calculus enables this by providing an explicit hiding operation (nu x): given P(abc), the process

(nu a)P
has connection points b and c only.

So in some sense this aspect of Pi-calculus can be seen as decomposing the usual functional/categorical composition into two distinct parts: the pure connection aspect and the hiding aspect, both of which are given as one basic operation in category theory.
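This contrast is easy to mimic in code. Here is a sketch in Python (the encoding is mine, purely illustrative) where a process is just a label together with its set of free connection points: parallel composition keeps shared points available, and hiding removes them as a separate operation.

```python
# A process is (label, set of free connection points). Parallel
# composition unions the interfaces (shared points stay free);
# hiding is an explicit, separate operation.

def proc(label, *points):
    return (label, frozenset(points))

def par(p, q):
    """P | Q : shared points connect but stay free for further composition."""
    return ('(%s|%s)' % (p[0], q[0]), p[1] | q[1])

def nu(x, p):
    """(nu x)P : hide connection point x from the environment."""
    return ('(nu %s)%s' % (x, p[0]), p[1] - {x})

P = proc('P', 'a', 'b', 'c', 'x')
Q = proc('Q', 'a', 'x', 'y', 'z')

PQ = par(P, Q)
assert PQ[1] == frozenset('abcxyz')    # a and x did not disappear

hidden = nu('a', nu('x', PQ))
assert hidden[1] == frozenset('bcyz')  # hiding does the "vanishing" separately
```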

A paper that looks at this generalised form of composition in some detail is Elementary structures in process theory (1) by K. Honda.

This is not the full story of the Pi-calculus, because processes have dynamics, they evolve over time, and the connection structure can change over time.

I don’t recommend the Sangiorgi, Walker book as an introduction to the field. Good as it is, it’s too full of technical detail for an introduction.

Posted by: Martin Berger on August 16, 2009 12:59 PM | Permalink | Reply to this

Re: The Pi Calculus

Thanks, Martin, for explaining the idea of processes with more ‘connection points’ than just source and target. The image I get is of a string of beads

$\bullet \to \bullet \to \bullet \to \bullet$

$\bullet \to \bullet$

I can imagine a binary ‘composition’ operation that attaches two strings of beads end to end, and also unary ‘hiding’ operations that eliminate a given bead on a string.

Such general ways of ‘gluing things together’ are nicely described by operads, and Tom Leinster’s book studies massive generalizations of category theory based on operads and monads. I don’t know if this could be helpful here.

I’ll check out Honda’s paper.

Posted by: John Baez on August 16, 2009 4:51 PM | Permalink | Reply to this

Re: The Pi Calculus

Good, you’re getting into process algebras. That should increase the chances of my understanding them. I have had some contact with systems biology people in Kent, and I know there’s some interest there. We read an introduction by Luca Cardelli. Certainly an ambitious program.

Mind you, Samson Abramsky is looking to physics rather than biology for inspiration.

Posted by: David Corfield on August 16, 2009 1:07 PM | Permalink | Reply to this

Re: The Pi Calculus

Neat. Another tantalizing subject I wish I understood.

This one, i.e. process calculus, seems particularly close to my heart because, from my very naive understanding, it seems similar in spirit to what I was trying to do with discrete calculus.

The intersecting theme I guess is physics as a process of computation.

Posted by: Eric Forgy on August 16, 2009 5:26 PM | Permalink | Reply to this

Networks of Intelligent Agents, Some of Whom Are Liars

Claude Shannon sharpened the notion of building reliable systems from unreliable components. I’ve been working for some time on the model theory of systems of intelligent agents (not a spatial model) who are all interconnected, in that any “announcement” is a public announcement that all agents can hear. The problem is how one can prove theorems about the way that agents revise their belief sets as they determine that certain other agents are certain kinds of liars. First, I have formally proved some of Raymond Smullyan’s (verbal) examples of Knights and Knaves. More importantly, what if there are conspiracies of strategic liars? There is some dispute in the Logic and Philosophy literature as to what is the proper definition of “lying and deception.” I recommend: The Definition of Lying and Deception, by James Edwin Mahon. As the Global Recession suggests, markets of “intelligent agents” behave differently if some agents are trying to defraud others. This is a process logic issue, but I’m not clear how the Pi Calculus refines the approach. My coauthor and I have been pushing for about 7 years to extend Information Theory into Disinformation Theory.

Posted by: Jonathan vos Post on August 16, 2009 11:10 PM | Permalink | Reply to this

Re: Networks of Intelligent Agents, Some of Whom Are Liars

Mingsheng Ying has been working on combining Pi-calculus with Shannon-style information theory. See for example:

There are a couple of follow-up papers. I don’t think this line of work fully solves the problem of generalising process-calculus communication to noisy channels, because of a naive treatment of the free/bound name distinction. However, it’s an interesting start.

Posted by: Martin Berger on August 18, 2009 2:16 PM | Permalink | Reply to this
Read the post Coalgebraic Modal Logic
Weblog: The n-Category Café
Excerpt: Putting together ideas of modal logic and coalgebra with ways of modelling first-order theories.
Tracked: September 7, 2009 12:50 PM
Read the post The Pi Calculus II
Weblog: The n-Category Café
Excerpt: Mike Stay begins to explain the pi calculus.
Tracked: September 8, 2009 4:52 PM

Re: The Pi Calculus

A paper on encoding knots in the $\pi$-calculus – Knots as processes: a new kind of invariant, by L. G. Meredith and David F. Snyder.

We exhibit an encoding of knots into processes in the $\pi$-calculus such that knots are ambient isotopic if and only if their encodings are weakly bisimilar.

Posted by: David Corfield on September 14, 2010 12:32 PM | Permalink | Reply to this

Re: The Pi Calculus

Do you actually understand the proofs in this paper? I want to believe their result, but I already get stuck on how the first Reidemeister move maps into bisimilar processes: the loop with the twist on it seems to admit more “stuck” states than the loop without.

Posted by: Jason Reed on December 14, 2010 3:53 PM | Permalink | Reply to this

Re: The Pi Calculus

Hey Jason!

That’s great that you’re engaging in calculations! If you’ve spotted some stuck states could you share them? If we’ve screwed something up, we’d love to know about it! ;-) Feel free to post here or directly to me (lgreg.meredith@gmail.cm) or David (davidfsnyder@gmail.com), or both!

Best wishes,

–greg

Posted by: Greg Meredith on December 15, 2010 6:57 AM | Permalink | Reply to this

Re: The Pi Calculus

Dear John,

One of the principal challenges of finding good categorical models of calculi like Milner’s π-calculus (and even to some extent the lambda calculus) is that the packaging of structure and function is quite different in these formalisms.

If you think about vector spaces, they just sit there. There’s all this algebraic structure, but nothing “going on”, so to speak, no dynamics until you add morphisms. All the dynamics are in the maps. In process calculi (and other similar systems) the dynamics are packaged *with* the algebraic structure.

Specifically, process calculi can be presented in terms of a triple (G,E,S) where G is a grammar that freely generates the terms (this is really just some combination of monads); E is a structural equivalence whacking down terms that are syntactically different but not essentially different (this is a lot like the relations part of a generators (like G) and relations presentation of an algebra); S is a set of reduction rules, i.e. the operational semantics of the calculus. The (G,E) pair is a lot like a standard presentation of some algebraic structure. The S is the odd bit. It packages up dynamics over G/E.

This new packaging has some really interesting consequences. On the one hand it provides some nice ways to reason compositionally about dynamics that modify structure – compare this to Panagaden’s questions about dynamic topologies. On the other it makes it really challenging – right at the beginning – to say what a morphism might be in a category of such widgets (G,E,S). Specifically, morphisms need to encapsulate an understanding of what is preserved under mapping. It’s difficult to say what of S is or ought to be preserved. Lack of clarity on this point has been part of the difficulty in constructing a convincing categorical view.
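Here is a sketch of the $(G,E,S)$ packaging for a toy calculus in Python (encoding mine): terms from $G$ are atoms or parallel compositions, $E$ is implemented as normalization to a sorted list (making | a commutative monoid with unit 0), and $S$ is a single annihilation rule computed on $E$-normal forms.

```python
# G: terms are atoms like 'a' or "a'", the unit '0', or ('par', [...]).
# E: structural congruence, implemented as normalization (flatten |,
#    drop the unit, forget the order).
# S: one reduction rule, a | a' -> 0, computed on E-normal forms.

def norm(term):
    """E as normalization: a canonical representative of the E-class."""
    if isinstance(term, str):
        return [] if term == '0' else [term]
    parts = []
    for t in term[1]:
        parts.extend(norm(t))       # flatten nested parallels, drop 0s
    return sorted(parts)            # commutativity: order is irrelevant

def step(term):
    """S: one communication step, an atom annihilating its co-atom."""
    parts = norm(term)
    for a in parts:
        if a + "'" in parts:
            rest = list(parts)
            rest.remove(a)
            rest.remove(a + "'")
            return ('par', rest)
    return None

t = ('par', ['a', '0', ('par', ['b', "a'"])])
assert norm(t) == ['a', "a'", 'b']  # E-class representative
assert norm(step(t)) == ['b']       # after a meets a'
```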

Another interesting point is that process calculi have been presented without a compositional theory of process calculi. A given calculus presents a compositional account of concurrent behavior. No one really has a fully convincing account of how to compose (and hence decompose) process calculi as widgets themselves. Milner’s bi-graphs were an attempt to provide a basis for this. But, it’s hard to say how successful this proposal is without a decent competitor against which to evaluate it.

It turns out that this is really a practical matter. When I was at a certain large software company in the Pacific Northwest we built a π-calculus “chip” on an FPGA. We got interested in tiling the chip across the FPGA. This led to an interesting observation/question: how would we farm work out to the different chips? This led to a proposed architecture in which we had a tree-like structure with ambient calculi at branches (because they are slightly better at farming out work than the π-calculus) and π-calculi at leaves. Such an architecture raises the question of the global dynamics of such a composite “chip”. This would be more easily addressed by a compositional theory of calculi.

Best wishes,

–greg

Posted by: Greg Meredith on December 15, 2010 7:29 AM | Permalink | Reply to this

Re: The Pi Calculus

Hello,

is there some kind of conclusion to this question yet, besides the option of using bigraphs?

Greetings,
Chris

Posted by: Chris on October 15, 2011 9:06 PM | Permalink | Reply to this

Re: The Pi Calculus

I made some headway in this later post, but I gave the wrong definition for the names monad (it wasn’t even a monad) and Mike Shulman pinned me on the interpretation of types in that model. I think I know what I did wrong, but I want to be really sure I get everything right before flubbing up in a post on the cafe again.

Posted by: Mike Stay on October 16, 2011 11:15 PM | Permalink | Reply to this

Re: The Pi Calculus

Hey,

thanks for pointing out the post to me.

Greetings,
Chris

Posted by: Chris on October 21, 2011 4:19 PM | Permalink | Reply to this

CCS: transition tree of a.P|a’.Q

Hi

I recently revised CCS, and my understanding is that the transition tree of the agent “a.P|a’.Q” is:

Tree(a.P|a’.Q) = ({a, Tree(P|a’.Q)},{aa’,Tree(P|Q)},{a’,Tree(a.P|Q)})

where aa’ (tau) is the internal action (agent a.P communicating with agent a’.Q)

I need to understand why the transition tree should not be the following:

Tree(a.P|a’.Q) = ({aa’,Tree(P|Q)})

I thought that when an agent executes the action a (send to channel a) it blocks until the action a’ (receive from channel a) is ready, and vice versa.

Kind regards
–Taoufik

Posted by: Taoufik Dachraoui on December 28, 2011 8:03 PM | Permalink | Reply to this

Re: The Pi Calculus

I found the following article, not mentioned above: Pi-Calculus in Logical Form by M. Bonsangue and A. Kurz, which ties this to coalgebras.

Abramsky’s logical formulation of domain theory is extended to encompass the domain-theoretic model for pi-calculus processes of Stark and of Fiore, Moggi and Sangiorgi. This is done by defining a logical counterpart of categorical constructions including dynamic name allocation and name exponentiation, and showing that they are dual to standard constructs in functor categories. We show that initial algebras of functors defined in terms of these constructs give rise to a logic that is sound, complete, and characterises bisimilarity. The approach is modular, and we apply it to derive a logical formulation of pi-calculus. The resulting logic is a modal calculus with primitives for input, free output and bound output.

I am only just learning about coalgebras and the pi calculus, so I have only a very superficial understanding of this…

Posted by: Henry Story on March 16, 2017 11:28 PM | Permalink | Reply to this
