
Talk:Denotational semantics



Actors, concurrency etc

Hi. I have moved the material giving details on concurrency and actor semantics back to the page on Denotational semantics of the Actor model. This page has to give a high level overview of the subject. Actors, concurrency and interaction get a fair mention as it is, and there are links for those readers who want to know more. No offense intended to the editor who moved the material here. Sam (talk) 20:43, 9 June 2009 (UTC)[reply]

As you should know from your previous work, sequential and functional denotational semantics are special cases that are interesting mainly in how they illustrate and motivate the general case.75.10.249.77 (talk) 03:18, 11 June 2009 (UTC)[reply]

Current article is a mess

The current article is a mess. It treats all kinds of special cases in a disorganized fashion and doesn't properly develop and motivate the general case.67.180.94.147 (talk) 16:57, 13 June 2009 (UTC)[reply]

The problem seems to have been resolved.70.231.253.115 (talk) 19:13, 21 June 2009 (UTC)[reply]
Perhaps unfortunately, the current state of the subject is a little disorganized, so this is unavoidable. The idea that the Actor Model is the "general case" is a valid position but it is not universally held. Sam (talk) 18:16, 2 September 2009 (UTC)[reply]
Unfortunately, Sam has imposed his very narrow notion of compositionality in terms of the composition of continuous functions. This is entirely insufficient in general. Someone should restore the more general case and include Sam's new material as a special case.71.198.220.76 (talk) 19:45, 2 September 2009 (UTC)[reply]
As I have written, the approach is concerned with composition of morphisms in a category, and has little to do with continuous functions per se. This is the general approach to denotational semantics that is taken by Abramsky, Girard, Gunter, Hennessy, Plotkin, Stoy, Tennent, Winskel, and many others. It is the style of denotational semantics that is used in logic, especially categorical logic, see for instance the books by Lambek and Scott, Jacobs, Johnstone. You might be interested to note that Hildebrandt uses this style in his semantics of fairness. It is certainly not a narrow notion, nor a special case. Sam (talk) 19:33, 3 September 2009 (UTC)[reply]
I wonder whether your ideas might actually fit into this framework... I will think about it. Sam (talk) 19:39, 3 September 2009 (UTC)[reply]
The undue emphasis on the actor model comes from the days when Carl Hewitt was editing here. Feel free to give it proper WP:WEIGHT. The "further reading" list could use some filtering as well. You are certainly more knowledgeable than me in this area. Pcap ping 23:54, 6 September 2009 (UTC)[reply]

Compositionality

The article mentions that the denotations of programs should be compositional in terms of their subphrases but doesn't explain how this is accomplished.67.180.94.147 (talk) 17:09, 13 June 2009 (UTC)[reply]

Compositionality previously appeared in the article as follows:
Compositionality in programming languages
An important aspect of denotational semantics of programming languages is compositionality, by which the denotation of a program is constructed from denotations of its parts. For example, consider the expression "<expression1> + <expression2>". Compositionality in this case means providing a meaning for "<expression1> + <expression2>" in terms of the meanings of <expression1> and <expression2>.
Scott and Strachey [1971] proposed that the semantics of programming languages be reduced to the semantics of the lambda calculus and thus inherit the denotational semantics of the lambda calculus. However, it turned out that general computation could not be implemented in the lambda calculus (see Indeterminacy in concurrent computation). Thus there arose the problem of how to provide modular denotational semantics for concurrent programming languages. One solution to this problem is to use Eval messages with an environment so that programs obtain their denotational semantics from the methods explained earlier in this article.
Environments
Environments hold the bindings of identifiers. When an environment is sent a Lookup message for an identifier x, it returns the latest (lexical) binding of x.
As an example of how this works, consider the lambda expression <L> below, which implements a tree data structure when supplied with parameters for a leftSubTree and rightSubTree. When such a tree is given the message "getLeft", it returns leftSubTree, and likewise when given the message "getRight" it returns rightSubTree.
                 λ(leftSubTree, rightSubTree)
                     λ(message)
                            if (message == "getLeft") then leftSubTree
                            else if (message == "getRight") then rightSubTree
Consider what happens when an expression of the form "(<L> 1 2)" is sent an Eval message with environment E. One semantics for application expressions such as this one is the following: <L>, 1 and 2 are each sent Eval messages with environment E. The integers 1 and 2 immediately reply to the Eval message with themselves.
However, <L> responds to the Eval message by creating a closure C that has a body <L> and environment E. "(<L> 1 2)" then sends C the message [1 2].
When C receives the message [1 2], it creates a new environment F which behaves as follows:
  1. When it receives a Lookup message for the identifier leftSubTree, it responds with 1
  2. When it receives a Lookup message for the identifier rightSubTree, it responds with 2
  3. When it receives a Lookup message for any other identifier, it forwards the Lookup message to E
C then sends an Eval message with environment F to the following:
                 λ(message)
                       if (message == "getLeft") then leftSubTree
                       else if (message == "getRight") then rightSubTree
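For illustration, the behaviour of F can be sketched as follows. This is a minimal Haskell rendering with names of my own choosing (Env, extend, lookupIdent); the article's account uses message passing rather than functions, so the sketch only captures the lookup discipline: local bindings answer first, and everything else is forwarded to the outer environment E.

  type Ident = String

  -- An environment answers Lookup requests for identifiers.
  newtype Env = Env { lookupIdent :: Ident -> Maybe Integer }

  -- The environment F built when the closure receives [1 2]:
  -- it holds the new bindings and forwards any other Lookup to the outer E.
  extend :: Env -> [(Ident, Integer)] -> Env
  extend outer binds = Env $ \x ->
    case lookup x binds of
      Just v  -> Just v                 -- rules 1 and 2: local bindings answer
      Nothing -> lookupIdent outer x    -- rule 3: forward the Lookup to E

  -- F for the example above, given E:
  exampleF :: Env -> Env
  exampleF e = extend e [("leftSubTree", 1), ("rightSubTree", 2)]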
Arithmetic expressions
For another example, consider the expression "<expression1> + <expression2>", which has the subexpressions <expression1> and <expression2>. When the composite expression receives an Eval message with an environment E and a customer C, it sends Eval messages to <expression1> and <expression2> with environment E and a newly created customer C0. When C0 has received back the two values N1 and N2, it sends C the value N1 + N2.
Other programming language constructs
The denotational compositional semantics presented above is very general and can be used for functional, imperative, concurrent, logic, etc. programs.[1] For example, it easily provides denotational semantics for constructs, such as delays and futures, that are difficult to formalize using other approaches.
205.158.58.226 (talk) 22:49, 14 June 2009 (UTC)[reply]
Good point. I've had a go at explaining the usual approach to compositionality. This is quite general and might even include your ideas to a certain extent. Sam (talk) 18:16, 2 September 2009 (UTC)[reply]

Progressivity

Progressivity is different from constructivism. Constructivism is concerned with whether domain elements can be shown to exist by constructive methods. Progressivity[2] is concerned with the time progression of systems.67.180.94.147 (talk) 17:25, 13 June 2009 (UTC)[reply]

  1. ^ Carl Hewitt. Common sense for concurrency and strong paraconsistency using unstratified inference and reflection. arXiv:0812.4852. 2008.
  2. ^ "What is Commitment? Physical, Organizational, and Social" LNAI 4386. Springer-Verlag. 2007.


Progressivity previously appeared in the article as follows:
  1. Progressivity:[1] For each system S, there is a progressionS function for the semantics such that the denotation (meaning) of a system S is ⊔i∈ω progressionSⁱ(⊥S), where ⊥S is the initial configuration of S.
205.158.58.226 (talk) 22:37, 14 June 2009 (UTC)[reply]
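As an illustration of the formula, here is a minimal Haskell sketch (my own rendering; the least-upper-bound operation lub is assumed to be supplied for the particular domain of configurations): the denotation is the lub of the ω-chain obtained by iterating the progression function from the initial configuration.

  -- Kleene-style iteration: build the omega-chain and take its least upper bound.
  denotation :: ([a] -> a)  -- assumed least-upper-bound of an omega-chain
             -> (a -> a)    -- the progression function for the system S
             -> a           -- the initial configuration of S
             -> a
  denotation lub progressionS initial = lub (iterate progressionS initial)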

Denotational Semantics of Exceptions (was Division by Zero)

User 162.135.0.12 added statements about division by zero being a problem for compositionality in denotational semantics. It is true that division by zero has to be handled carefully in computer science. As far as I am aware, though, this is not generally regarded as one of the most serious problems for denotational semantics. I think that mentioning it here gives it undue weight. If I am mistaken, please provide a reference to a textbook. Sam (talk) 07:58, 15 September 2009 (UTC)[reply]

Sam, Unfortunately, you are dead wrong. Exceptions are crucial to Software Engineering. You might want to take a tutorial on Software Engineering. --70.132.17.91 (talk) 02:11, 16 September 2009 (UTC)[reply]
Of course, I know that exceptions are extremely important in programming and software engineering! I am just not sure that they have historically played a crucial and fundamental role in denotational semantics. (We can handle exceptions with the exceptions monad, for instance.) Sam (talk) 07:11, 16 September 2009 (UTC)[reply]
Sam: Unfortunately, use of the exceptions monad forces sequential execution, which is unacceptable. You are mistaken in your assertion that the functional approach to compositionality of programs works in general.70.132.17.91 (talk) 22:24, 16 September 2009 (UTC)[reply]
Yes, I know that in the concurrent context, exceptions are difficult to get right in a language design. I didn't mean that the simple exceptions monad was a universal solution! However, I think the page is at a stage that reflects the current status of textbooks, undergraduate/graduate courses etc. and in my view this is what is needed in a wikipedia page (regardless of how non-general this might be). There is definitely some academic research still to be done, though. Can we continue this discussion somewhere else, more appropriate? eg by email or on your blog? I would very much like to understand your objections to the conventional notions of compositionality, and I don't understand why you are referring to this as "functional". I would also be interested in discussing semantics for ActorScript. Cheers, Sam (talk) 18:57, 17 September 2009 (UTC)[reply]
If you knew that the exceptions monad doesn't work, why did you suggest it? There is no excuse for Wikipedia to have obsolete incorrect information. You might contact Professor Hewitt with your questions about ActorScript. 70.132.17.91 (talk) 22:18, 17 September 2009 (UTC)[reply]
Uh? The exceptions monad is not obsolete or incorrect. Sam (talk) 10:24, 18 September 2009 (UTC)[reply]
Sam,
Unfortunately, you are wrong again. The exceptions monad is not faithful because it forces sequential execution. For example, using the exceptions monad for an expression of the form f(E1, ..., En) forces sequential execution of E1, ..., En. 12.234.41.239 (talk) 18:07, 18 September 2009 (UTC)[reply]
But that is exactly the behaviour specified by modern mainstream languages such as Java or C#. Sure, some languages are deliberately nondeterministic (often for rather dubious reasons), but that's a different issue. Hqb (talk) 18:55, 18 September 2009 (UTC)[reply]
Unfortunately, the above opinion has been rendered totally obsolete because of the advent of many-core architecture. Also, it is very important not to confuse concurrency with nondeterministic computation. 12.234.41.239 (talk) 19:12, 18 September 2009 (UTC)[reply]
I'm not sure what "opinion" you refer to there. It's a fact that Java and C# explicitly require left-to-right evaluation (or do you dispute that?), and I wouldn't expect that aspect of their specification to change in the foreseeable future. (Again, if you have well-sourced information to the contrary, please cite.) The point is, as long as exception monads faithfully model the semantics of today's most popular languages, it's hard to dismiss them as "obsolete" without getting into crystal ball territory. Hqb (talk) 19:46, 18 September 2009 (UTC)[reply]

It is widely recognized that the many-core revolution has rendered Java and C# obsolete. So a new generation of programming languages (e.g. ActorScript) has been developed that integrates local and nonlocal computation for client-cloud computing.12.234.41.239 (talk)

Monads in general are not rendered obsolete by the ever-increasing ubiquity of parallel computing. To the contrary, GHC can leverage the provable state separation between different monads to automatically parallelize execution; see for instance [1]. There are quite a few proposals in this area. An overview of the more practical ones is here. So I'm not sure what Carl Hewitt's point is. Do you want exceptions to run concurrently with the code/thread that generated them? What would be the purpose of that? Pcap ping 00:46, 19 September 2009 (UTC)[reply]
The consensus is that a procedure invocation should evaluate its arguments concurrently. In ActorScript notation, this means that f(E1, ..., En) is equivalent to f <- [E1, ..., En]. Unfortunately, monads are obsolete.71.198.220.76 (talk) 18:12, 20 September 2009 (UTC)[reply]
"The consensus is that a procedure invocation should evaluate its arguments concurrently". Probably in true you research group, but I strongly doubt such universal consensus exists even amongst PLT researchers. You can easily prove me wrong: start a thread on LtU on this. Pcap ping 18:44, 20 September 2009 (UTC)[reply]
In this day and age, there is no realistic alternative to concurrent execution. 71.198.220.76 (talk) 20:17, 21 September 2009 (UTC)[reply]
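For what it is worth, the GHC-style parallelism mentioned above can be sketched as follows (the Eval monad and rpar/rseq come from Control.Parallel.Strategies; fib and parallelCall are illustrative names of my own). The sketch shows one deterministic reading of "evaluate the arguments concurrently": one pure argument is sparked with rpar while the other is evaluated, and the result is the same as for sequential evaluation.

  import Control.Parallel.Strategies (runEval, rpar, rseq)

  -- An arbitrarily expensive pure argument, just for illustration.
  fib :: Integer -> Integer
  fib n | n < 2     = n
        | otherwise = fib (n - 1) + fib (n - 2)

  -- Evaluate the two arguments in parallel before combining them.
  -- (Compile with -threaded and run with +RTS -N to use several cores.)
  parallelCall :: Integer -> Integer -> Integer
  parallelCall x y = runEval $ do
    a <- rpar (fib x)   -- spark evaluation of the first argument
    b <- rseq (fib y)   -- evaluate the second argument on this thread
    _ <- rseq a         -- wait for the sparked evaluation to finish
    return (a + b)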

Material in "Compositionality in Programming Language" removed by Sam Staton

An important aspect of denotational semantics of programming languages is compositionality, by which the denotation of a program is constructed from denotations of its parts. For example, consider the expression "8 / 4". Compositionality in this case means providing a meaning for "8 / 4" in terms of the meanings of "8", "4" and "/".

Functional approach

A functional approach to denotational semantics is compositional because it is given as follows. We start by considering program fragments, i.e. programs with free variables. A typing context assigns a type to each free variable. For instance, the expression (x / y) might be considered in a typing context (x:nat, y:nat). We now give a denotational semantics to program fragments, using the following scheme.

  1. We begin by describing the meaning of the types of our language: the meaning of each type must be a domain. We write 〚τ〛 for the domain denoting the type τ. For instance, the meaning of type nat should be the domain of natural numbers: 〚nat〛=ℕ.
  2. From the meaning of types we derive a meaning for typing contexts. We set 〚x1:τ1, …, xn:τn〛 = 〚τ1〛 × … × 〚τn〛. For instance, 〚x:nat, y:nat〛 = ℕ×ℕ. As a special case, the meaning of the empty typing context, with no variables, is the domain with one element, denoted 1.
  3. Finally, we must give a meaning to each program-fragment-in-typing-context. Suppose that P is a program fragment of type σ, in typing context Γ, often written Γ⊢P:σ. Then the meaning of this program-in-typing-context must be a continuous function 〚Γ⊢P:σ〛:〚Γ〛→〚σ〛. For instance, 〚⊢8:nat〛:1→ℕ is the constantly "8" function, while 〚x:nat, y:nat ⊢ x/y:nat〛:ℕ×ℕ→ℕ is the function that divides two numbers.

Now, the meaning of the compound expression (8/4) is determined by composing the three functions 〚⊢8:nat〛:1→ℕ, 〚⊢4:nat〛:1→ℕ, and 〚x:nat, y:nat ⊢ x/y:nat〛:ℕ×ℕ→ℕ.
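For readers who prefer code, here is a minimal Haskell sketch of this scheme (the names denoteEight, denoteFour and denoteDiv are mine, and Haskell functions stand in for continuous functions): closed fragments denote functions out of the one-element domain, the open fragment x/y denotes a function out of ℕ×ℕ, and the meaning of 8/4 is obtained purely by composing these denotations.

  type Nat = Integer

  denoteEight :: () -> Nat          -- 〚 ⊢ 8 : nat 〛 : 1 → ℕ
  denoteEight () = 8

  denoteFour :: () -> Nat           -- 〚 ⊢ 4 : nat 〛 : 1 → ℕ
  denoteFour () = 4

  denoteDiv :: (Nat, Nat) -> Nat    -- 〚 x:nat, y:nat ⊢ x/y : nat 〛 : ℕ×ℕ → ℕ
  denoteDiv (x, y) = x `div` y      -- (division by zero is ignored for now)

  -- 〚 ⊢ 8/4 : nat 〛 : 1 → ℕ, built only from the denotations of the parts.
  denoteEightDivFour :: () -> Nat
  denoteEightDivFour u = denoteDiv (denoteEight u, denoteFour u)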

There is nothing specific about domains and continuous functions here. One can work with a different category instead. For example, in game semantics, the category of games has games as objects and strategies as morphisms: we can interpret types as games, and programs as strategies. For a simple language without recursion, we can make do with the category of sets and functions. For a language with side-effects, we can work in the Kleisli category for a monad. For a language with state, we can work in a functor category.

Problems with the functional approach to compositionality in programming languages

Unfortunately, the above is not a general scheme for compositional denotational semantics that is adequate for modern software engineering. An initial difficulty comes with expressions like 8/0. One early approach was to designate a special element of the domain as an "error element" and have 8/0 denote the error element. However, the error element approach is unsatisfactory for software engineering. One problem is that it becomes necessary for programs to continually test for the error element. Another problem is that the error element approach does not automatically propagate exceptions upward if they are not caught.
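To illustrate the complaint, here is a minimal sketch of the error-element style (Haskell, with Nothing standing in for the error element adjoined to the natural numbers; divD and addD are names of my own): every compound operation has to inspect its sub-results and pass the error element along by hand.

  -- The domain of naturals extended with an error element.
  type NatErr = Maybe Integer

  divD :: NatErr -> NatErr -> NatErr
  divD (Just x) (Just y)
    | y /= 0    = Just (x `div` y)
    | otherwise = Nothing           -- 8/0 denotes the error element
  divD _ _      = Nothing           -- explicit test to pass the error upward

  addD :: NatErr -> NatErr -> NatErr
  addD (Just x) (Just y) = Just (x + y)
  addD _        _        = Nothing  -- another manual check at every operator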

An even more serious problem with the above functional approach is that it does not encompass communication and change. Consider the following program in ActorScript:

SimpleCell ≡
serializer
contents
implements Cell
Read< > → return contents
Write<newContents> → return also become(contents=newContents)

What is the denotation of the expression "return also become(contents=newContents)" in the above program?
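Purely as an operational illustration of the cell's protocol (not a denotational semantics, and a Haskell rendering of my own rather than ActorScript), the behaviour might be sketched as follows; the MVar plays the role of the serializer, handling one message at a time.

  import Control.Concurrent.MVar (newMVar, readMVar, modifyMVar_)

  -- Read returns the current contents; Write installs new contents,
  -- i.e. the cell "becomes" a cell holding newContents.
  data Cell a = Cell { readCell :: IO a, writeCell :: a -> IO () }

  simpleCell :: a -> IO (Cell a)
  simpleCell initial = do
    contents <- newMVar initial
    return Cell
      { readCell  = readMVar contents
      , writeCell = \newContents -> modifyMVar_ contents (\_ -> return newContents)
      }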

Scott and Strachey [1971] proposed that the semantics of programming languages be reduced to the semantics of the lambda calculus and thus inherit the denotational semantics of the lambda calculus. However, it turned out that general computation could not be implemented in the lambda calculus (see Indeterminacy in concurrent computation). Thus there arose the problem of how to provide modular denotational semantics for general programming languages. One solution to this problem is to use Eval messages with an environment so that programs obtain their denotational semantics from the methods explained earlier in this article.

General compositionality in programming languages

The general approach to compositionality in programming languages solves the above problems using environments and customers, as explained below.

Environments

Environments hold the bindings of identifiers. When an environment is sent a Lookup message for an identifier x, it returns the latest (lexical) binding of x.

As an example of how this works, consider the lambda expression <L> below, which implements a tree data structure when supplied with parameters for a leftSubTree and rightSubTree. When such a tree is given the message "getLeft", it returns leftSubTree, and likewise when given the message "getRight" it returns rightSubTree.

 λ(leftSubTree, rightSubTree)
   λ(message)
     if (message == "getLeft") then leftSubTree
     else if (message == "getRight") then rightSubTree

Consider what happens when an expression of the form "(<L> 1 2)" is sent an Eval message with environment E. One semantics for application expressions such as this one is the following: <L>, 1 and 2 are each sent Eval messages with environment E. The integers 1 and 2 immediately reply to the Eval message with themselves.

However, <L> responds to the Eval message by creating a closure C that has a body <L> and environment E. "(<L> 1 2)" then sends C the message [1 2].

When C receives the message [1 2], it creates a new environment F which behaves as follows:

  1. When it receives a Lookup message for the identifier leftSubTree, it responds with 1
  2. When it receives a Lookup message for the identifier rightSubTree, it responds with 2
  3. When it receives a Lookup message for any other identifier, it forwards the Lookup message to E

C then sends an Eval message with environment F to the following:

   λ(message)
     if (message == "getLeft") then leftSubTree
     else if (message == "getRight") then rightSubTree
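As a concrete, deliberately simplified illustration, the Eval/environment/closure story above can be sketched in Haskell roughly as follows (Expr, Value and eval are names of my own, customers are omitted, and a plain association list stands in for the environment actor): numbers reply with themselves, a lambda becomes a closure over the current environment E, and applying a closure builds the extended environment F before evaluating the body.

  data Expr  = Num Integer | Var String | Lam [String] Expr | App Expr [Expr]
  data Value = IntV Integer | Closure [String] Expr Env
  type Env   = [(String, Value)]

  eval :: Env -> Expr -> Value
  eval _ (Num n)    = IntV n                       -- 1 and 2 reply with themselves
  eval e (Var x)    = maybe (error ("unbound " ++ x)) id (lookup x e)
  eval e (Lam xs b) = Closure xs b e               -- <L> becomes a closure C over E
  eval e (App f as) =
    case eval e f of
      Closure xs body outer ->
        let vs = map (eval e) as                   -- Eval each argument with E
            f' = zip xs vs ++ outer                -- build F: new bindings, then fall back to E
        in eval f' body                            -- Eval the body with environment F
      _ -> error "application of a non-closure"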

Arithmetic expressions

For another example, consider the expression "<expression1> / <expression2>", which has the subexpressions <expression1> and <expression2>. When the composite expression receives an Eval message with an environment E and a customer C, it sends Eval messages to <expression1> and <expression2> with environment E and a newly created customer C0. When C0 has received back the two values N1 and N2, it sends "/" the message [N1 N2] with customer C to perform the division.
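A sketch of this customer discipline in Haskell (my own rendering, not ActorScript): a customer is modelled as a continuation, the two Eval requests are forked so that they may proceed concurrently, and the joining customer C0 waits for both values before sending the result of the division to C.

  import Control.Concurrent (forkIO)
  import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

  type Customer a = a -> IO ()

  evalDiv :: (Customer Integer -> IO ())   -- evaluation of <expression1> in E
          -> (Customer Integer -> IO ())   -- evaluation of <expression2> in E
          -> Customer Integer              -- the customer C
          -> IO ()
  evalDiv eval1 eval2 c = do
    m1 <- newEmptyMVar
    m2 <- newEmptyMVar
    _  <- forkIO (eval1 (putMVar m1))   -- Eval <expression1> with customer "reply into m1"
    _  <- forkIO (eval2 (putMVar m2))   -- Eval <expression2> with customer "reply into m2"
    n1 <- takeMVar m1                   -- C0 collects N1 ...
    n2 <- takeMVar m2                   -- ... and N2
    c (n1 `div` n2)                     -- ... then sends the result of "/" to C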

Other programming language constructs

The denotational compositional semantics presented above is very general and can be used for functional, imperative, concurrent, logic, etc. programs.[2] For example, it easily provides denotational semantics for constructs, such as delays and futures, that are difficult to formalize using other approaches.

  1. ^ Cite error: The named reference Hewitt06 was invoked but never defined (see the help page).
  2. ^ Carl Hewitt. Common sense for concurrency and strong paraconsistency using unstratified inference and reflection. arXiv:0812.4852. 2008.

76.254.235.105 (talk) 02:59, 16 September 2009 (UTC) [reply]

Reply to Carl

Hi, as you know, wikipedia has to report the generally accepted state of the art, regardless of what the most general case or the very best approach might be. This means that we have to report what is in the textbooks and the university courses, and in the style that is usually reported at scientific conferences, rather than what is in your very recent preprints. We can discuss how things should be at a different forum -- you can come to visit Cambridge if you like. Anyway, that is why I'm about to put the page back to how it was before.

Let me just point out that the usual way of handling compositionality, which is used by almost all researchers in denotational semantics, can be made so general as to be almost vacuous. I agree that classical domain theory doesn't work for everything, that is well known, but the usual approach to compositional semantics has actually nothing to do with functions or domains. Whenever you have a compositional denotational semantics, you can fit it into this framework, as follows.

Consider an untyped language for simplicity. Then build a category: objects are natural numbers. A morphism m -> n is an n-tuple of programs that have m free variables. We now equate two morphisms m -> n if the corresponding programs have the same denotation (so a morphism can be thought of as a denotation). The composition of two morphisms is built by substituting programs for variables.

This all makes sense provided the semantics respect substitution, roughly:

  • if 〚 x ⊢ P 〛 = 〚 x ⊢ P' 〛 and 〚 y ⊢ Q 〛 = 〚 y ⊢ Q' 〛 then 〚 y ⊢ P[Q/x] 〛 = 〚 y ⊢ P'[Q'/x] 〛.

Do you see what I mean? Sam (talk) 20:45, 16 October 2009 (UTC)[reply]
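For concreteness, the construction can be spelled out as follows (my own notation, a sketch rather than a precise definition): identities are tuples of variables and composition is simultaneous substitution, and the substitution condition above is exactly what makes composition well defined on denotations.

  \[
    \mathrm{id}_m = (x_1,\ldots,x_m) : m \to m,
  \]
  \[
    (Q_1,\ldots,Q_k) \circ (P_1,\ldots,P_n)
      = \bigl(Q_1[P_1/x_1,\ldots,P_n/x_n],\ \ldots,\ Q_k[P_1/x_1,\ldots,P_n/x_n]\bigr) : m \to k,
  \]
  where $(P_1,\ldots,P_n) : m \to n$ and $(Q_1,\ldots,Q_k) : n \to k$.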