Discuss: Causal Commutative Arrows
Read this paper: http://haskell.cs.yale.edu/wpcontent/uploads/2011/01/ICFPCCA.pdf
Add 2-3 comments or questions below.

What is meant by the statement in the abstract that commutative arrows "capture a kind of noninterference property of concurrent computations"? In particular, what could be an example of a computation where effects interfere with each other and how does commutativity solve this?
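One way to see the interference in question: with a state effect, running two actions in one order or the other can give different answers, so the effects do not commute. A minimal self-contained sketch (the `State` type and `thenS` sequencer below are hand-rolled stand-ins for illustration, not the paper's definitions):

```haskell
-- A minimal state "monad" kept self-contained (no mtl needed).
type State s a = s -> (a, s)

-- Read the counter, then increment it.
bump :: State Int Int
bump n = (n, n + 1)

-- Read the counter, then double it.
double :: State Int Int
double n = (n, n * 2)

-- Sequence two stateful actions, threading the state left-to-right.
thenS :: State s a -> (a -> State s b) -> State s b
thenS m k s = let (a, s') = m s in k a s'

-- Running bump before double vs. double before bump, starting from 1,
-- yields different results: the effects interfere.
ab, ba :: (Int, Int)
ab = fst ((bump `thenS` \a -> double `thenS` \b -> \s -> ((a, b), s)) 1)
ba = fst ((double `thenS` \b -> bump `thenS` \a -> \s -> ((a, b), s)) 1)
```

Here `ab` is `(1, 2)` but `ba` is `(2, 1)`. Commutativity is exactly the promise that reordering such independent computations cannot change the outcome, which is what licenses the paper's normalization.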
How do monads impose stringent linearity?

Would you please explain Figure 9? Thank you!

By causal computation, the paper means that the output depends on the current input plus previous inputs. The init operator is suggested in place of a delay operator like iPre. Not sure why these are termed stateful, though, since they seem pure?
Reference: Section 3, Causal Commutative Arrows
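A possible way to read this: over lazy streams, a causal function's output at step n depends only on inputs up to step n, and init seeds the output with an initial value. The `initS` below is a hypothetical stream analogue of the paper's arrow-based `init` (and of Yampa's `iPre`), used only to make the intuition concrete:

```haskell
-- Stream analogue of `init i`: emit i first, then the input delayed
-- by one step. Output at step n depends only on inputs before step n,
-- so the function is causal.
initS :: a -> [a] -> [a]
initS i xs = i : xs

-- "Stateful" in the sense that the current output remembers a past
-- input -- even though initS itself is a pure function.
example :: [Int]
example = take 4 (initS 0 [1, 2, 3, 4])  -- [0, 1, 2, 3]
```

So the state is the remembered input, not any mutation: the paper calls these stateful because the step-by-step behavior carries information across time, even though the denotation is pure.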

I think synchronization is very important to the structures in this paper, but synchronous languages are only briefly mentioned. How can common synchronization issues, such as deadlock and race conditions, be solved?

This paper relies a lot on an understanding of arrows, which I do not really have. Can we go over some more concrete examples of how arrows can be used? Even the simple example at the beginning looked like Greek to me.
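For what it's worth, the plain function type `(->)` is itself an Arrow instance in `Control.Arrow` (part of base), which makes the combinators easy to read concretely. A small sketch (`addOne` and `pipeline` are made-up names for illustration):

```haskell
import Control.Arrow

-- For the function arrow: arr lifts a pure function, (>>>) composes
-- left-to-right, and first applies a function to the first component
-- of a pair while passing the second through unchanged.
addOne :: Int -> Int
addOne = (+ 1)

pipeline :: (Int, String) -> (Int, String)
pipeline = first addOne >>> arr (\(n, s) -> (n * 2, s ++ "!"))

-- pipeline (3, "hi") == (8, "hi!")
```

The same combinators drive the paper's signal-function arrows; only the arrow instance changes, which is the point of programming against the Arrow interface.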

Are the Arrow Loop Laws (Figure 3) enforced by their optimizer/compiler?

I got a bit confused by all the various passes that are done on the test code. How does each of these work, and how does the experimental procedure account for their effects?

Are there cases where the optimizations they perform might make the resulting code slower instead of faster?

Why aren't monads and applicative functors general enough?
What is stream fusion?

Could you explain "second f = arr swap >>> first f >>> arr swap where swap (a, b) = (b, a)", shown under Figure 1?
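Not an official answer, but specializing that definition to the function arrow may help: `first f` applies f to the first pair component, so to act on the second component the definition swaps the pair, runs f on the (now) first slot, and swaps back. A sketch, renamed `mySecond` to avoid clashing with the `second` already exported by `Control.Arrow`:

```haskell
import Control.Arrow

-- Figure 1's definition, specialized to plain functions:
-- swap the pair, apply f to the first component, swap back.
mySecond :: (b -> c) -> (a, b) -> (a, c)
mySecond f = arr swap >>> first f >>> arr swap
  where swap (x, y) = (y, x)

-- mySecond (+ 1) ("k", 10) == ("k", 11)
```

This is why the paper can take `first` as primitive: `second` (and from those, `(***)` and `(&&&)`) are all derivable from it with `arr` and composition.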
