An analogy for multiplicative disjunction
Listen in while I have an actual insight on air! After a week of head scratching, I figure out -- while chattering! -- what I hope is both a correct and intuitive example of multiplicative disjunction.
I explain the basic idea of multiplicative versus additive proof rules, and consider multiplicative conjunction (tensor), additive conjunction (&, "with"), and additive disjunction -- leaving multiplicative disjunction (par) for next time!
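As a sketch of the distinction, the right rules for the two conjunctions differ in how they treat the context: the multiplicative rule splits the assumptions between the premises, while the additive rule shares them.

```latex
% Multiplicative conjunction (tensor) splits the context:
\frac{\Gamma \vdash A \qquad \Delta \vdash B}{\Gamma, \Delta \vdash A \otimes B}\;\otimes R
\qquad
% Additive conjunction ("with") shares the context:
\frac{\Gamma \vdash A \qquad \Gamma \vdash B}{\Gamma \vdash A \,\&\, B}\;\& R
```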
We discuss briefly the central ideas of linear logic, where by default assumptions must be used exactly once.
In this episode I discuss the basic structural rules of weakening, contraction, and exchange, and speculate on their dark origin.
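For reference, the three structural rules can be sketched in sequent notation, shown here acting on the assumptions to the left of the turnstile:

```latex
\frac{\Gamma \vdash C}{\Gamma, A \vdash C}\;\text{weakening}
\qquad
\frac{\Gamma, A, A \vdash C}{\Gamma, A \vdash C}\;\text{contraction}
\qquad
\frac{\Gamma, A, B, \Gamma' \vdash C}{\Gamma, B, A, \Gamma' \vdash C}\;\text{exchange}
```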
Cut elimination for sequent calculus is more involved than normalization of detours for natural deduction: more cases of cut must be transformed than there are kinds of detours (introductions immediately followed by eliminations). In this episode, I explain why that is.
We saw in the last few episodes that proofs in natural deduction can be simplified by removing detours, which occur when an introduction inference is immediately followed by an elimination inference on the introduced formula. What corresponds to this for sequent calculus proofs? The answer is cut elimination. This episode describes the cut rule and what is meant by a cut-elimination procedure. We will talk more about such a procedure in the next episode.
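As a sketch in two-sided sequent notation, the cut rule combines a proof of A with a proof that uses A as an assumption:

```latex
\frac{\Gamma \Rightarrow \Delta, A \qquad A, \Gamma' \Rightarrow \Delta'}{\Gamma, \Gamma' \Rightarrow \Delta, \Delta'}\;\text{cut}
```

A cut-elimination procedure transforms any proof using this rule into one that is cut-free.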
We talk about normalizing detours -- which occur when an introduction inference is immediately followed by an elimination inference -- for the implication rules. Under Curry-Howard, this corresponds exactly to beta-reduction, and it can make the proof bigger (though less complex in a certain sense).
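Seen through Curry-Howard, the implication detour is a beta-redex: the introduction builds a lambda abstraction, the elimination applies it, and removing the detour is substitution:

```latex
(\lambda x.\, M)\; N \;\longrightarrow_{\beta}\; M[N/x]
```

Since N is copied into every occurrence of x in M, the contracted proof can be larger than the original, which is why normalization can grow proofs.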
This episode explains the idea of normalization of proofs in natural deduction. We want to eliminate so-called detours in proofs, which occur when an introduction is immediately followed by an elimination.
Sequent calculus is a different style in which proof systems can be formulated: for each connective, we have a left rule introducing it (in the conclusion of the rule) on the left side of a sequent G => D (i.e., in G), and similarly a right rule introducing it on the right side (in D). The beauty of sequent calculus is that disjunction is handled without any departure from the general form of sequent calculus rules (unlike disjunction in natural deduction, as we discussed last time).
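As a sketch, here are the left rule and one of the right rules for disjunction in a two-sided sequent calculus (the right rule has a symmetric variant introducing B):

```latex
\frac{\Gamma, A \Rightarrow \Delta \qquad \Gamma, B \Rightarrow \Delta}{\Gamma, A \vee B \Rightarrow \Delta}\;\vee L
\qquad
\frac{\Gamma \Rightarrow \Delta, A}{\Gamma \Rightarrow \Delta, A \vee B}\;\vee R_1
```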
We discuss the problem with the or-elimination rule in natural deduction: it does not fit the general form of natural deduction inferences. Fixing this is a research problem!
Don't forget to get in touch with me if you want to do the mini-course over Zoom next month, on normalization.
We discuss further inferences in natural deduction, in particular implication introduction and elimination.
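The two implication rules can be sketched as follows: the introduction discharges a hypothesis [A], and the elimination is modus ponens.

```latex
\frac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}{A \to B}\;\to I
\qquad
\frac{A \to B \qquad A}{B}\;\to E
```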
This episode begins the discussion of the style of proof known as Natural Deduction, invented by Gerhard Gentzen, a student of Hermann Weyl, himself a student of David Hilbert (sorry, I said incorrectly that Gentzen was Hilbert's own student). Each logical connective (like OR, AND, IMPLIES, etc.) has introduction rules that let you prove formulas built with that connective; and elimination rules that let you deduce consequences from a proven formula built with that connective.
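Conjunction gives the simplest sketch of the introduction/elimination pattern: the introduction builds a formula with AND, and the two eliminations extract its components.

```latex
\frac{A \qquad B}{A \wedge B}\;\wedge I
\qquad
\frac{A \wedge B}{A}\;\wedge E_1
\qquad
\frac{A \wedge B}{B}\;\wedge E_2
```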
We continue our gradual entry into proof theory by talking about reflecting meta-logical reasoning into logical rules, and naming the three basic proof systems (Hilbert-style, natural deduction, and sequent calculus).
Advertising for the October 3-session Zoom mini-course on normalization continues. Email me if you are interested! This is just for ITTC listeners.
Finally, if you like the podcast and want to support it, would you consider a small ($5 or $10 is great) donation? This is not so much to cover the $12/month hosting fee at Buzzsprout but to get some kudos here at my institution, so people here know that I have a podcast [grin] and that listeners like it. To donate, go here, then choose "Colleges", then "College of Liberal Arts and Sciences", and then tick "Computer Science Development Fund". For "gift details", please leave "gift instructions" that say the gift is for Aaron Stump for the Iowa Type Theory Commute podcast. Then the money will get to the right place for me to use it for this. Thanks a lot in advance!
I highlight two basic points in this continuing warm-up to proof theory: there are different proof systems for the same logics, and it is customary to separate purely logical rules (dealing with propositional connectives and quantifiers, for example) from rules or axioms for some particular domain (like axioms about arithmetic, or whatever domain is of interest).
This is the start of Season 3 of the podcast. We are beginning with a new chapter (Chapter 14) on Proof Theory. This episode gives some basic introduction to the history and broad concerns of proof theory. There is even music, composed and played by yours truly, chez nous...
In this episode I discuss the paper "Modula-2 and Oberon" by Niklaus Wirth. Modula-2 introduced (it seems from the paper) the idea of modules with explicit export and import lists -- something we saw at the start of our look at module systems with Haskell's module system. I note some interesting historical points raised by the paper.
Just as a datatype can be decomposed into a functor (which can be built from other functors, to assemble a bigger language from smaller pieces of languages) and a single expression datatype with a sole constructor that builds an Expr from an F Expr (where F is the functor), a recursive function can be decomposed into an algebra and a fold function that applies the algebra throughout the datatype.
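A minimal Haskell sketch of this decomposition, in the spirit of Swierstra's paper (the concrete names ExprF and evalAlg are illustrative, not from the episode):

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- One layer of an expression language; the parameter r marks recursive positions.
data ExprF r = Lit Int | Add r r
  deriving Functor

-- The fixed point ties the layers back into a recursive datatype.
newtype Fix f = In (f (Fix f))

type Expr = Fix ExprF

-- An algebra collapses a single layer whose children are already results.
type Algebra f a = f a -> a

evalAlg :: Algebra ExprF Int
evalAlg (Lit n)   = n
evalAlg (Add x y) = x + y

-- The fold (catamorphism) applies the algebra at every layer, bottom up.
cata :: Functor f => Algebra f a -> Fix f -> a
cata alg (In layer) = alg (fmap (cata alg) layer)
```

For example, `cata evalAlg (In (Add (In (Lit 1)) (In (Lit 2))))` evaluates the expression 1 + 2.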
Last episode we discussed how functors can describe a single level of a datatype. In this episode, we discuss how to put these functors back together into a datatype, using disjoint unions of functors and a fixed-point datatype. The latter expresses the idea that inductive data is built in any finite number of layers, where each layer is described by the functor for the datatype.
This episode continues the discussion of Swierstra's paper "Datatypes à la Carte", explaining how we can decompose a datatype into the application of a fixed-point type constructor and then a functor. The functor itself can be assembled from other functors for pieces of the datatype. This makes modular datatypes possible.
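The assembly step can be sketched in Haskell with a coproduct of functors, following the paper's (:+:) construction (the fragment names LitF and NegF are illustrative; the paper combines the per-fragment functions with a type class, which I inline here for brevity):

```haskell
{-# LANGUAGE DeriveFunctor, TypeOperators #-}

-- Two independent language fragments, each described by its own functor.
data LitF r = Lit Int  deriving Functor
data NegF r = Neg r    deriving Functor

-- Coproduct of functors: a layer is either a LitF layer or a NegF layer.
data (f :+: g) r = InL (f r) | InR (g r)
  deriving Functor

-- The fixed point assembles the combined functor into a datatype.
newtype Fix f = In (f (Fix f))

type Expr = Fix (LitF :+: NegF)

-- Functions on the pieces combine into a function on the whole.
evalAlg :: (LitF :+: NegF) Int -> Int
evalAlg (InL (Lit n)) = n
evalAlg (InR (Neg x)) = negate x

eval :: Expr -> Int
eval (In layer) = evalAlg (fmap eval layer)
```

For example, `eval (In (InR (Neg (In (InL (Lit 3))))))` evaluates the expression -3.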
In a really wonderful paper from a few years back, Swierstra introduced the idea of modular datatypes using ideas from universal algebra. Modular datatypes allow one to assemble a bigger datatype from component datatypes, and to combine functions written on the component datatypes in a modular way. In this episode I introduce the paper and the problem it is trying to solve (dubbed the expression problem by Phil Wadler). Modular datatypes are a different form of modularity that I would like to consider in the context of our ongoing discussion of module systems in Chapter 13 of the podcast.