Directory:Jon Awbrey/Papers/Differential Logic : Introduction
Author: Jon Awbrey
Differential logic is the component of logic whose object is the description of variation — for example, the aspects of change, difference, distribution, and diversity — in universes of discourse that are subject to logical description. A definition that broad naturally incorporates any study of variation by way of mathematical models, but differential logic is especially charged with the qualitative aspects of variation that pervade or precede quantitative models. To the extent that a logical inquiry makes use of a formal system, its differential component treats the principles that govern the use of a differential logical calculus, that is, a formal system with the expressive capacity to describe change and diversity in a logical universe of discourse.
A simple example of a differential logical calculus is furnished by a differential propositional calculus. A differential propositional calculus is a propositional calculus extended by a set of terms for describing aspects of change and difference, for example, processes that take place in a universe of discourse or transformations that map a source universe into a target universe. This augments ordinary propositional calculus in the same way that the differential calculus of Leibniz and Newton augments the analytic geometry of Descartes.
Quick Overview
Cactus Language for Propositional Logic
The development of differential logic is greatly facilitated by having a conceptually efficient calculus in place at the level of boolean-valued functions and elementary logical propositions. A calculus that is very efficient from both conceptual and computational standpoints is based on just two types of logical connectives, both of variable <math>k\!</math>-ary scope. The formulas of this calculus map into a species of graph-theoretical structures called painted and rooted cacti (PARCs) that lend visual representation to their functional structure and smooth the path to efficient computation.
The first kind of propositional expression is a parenthesized sequence of propositional expressions, written as <math>\texttt{(} e_1 \texttt{,} e_2 \texttt{,} \ldots \texttt{,} e_{k-1} \texttt{,} e_k \texttt{)}\!</math> and read to say that exactly one of the propositions <math>e_1, e_2, \ldots, e_{k-1}, e_k\!</math> is false, in other words, that their minimal negation is true. A clause of this form maps into a PARC structure called a ''lobe'', in this case, one that is painted with the colors <math>e_1, e_2, \ldots, e_{k-1}, e_k\!</math> as shown below.
[Figure: lobe connective painted with <math>e_1, e_2, \ldots, e_{k-1}, e_k\!</math>]
The second kind of propositional expression is a concatenated sequence of propositional expressions, written as <math>e_1\ e_2\ \ldots\ e_{k-1}\ e_k\!</math> and read to say that all of the propositions <math>e_1, e_2, \ldots, e_{k-1}, e_k\!</math> are true, in other words, that their logical conjunction is true. A clause of this form maps into a PARC structure called a ''node'', in this case, one that is painted with the colors <math>e_1, e_2, \ldots, e_{k-1}, e_k\!</math> as shown below.

[Figure: node connective painted with <math>e_1, e_2, \ldots, e_{k-1}, e_k\!</math>]
All other propositional connectives can be obtained through combinations of these two forms. Strictly speaking, the parenthesized form is sufficient to define the concatenated form, making the latter formally dispensable, but it is convenient to maintain it as a concise way of expressing more complicated combinations of parenthesized forms. While working with expressions solely in propositional calculus, it is easiest to use plain parentheses for logical connectives. In contexts where ordinary parentheses are needed for other purposes an alternate typeface <math>\texttt{(} \ldots \texttt{)}\!</math> may be used for logical operators.
Table 1 collects a sample of basic propositional forms as expressed in terms of cactus language connectives.
{| align="center" border="1" cellpadding="8" cellspacing="0" style="text-align:center; width:75%"
|+ <math>\text{Table 1.} ~~ \text{Syntax and Semantics of a Calculus for Propositional Logic}\!</math>
|- style="background:ghostwhite"
| <math>\text{Expression}~\!</math>
| <math>\text{Interpretation}\!</math>
| <math>\text{Other Notations}\!</math>
|-
|
| <math>\text{True}\!</math>
| <math>1\!</math>
|-
| <math>\texttt{(~)}\!</math>
| <math>\text{False}\!</math>
| <math>0\!</math>
|-
| <math>x\!</math>
| <math>x\!</math>
| <math>x\!</math>
|-
| <math>\texttt{(} x \texttt{)}\!</math>
| <math>\text{Not}~ x\!</math>
| <math>\begin{matrix} x' \\ \tilde{x} \\ \lnot x \end{matrix}\!</math>
|-
| <math>x~y~z\!</math>
| <math>x ~\text{and}~ y ~\text{and}~ z\!</math>
| <math>x \land y \land z\!</math>
|-
| <math>\texttt{((} x \texttt{)(} y \texttt{)(} z \texttt{))}\!</math>
| <math>x ~\text{or}~ y ~\text{or}~ z\!</math>
| <math>x \lor y \lor z\!</math>
|-
| <math>\texttt{(} x ~ \texttt{(} y \texttt{))}\!</math>
| <math>\begin{matrix} x ~\text{implies}~ y \\ \text{If}~ x ~\text{then}~ y \end{matrix}</math>
| <math>x \Rightarrow y\!</math>
|-
| <math>\texttt{(} x \texttt{,} y \texttt{)}\!</math>
| <math>\begin{matrix} x ~\text{not equal to}~ y \\ x ~\text{exclusive or}~ y \end{matrix}</math>
| <math>\begin{matrix} x \ne y \\ x + y \end{matrix}</math>
|-
| <math>\texttt{((} x \texttt{,} y \texttt{))}\!</math>
| <math>\begin{matrix} x ~\text{is equal to}~ y \\ x ~\text{if and only if}~ y \end{matrix}</math>
| <math>\begin{matrix} x = y \\ x \Leftrightarrow y \end{matrix}</math>
|-
| <math>\texttt{(} x \texttt{,} y \texttt{,} z \texttt{)}\!</math>
| <math>\begin{matrix} \text{Just one of} \\ x, y, z \\ \text{is false}. \end{matrix}</math>
| <math>\begin{matrix} x'y~z~ & \lor \\ x~y'z~ & \lor \\ x~y~z' & \end{matrix}</math>
|-
| <math>\texttt{((} x \texttt{),(} y \texttt{),(} z \texttt{))}\!</math>
| <math>\begin{matrix} \text{Just one of} \\ x, y, z \\ \text{is true}. \\ & \\ \text{Partition all} \\ \text{into}~ x, y, z. \end{matrix}</math>
| <math>\begin{matrix} x~y'z' & \lor \\ x'y~z' & \lor \\ x'y'z~ & \end{matrix}</math>
|-
| <math>\begin{matrix} \texttt{((} x \texttt{,} y \texttt{),} z \texttt{)} \\ & \\ \texttt{(} x \texttt{,(} y \texttt{,} z \texttt{))} \end{matrix}\!</math>
| <math>\begin{matrix} \text{Oddly many of} \\ x, y, z \\ \text{are true}. \end{matrix}\!</math>
|
<p><math>x + y + z\!</math></p>
<br>
<p><math>\begin{matrix} x~y~z~ & \lor \\ x~y'z' & \lor \\ x'y~z' & \lor \\ x'y'z~ & \end{matrix}\!</math></p>
|-
| <math>\texttt{(} w \texttt{,(} x \texttt{),(} y \texttt{),(} z \texttt{))}\!</math>
| <math>\begin{matrix} \text{Partition}~ w \\ \text{into}~ x, y, z. \\ & \\ \text{Genus}~ w ~\text{comprises} \\ \text{species}~ x, y, z. \end{matrix}</math>
| <math>\begin{matrix} w'x'y'z' & \lor \\ w~x~y'z' & \lor \\ w~x'y~z' & \lor \\ w~x'y'z~ & \end{matrix}</math>
|}
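To make the readings in Table 1 concrete, here is a minimal Python sketch of the two cactus primitives over <math>\mathbb{B} = \{ 0, 1 \}.\!</math> The helper names <code>conj</code> and <code>lobe</code> are illustrative choices of my own, not notation from the text, and the sketch is only a spot check of a few rows of the Table.

<syntaxhighlight lang="python">
from itertools import product

def conj(*es):
    """Concatenated form e_1 e_2 ... e_k : true iff every argument is true."""
    return int(all(es))

def lobe(*es):
    """Parenthesized form (e_1, ..., e_k) : true iff exactly one argument is false."""
    return int(sum(1 for e in es if not e) == 1)

# Check several rows of Table 1 over all assignments in B x B x B.
for x, y, z in product((0, 1), repeat=3):
    assert lobe(x) == 1 - x                                      # (x)         =  not x
    assert lobe(conj(lobe(x), lobe(y), lobe(z))) == (x | y | z)  # ((x)(y)(z)) =  x or y or z
    assert lobe(conj(x, lobe(y))) == ((1 - x) | y)               # (x (y))     =  x implies y
    assert lobe(x, y) == (x ^ y)                                 # (x, y)      =  exclusive or
    assert lobe(lobe(x, y)) == int(x == y)                       # ((x, y))    =  equivalence
</syntaxhighlight>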
The simplest expression for logical truth is the empty word, usually denoted by <math>\boldsymbol\varepsilon\!</math> or <math>\lambda\!</math> in formal languages, where it forms the identity element for concatenation. To make it visible in context, it may be denoted by the equivalent expression <math>{}^{\backprime\backprime} \texttt{((~))} {}^{\prime\prime},\!</math> or, especially if operating in an algebraic context, by a simple <math>{}^{\backprime\backprime} 1 {}^{\prime\prime}.\!</math> Also when working in an algebraic mode, the plus sign <math>{}^{\backprime\backprime} + {}^{\prime\prime}\!</math> may be used for exclusive disjunction. For example, we have the following paraphrases of algebraic expressions by means of parenthesized expressions:
{| align="center" cellpadding="8"
|
<math>\begin{matrix}
a + b & = & \texttt{(} a \texttt{,} b \texttt{)}
\end{matrix}\!</math>
|-
|
<math>\begin{matrix}
a + b + c & = & \texttt{(} a \texttt{,(} b \texttt{,} c \texttt{))} & = & \texttt{((} a \texttt{,} b \texttt{),} c \texttt{)}
\end{matrix}\!</math>
|}
It is important to note that the last expressions are not equivalent to the 3-place parenthesis <math>\texttt{(} a \texttt{,} b \texttt{,} c \texttt{)}.\!</math>
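A quick brute-force check of these identities, reusing the illustrative <code>lobe</code> helper sketched above (again my own name for the parenthesized connective, not the text's notation):

<syntaxhighlight lang="python">
from itertools import product

def lobe(*es):
    """Parenthesized form (e_1, ..., e_k): true iff exactly one argument is false."""
    return int(sum(1 for e in es if not e) == 1)

for a, b, c in product((0, 1), repeat=3):
    assert lobe(a, b) == (a + b) % 2                  # (a, b)       =  a + b
    assert lobe(a, lobe(b, c)) == (a + b + c) % 2     # (a, (b, c))  =  a + b + c
    assert lobe(lobe(a, b), c) == (a + b + c) % 2     # ((a, b), c)  =  a + b + c

# The 3-place form (a, b, c) says "exactly one of a, b, c is false",
# which is not the same proposition as the triple sum a + b + c:
assert lobe(1, 1, 1) == 0 and (1 + 1 + 1) % 2 == 1
</syntaxhighlight>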
Differential Expansions of Propositions
Bird's Eye View
An efficient calculus for the realm of logic represented by boolean functions and elementary propositions makes it feasible to compute the finite differences and the differentials of those functions and propositions.
For example, consider a proposition of the form <math>{}^{\backprime\backprime} \, p ~\mathrm{and}~ q \, {}^{\prime\prime}\!</math> that is graphed as two letters attached to a root node:
[Figure: cactus graph of <math>p~q\!</math>]
Written as a string, this is just the concatenation <math>p~q\!</math>.
The proposition <math>pq\!</math> may be taken as a boolean function <math>f(p, q)\!</math> having the abstract type <math>f : \mathbb{B} \times \mathbb{B} \to \mathbb{B},\!</math> where <math>\mathbb{B} = \{ 0, 1 \}~\!</math> is read in such a way that <math>0\!</math> means <math>\mathrm{false}\!</math> and <math>1\!</math> means <math>\mathrm{true}.\!</math>
Imagine yourself standing in a fixed cell of the corresponding venn diagram, say, the cell where the proposition <math>pq\!</math> is true, as shown in the following Figure:
[Figure: venn diagram with the cell <math>pq\!</math> indicated]
Now ask yourself: What is the value of the proposition <math>pq\!</math> at a distance of <math>\mathrm{d}p\!</math> and <math>\mathrm{d}q\!</math> from the cell <math>pq\!</math> where you are standing?
Don't think about it — just compute:
[Figure: cactus graph of <math>\texttt{(} p \texttt{,} \mathrm{d}p \texttt{)(} q \texttt{,} \mathrm{d}q \texttt{)}\!</math>]
The cactus formula <math>\texttt{(p, dp)(q, dq)}\!</math> and its corresponding graph arise by substituting <math>p + \mathrm{d}p\!</math> for <math>p\!</math> and <math>q + \mathrm{d}q\!</math> for <math>q\!</math> in the boolean product or logical conjunction <math>pq\!</math> and writing the result in the two dialects of cactus syntax. This follows from the fact that the boolean sum <math>p + \mathrm{d}p\!</math> is equivalent to the logical operation of exclusive disjunction, which parses to a cactus graph of the following form:
[Figure: cactus graph of <math>p + \mathrm{d}p\!</math> as the exclusive disjunction <math>\texttt{(} p \texttt{,} \mathrm{d}p \texttt{)}\!</math>]
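The substitution can be carried out mechanically over <math>\mathbb{B}.\!</math> The following minimal Python sketch (variable and helper names are mine, not notation from the text) replaces <math>p\!</math> with <math>p + \mathrm{d}p\!</math> and <math>q\!</math> with <math>q + \mathrm{d}q\!</math> in <math>pq,\!</math> reading <math>+\!</math> as exclusive disjunction, and checks the result against the cactus reading of <math>\texttt{(p, dp)(q, dq)}\!</math>:

<syntaxhighlight lang="python">
from itertools import product

f = lambda p, q: p & q          # the proposition pq as a boolean function on B x B

def Ef(p, q, dp, dq):
    """Enlarged proposition: substitute p + dp for p and q + dq for q (+ is xor over B)."""
    return f(p ^ dp, q ^ dq)

# The cactus reading of (p, dp)(q, dq): each two-place lobe (x, y) says
# "exactly one of x, y is false", i.e. x xor y, and concatenation is conjunction.
for p, q, dp, dq in product((0, 1), repeat=4):
    assert Ef(p, q, dp, dq) == ((p ^ dp) & (q ^ dq))
</syntaxhighlight>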
Next question: What is the difference between the value of the proposition <math>pq\!</math> over there, at a distance of <math>\mathrm{d}p\!</math> and <math>\mathrm{d}q,\!</math> and the value of the proposition <math>pq\!</math> where you are standing, all expressed in the form of a general formula, of course? Here is the appropriate formulation:
[Figure: cactus formula for the difference between the two values]
There is one thing that I ought to mention at this point: Computed over <math>\mathbb{B},\!</math> plus and minus are identical operations. This will make the relation between the differential and the integral parts of the appropriate calculus slightly stranger than usual, but we will get into that later.
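Spelled out in one line, the reason is that arithmetic over <math>\mathbb{B}\!</math> is arithmetic modulo <math>2,\!</math> where <math>-1 \equiv 1,\!</math> so that:

{| align="center" cellpadding="8"
|
<math>x - y ~\equiv~ x + y \pmod{2} \quad \text{for all}~ x, y \in \mathbb{B}.</math>
|}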
Last question, for now: What is the value of this expression from your current standpoint, that is, evaluated at the point where <math>pq\!</math> is true? Well, substituting <math>1\!</math> for <math>p\!</math> and <math>1\!</math> for <math>q\!</math> in the graph amounts to erasing the labels <math>p\!</math> and <math>q\!</math>, as shown here:
[Figure: the graph with the labels <math>p\!</math> and <math>q\!</math> erased]
And this is equivalent to the following graph:
[Figure: the equivalent cactus graph]
We have just met with the fact that the differential of the and is the or of the differentials.
{| align="center" cellpadding="8"
|
<math>\begin{matrix}
p ~\mathrm{and}~ q
& \quad &
\xrightarrow{\quad\mathrm{Diff}\quad}
& \quad &
\mathrm{d}p ~\mathrm{or}~ \mathrm{d}q
\end{matrix}\!</math>
|}
[Figure: the same statement in cactus graph form]
It will be necessary to develop a more refined analysis of that statement directly, but that is roughly the nub of it.
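In the meantime, the computation itself can be spot-checked. The sketch below (variable names are mine) forms the difference as the exclusive disjunction of the shifted and unshifted values, using the fact noted above that sum and difference coincide over <math>\mathbb{B},\!</math> and evaluates it at the cell where <math>pq\!</math> is true:

<syntaxhighlight lang="python">
from itertools import product

f = lambda p, q: p & q                      # the proposition pq

def Ef(p, q, dp, dq):
    """Value of pq at a distance of dp and dq from the cell (p, q)."""
    return f(p ^ dp, q ^ dq)

def Df(p, q, dp, dq):
    """Difference between the value over there and the value here (+ = xor over B)."""
    return Ef(p, q, dp, dq) ^ f(p, q)

# Standing in the cell where pq is true, i.e. p = q = 1, the difference
# reduces to "dp or dq": the differential of the and is the or of the differentials.
for dp, dq in product((0, 1), repeat=2):
    assert Df(1, 1, dp, dq) == (dp | dq)
</syntaxhighlight>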
If the form of the above statement reminds you of De Morgan's rule, it is no accident, as differentiation and negation turn out to be closely related operations. Indeed, one can find discussions of logical difference calculus in the Boole–De Morgan correspondence, and Peirce also made use of differential operators in a logical context, but the exploration of these ideas has been hampered by a number of factors, not the least of which has been the lack of a syntax adequate to handle the complexity of the expressions that evolve.
Worm's Eye View
Let's run through the initial example again, this time attempting to interpret the formulas that develop at each stage along the way. We begin with a proposition or a boolean function <math>f(p, q) = pq.\!</math>
[Figure: venn diagram and cactus graph of <math>f(p, q) = pq\!</math>]
A function like this has an abstract type and a concrete type. The abstract type is what we invoke when we write things like <math>f : \mathbb{B} \times \mathbb{B} \to \mathbb{B}\!</math> or <math>f : \mathbb{B}^2 \to \mathbb{B}.\!</math> The concrete type takes into account the qualitative dimensions or the “units” of the case, which can be explained as follows.
{| align="center" cellpadding="8"
| Let <math>P\!</math> be the set of values <math>\{ \texttt{(} p \texttt{)},~ p \} ~=~ \{ \mathrm{not}~ p,~ p \} ~\cong~ \mathbb{B}.\!</math>
|-
| Let <math>Q\!</math> be the set of values <math>\{ \texttt{(} q \texttt{)},~ q \} ~=~ \{ \mathrm{not}~ q,~ q \} ~\cong~ \mathbb{B}.\!</math>
|}
Then interpret the usual propositions about <math>p, q\!</math> as functions of the concrete type <math>f : P \times Q \to \mathbb{B}.\!</math>
We are going to consider various operators on these functions. Here, an operator <math>\mathrm{F}\!</math> is a function that takes one function <math>f\!</math> into another function <math>\mathrm{F}f.\!</math>
The first couple of operators that we need to consider are logical analogues of the pair that play a founding role in the classical finite difference calculus, namely:
{| align="center" cellpadding="8"
| The ''difference operator'' <math>\Delta,\!</math> written here as <math>\mathrm{D}.\!</math>
|-
| The ''enlargement operator'' <math>\Epsilon,\!</math> written here as <math>\mathrm{E}.\!</math>
|}
These days, <math>\mathrm{E}\!</math> is more often called the ''shift operator''.
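In code, such operators are naturally higher-order functions, taking a proposition <math>f\!</math> to a new proposition on an enlarged set of coordinates. The sketch below is only a schematic rendering for the two-variable case, following the shift-and-difference pattern used informally in the Bird's Eye View; the names and signatures are mine, not the text's.

<syntaxhighlight lang="python">
# Operators on two-variable propositions f : B x B -> B, returning
# propositions on the enlarged coordinates (p, q, dp, dq).

def E(f):
    """Enlargement (shift) operator: (Ef)(p, q, dp, dq) = f(p + dp, q + dq)."""
    return lambda p, q, dp, dq: f(p ^ dp, q ^ dq)

def D(f):
    """Difference operator: Df = Ef + f, with + read as xor over B."""
    Ef = E(f)
    return lambda p, q, dp, dq: Ef(p, q, dp, dq) ^ f(p, q)

# Example with the running proposition f(p, q) = pq:
f = lambda p, q: p & q
print(E(f)(1, 1, 1, 0))   # 0 : shifting p out of the cell pq makes pq false
print(D(f)(1, 1, 1, 0))   # 1 : so the difference at that point is 1
</syntaxhighlight>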
In order to describe the universe in which these operators operate, it is necessary to enlarge the original universe of discourse. Starting from the initial space <math>X = P \times Q,\!</math> its (first order) differential extension <math>\mathrm{E}X\!</math> is constructed according to the following specifications:
{| align="center" cellpadding="8"
|
<math>\begin{array}{rcc}
\mathrm{E}X & = & X \times \mathrm{d}X
\end{array}\!</math>
|}
where:
{| align="center" cellpadding="8"
|
<math>\begin{array}{rcc}
X & = & P \times Q
\\
\mathrm{d}X & = & \mathrm{d}P \times \mathrm{d}Q
\end{array}\!</math>
|}

{| align="center" cellpadding="0" cellspacing="0" width="90%"
|
<p>Consider what effects that might ''conceivably'' have practical bearings you ''conceive'' the objects of your ''conception'' to have. Then, your ''conception'' of those effects is the whole of your ''conception'' of the object.</p>
|-
| align="right" | — Charles Sanders Peirce, “Issues of Pragmaticism”, (CP 5.438)
|}

One other subject that it would be opportune to mention at this point, while we have an object example of a mathematical group fresh in mind, is the relationship between the pragmatic maxim and what are commonly known in mathematics as ''representation principles''. As it turns out, with regard to its formal characteristics, the pragmatic maxim unites the aspects of a representation principle with the attributes of what would ordinarily be known as a ''closure principle''. We will consider the form of closure that is invoked by the pragmatic maxim on another occasion, focusing here and now on the topic of group representations.

Let us return to the example of the ''four-group'' <math>V_4.\!</math> We encountered this group in one of its concrete representations, namely, as a transformation group that acts on a set of objects, in this case a set of sixteen functions or propositions. Forgetting about the set of objects that the group transforms among themselves, we may take the abstract view of the group's operational structure, for example, in the form of the group operation table copied here:
This table is abstractly the same as, or isomorphic to, the versions with the <math>\mathrm{E}_{ij}\!</math> operators and the <math>\mathrm{T}_{ij}\!</math> transformations that we took up earlier. That is to say, the story is the same, only the names have been changed. An abstract group can have a variety of significantly and superficially different representations. But even after we have long forgotten the details of any particular representation there is a type of concrete representations, called ''regular representations'', that are always readily available, as they can be generated from the mere data of the abstract operation table itself.

To see how a regular representation is constructed from the abstract operation table, select a group element from the top margin of the Table, and “consider its effects” on each of the group elements as they are listed along the left margin. We may record these effects as Peirce usually did, as a logical aggregate of elementary dyadic relatives, that is, as a logical disjunction or boolean sum whose terms represent the ordered pairs of <math>\mathrm{input} : \mathrm{output}\!</math> transactions that are produced by each group element in turn. This forms one of the two possible regular representations of the group, in this case the one that is called the ''post-regular representation'' or the ''right regular representation''. It has long been conventional to organize the terms of this logical aggregate in the form of a matrix:

Reading “<math>+\!</math>” as a logical disjunction:
And so, by expanding effects, we get:
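The expansion is mechanical enough to compute. Here is a minimal Python sketch that generates the right regular representation directly from an abstract operation table; the labels <math>\mathrm{e}, \mathrm{f}, \mathrm{g}, \mathrm{h}\!</math> below are an assumed naming of the elements of <math>V_4\!</math> (with <math>\mathrm{e}\!</math> the identity), not necessarily the names used for the group elsewhere in the text.

<syntaxhighlight lang="python">
# The Klein four-group V_4 under an assumed labeling e, f, g, h with e the identity.
# table[x][u] is the product x * u, read from row x and column u of the operation table.
V4 = ["e", "f", "g", "h"]

table = {
    "e": {"e": "e", "f": "f", "g": "g", "h": "h"},
    "f": {"e": "f", "f": "e", "g": "h", "h": "g"},
    "g": {"e": "g", "f": "h", "g": "e", "h": "f"},
    "h": {"e": "h", "f": "g", "g": "f", "h": "e"},
}

def right_regular(u):
    """Effects of the element u on each element x down the left margin: pairs (x : x * u)."""
    return [(x, table[x][u]) for x in V4]

# Each group element expands into a boolean sum of input : output pairs.
for u in V4:
    terms = " + ".join(f"({x}:{y})" for x, y in right_regular(u))
    print(f"{u}  =  {terms}")
</syntaxhighlight>

Running the sketch prints, for example, <code>f = (e:f) + (f:e) + (g:h) + (h:g)</code>, which is the kind of logical aggregate of elementary dyadic relatives described above.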
- Artificial Intelligence
- Boolean Algebra
- Boolean Functions
- Charles Sanders Peirce
- Combinatorics
- Computational Complexity
- Computer Science
- Cybernetics
- Differential Logic
- Equational Reasoning
- Formal Languages
- Formal Systems
- Graph Theory
- Inquiry
- Inquiry Driven Systems
- Knowledge Representation
- Logic
- Logical Graphs
- Mathematics
- Philosophy
- Propositional Calculus
- Semiotics
- Visualization