Formal Logic

Code For Cash

Formal logic is the study of statements, or propositions, and deductive arguments. It strips away the ambiguity of natural language to focus on the application of reason. To do this, formal logic abstracts away the content of statements and replaces it with symbolic notation.

Formal logic is a priori, meaning that it does not rely on observation for data. In this regard it is more analogous to mathematics than to the empirical sciences. The process of reasoning and the psychology of reasoning are separate from formal logic. It is also distinct from the art of correct reasoning, since that is the application of logical principles to particular cases. Formal logic serves as the foundation for organized reasoning such as scientific studies or criminal investigations. Circuit logic, the logic of computers, is a direct analog of formal logic.

This is just an overview, or a refresher for some, of the topics of formal logic. It has many direct applications to computer science and programming. Developers often have to not only reason through confusing data but also write code that tells the computer how to draw conclusions from data input by users. For more in-depth details, check out the book mentioned earlier.

Episode Breakdown

A statement, or proposition, is a sentence with a truth value.

It can either be true or false. ‘Some apples are red’ is a statement because it can be verified as true or false. ‘Blue is the best color’ is not a statement because it is an opinion.

Statements are represented by capital letters such as A, B, or C. These could represent literal statements such as in language. They could also represent mathematical statements that can be proven true or false.

They are used as the basis for logical comparisons. Formal logic compares the truth values of statements to each other. What a statement actually says is irrelevant in logic; only its truth value matters.

Logical connectives are used to compare the truth values of statements.

Logical conjunction compares truth values using boolean AND. For a conjunction of A and B to be true, both A and B must be true. The statements A and B are called conjuncts. It is symbolized as a triangle with the base removed: /\.

Logical disjunction is the same as boolean OR. For a disjunction of A and B to be true, either A or B must be true. The statements A and B are called disjuncts. It is symbolized as an inverted triangle with the base removed: \/.
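As a minimal sketch, conjunction and disjunction map directly onto Python's `and` and `or` operators:

```python
# Conjunction (/\) and disjunction (\/) using Python's boolean operators.
A, B = True, False

conjunction = A and B  # true only when both conjuncts are true
disjunction = A or B   # true when either disjunct is true

print(conjunction)  # False
print(disjunction)  # True
```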

Statements combined to state “if A, then B” are logical implications. They are read as “A implies B.” Logical implications are symbolized by an arrow pointing from A to B: A -> B. In them, the truth of A leads to the truth of B. If A is true and B is false, the implication is false. If A is false, the implication is true regardless of the value of B.
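Python has no built-in implication operator, but a small helper function (a sketch, not part of any library) captures the truth table described above:

```python
def implies(a: bool, b: bool) -> bool:
    """A -> B: false only when A is true and B is false."""
    return (not a) or b

# The only false row of the implication truth table:
print(implies(True, False))   # False
# A false antecedent makes the implication true regardless of B:
print(implies(False, True))   # True
print(implies(False, False))  # True
```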

Equivalence is not really a connective but rather a shortcut for a more complicated set of connectives. It represents (A -> B) /\ (B -> A), or “A implies B and B implies A.” It is true only when A and B have the same truth value.
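A sketch of equivalence built exactly as the shortcut above, from two implications conjoined (for Python booleans this reduces to a simple `==` check):

```python
def implies(a: bool, b: bool) -> bool:
    return (not a) or b

def equivalent(a: bool, b: bool) -> bool:
    """A <-> B as the shortcut (A -> B) /\ (B -> A)."""
    return implies(a, b) and implies(b, a)

# Equivalence holds exactly when both sides share a truth value:
print(equivalent(True, True))   # True
print(equivalent(True, False))  # False
```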

A commonly used unary connective is negation. It returns the opposite of a statement’s boolean value. Negation is symbolized as A’.

Truth tables represent all possible boolean values of statements and connectives.

Each column of the truth table represents a statement or connective. It is simplest to start with the statements. Then, if you have complex connectives, break them down into simpler ones. Finally, combine the simpler connectives. This simplifies the process of comparing complex statements into comparing the boolean values of simpler statements.

Each row represents a set of boolean values for each statement and the resulting comparison. To compare two statements, the following combinations of boolean values are required: (True, True), (True, False), (False, True), and (False, False). Then compare the values using the connectives in their columns.

You will have 2^n rows, where n is the number of statements. The table grows exponentially as the number of statements increases, so for the smallest comparison of 2 statements you’ll need a truth table with 4 rows.
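The 2^n rows can be generated mechanically. A minimal sketch using Python's standard `itertools.product` to enumerate every assignment of truth values:

```python
from itertools import product

def truth_table(n, expr):
    """Enumerate all 2**n assignments of n statements and evaluate expr on each."""
    rows = []
    for values in product([True, False], repeat=n):
        rows.append((values, expr(*values)))
    return rows

# 2 statements -> 2**2 = 4 rows, here for the conjunction A /\ B:
for values, result in truth_table(2, lambda a, b: a and b):
    print(values, result)
```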

Statements, connectives, and parentheses can be strung together to create expressions.

Expressions that are legitimate strings are called a well-formed formula or wff. This applies the rules of syntax or sentence structure to expressions. These can be statements or expressions of statements and connectives. Wffs have their own truth value and can themselves be used as statements.

An order of operations is used to reduce the need for parentheses, just as in programming.

  • 1. Parentheses with inner most first.
  • 2. Negation.
  • 3. Conjunction.
  • 4. Disjunction.
  • 5. Implication.
  • 6. Equivalence.
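This precedence mirrors Python's own boolean operators, where `not` binds tighter than `and`, which binds tighter than `or`. A sketch showing that parentheses change the result:

```python
# Precedence in Python: not > and > or, matching negation > conjunction > disjunction.
A, B, C = False, True, False

# A' \/ B /\ C parses as (A') \/ (B /\ C):
print(not A or B and C)    # True
# Explicit parentheses force a different grouping and a different result:
print((not A or B) and C)  # False
```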

A tautology is a wff that is intrinsically true no matter the boolean values of the statements.

The simplest example is A \/ A’. It is always true no matter the value of A.

A contradiction is the opposite of a tautology: it is always false. The simplest example is A /\ A’. It will always be false no matter the value of A.
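Both properties can be checked mechanically by testing every assignment. A sketch (the helper names `is_tautology` and `is_contradiction` are my own):

```python
from itertools import product

def is_tautology(expr, n):
    """True when expr is true for every assignment of its n statements."""
    return all(expr(*vals) for vals in product([True, False], repeat=n))

def is_contradiction(expr, n):
    """True when expr is false for every assignment of its n statements."""
    return not any(expr(*vals) for vals in product([True, False], repeat=n))

print(is_tautology(lambda a: a or not a, 1))       # True:  A \/ A'
print(is_contradiction(lambda a: a and not a, 1))  # True:  A /\ A'
```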

One of the most well-known tautologies is De Morgan’s Law. It states that (A \/ B)’ <=> A’ /\ B’. The negation of A OR B is the same as the negation of A AND the negation of B. This is a core piece of boolean logic.

Equivalence rules state that certain pairs of wffs are equivalent or return the same truth value for a given input.

The commutative rule states that the order of statements does not matter when conjoining or disjoining two values. A /\ B is the same as B /\ A. A \/ B is the same as B \/ A.

The associative rule states that when conjoining or disjoining three or more statements the order does not matter. (A /\ B) /\ C is the same as A /\ (B /\ C). (A \/ B) \/ C is the same as A \/ (B \/ C).

De Morgan’s Law states that the negation of a conjunction or disjunction is the same as negating the statements and applying the opposite connective. (A /\ B)’ is the same as A’ \/ B’. (A \/ B)’ is the same as A’ /\ B’. Notice that the conjunction becomes a disjunction and vice versa.
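Both forms of De Morgan's Law can be verified exhaustively, since there are only four assignments to check:

```python
from itertools import product

# Check both forms of De Morgan's Law over every truth assignment:
for a, b in product([True, False], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))  # (A /\ B)' == A' \/ B'
    assert (not (a or b)) == ((not a) and (not b))  # (A \/ B)' == A' /\ B'

print("De Morgan's Law holds for all assignments")
```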

The implication rule states that A implies B is the same as the negation of A disjoined with B. A -> B is the same as A’ \/ B. Likewise, A’ -> B is the same as A \/ B. This is very useful for implementing an inference in code using an OR statement.
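A sketch verifying the implication rule: defining A -> B directly from its truth table (false only when A is true and B is false) and checking it matches A' \/ B on every input:

```python
from itertools import product

def implies(a, b):
    """A -> B, defined by its truth table."""
    return not (a and not b)

# The implication rule: A -> B is the same as A' \/ B.
for a, b in product([True, False], repeat=2):
    assert implies(a, b) == ((not a) or b)

print("A -> B  ==  A' \\/ B for all inputs")
```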

The definition of equivalence is also an equivalence rule. This was stated earlier as a shortcut for connectives. In reality it is an equivalence rule or more directly the definition of equivalence.

Inference rules state that if you have certain wffs and/or statements that you can derive others from them.

Modus ponens states that if you know a statement and know that statement implies another statement then you can derive the second statement. A, A -> B gives you B. You can prove this by using a truth table. If you know the values for A and A -> B then you can derive B from the truth table.

Modus tollens states that if you have a statement that implies another statement and you know the negation of the second statement, you can infer the negation of the first statement. A -> B, B’ gives you A’. This can get confusing when there are negations in the implication and you have to use double negation.
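Both inference rules can be read off the implication truth table. A sketch that builds the table as rows of (A, B, A -> B) and filters it:

```python
# Each row is (A, B, A -> B), where A -> B is encoded as (not A) or B.
rows = [(a, b, (not a) or b) for a in (True, False) for b in (True, False)]

# Modus ponens: in every row where A and A -> B are both true, B is true.
assert all(b for a, b, impl in rows if a and impl)

# Modus tollens: in every row where A -> B is true and B is false, A is false.
assert all(not a for a, b, impl in rows if impl and not b)

print("modus ponens and modus tollens verified")
```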

Conjunction states that if you know two statements you know their conjunction. A, B gives you A /\ B. You can create a truth table to prove this.

Simplification states that if you have a conjunction you can get the individual statements. A /\ B gives you A, B. This only works when A /\ B is known to be true; if A /\ B is false, then A, B, or both could be false, so neither statement can be derived.

Addition states that if you know a statement then you can infer its disjunction with another statement. A gives you A \/ B. This also requires knowing that A is true; A \/ B then holds regardless of the value of B.

Propositional logic is the use of propositions or statements along with formal logic to reach a conclusion.

The propositions are called the hypotheses. They are composed of statements which could be a simple statement or the product of a previous comparison. When inferring a conclusion the hypotheses are assumed to be known.

The propositions lead to a conclusion based on their boolean values. Use a truth table to show all possible combinations of propositions. From there infer the conclusion based on the truth values of the table.

This can be represented as a series of conjoined propositions that imply a conclusion: P(1) /\ P(2) /\ P(3) /\ … /\ P(n) -> Q. This is called a proof sequence, where the P’s are wffs or the results of logic applied to earlier wffs. Given the boolean values of the wffs in the P’s, you determine when Q is true.
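The validity of such an argument can be checked by brute force: the argument is valid exactly when P(1) /\ … /\ P(n) -> Q is a tautology, i.e. no assignment makes every hypothesis true while the conclusion is false. A sketch (the `valid_argument` helper is my own):

```python
from itertools import product

def valid_argument(hypotheses, conclusion, n):
    """Valid when P1 /\ ... /\ Pn -> Q is a tautology over n statements."""
    for vals in product([True, False], repeat=n):
        if all(h(*vals) for h in hypotheses) and not conclusion(*vals):
            return False  # counterexample: all hypotheses true, conclusion false
    return True

# Modus ponens as a proof sequence: A, A -> B, therefore B.
hyps = [lambda a, b: a, lambda a, b: (not a) or b]
print(valid_argument(hyps, lambda a, b: b, 2))  # True
```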

Book Club

Algorithms to Live By: The Computer Science of Human Decisions

Brian Christian and Tom Griffiths

Each chapter in this book looks at a different algorithm or problem that algorithms were designed to solve. Chapter one looks at optimal stopping. In it the authors discuss the secretary problem: how to know when to stop looking for a new secretary, with the limitation that once a candidate has been interviewed you have to make a choice or they will not come back. They look at how this problem and its solutions apply to areas of our lives from looking for a new apartment to dating. Chapter two explores the explore/exploit problem, then chapter three gets into sorting. They spend some time talking about Big O notation and how sorting does not always improve searching.

Tricks of the Trade

Programming is vs. Programming does. What programming is is boring once you have mastery. Once you start talking about what programming DOES, then you can maintain your sense of wonder with the whole thing. That is how you survive for 20 years.

