Introduction to digital logic


Boolean algebra (named after George Boole, who developed it in the mid-1800s) is the mathematical study of statements that are true or false. Nearly a century later, Claude Shannon realized the same ideas could be used to build electronic circuits that compute things in binary.

Boolean algebra is similar to ordinary algebra: we can write down equations and solve them (or prove they are right or wrong). However, instead of numbers, Boolean equations work on the "numbers" ⊤ or T (true) and ⊥ or F (false). In binary these are also written as 1 and 0, and we'll use that notation because this is a wiki about calculating things in binary. Instead of +, -, ×, ÷ and so on, the operations you can do on these values are ∧, ∨ and ¬. Let's take a look at them.

∧ is the "AND" operator (or, by its fancier name, "conjunction"). 1∧1=1, but 0∧1=0, 1∧0=0 and 0∧0=0. A AND B is true only if A is true and B is true. Makes sense, right? Since each side can only be true or false, there are only four possible combinations, and the rule is easy to remember: there's no need to memorize AND tables the way you had to memorize times tables at school. You do have to remember the symbol, though: ∧ looks a bit like the A in AND.

∨ is the "OR" operator (fancy name: "disjunction"). It's true if either side is true, even if both sides are true. So 1∨1=1, 1∨0=1, 0∨1=1, and 0∨0=0. To remember the symbol: it's ∧ turned upside down, just as OR is the counterpart of AND.

¬ is the "NOT" operator (fancy name: "complement" or "inversion" or "negation"). It's true if the thing that comes after it is false. ¬0=1 and ¬1=0.

Also common in digital logic (more so than in mathematical logic) is the ⊕ "XOR" ("exclusive OR") operator. It's true if one side is true but not both.

Here's what each operator gives you for every combination of values of x and y:

x  y | x ∧ y (AND) | x ∨ y (OR) | x ⊕ y (XOR)
0  0 |      0      |     0      |      0
1  0 |      0      |     1      |      1
0  1 |      0      |     1      |      1
1  1 |      1      |     1      |      0

x | ¬x (NOT)
0 |  1
1 |  0
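
If you'd rather experiment than memorize, here's a minimal Python sketch that reproduces the tables above (the names AND, OR, XOR and NOT are just made up for this page; Python's bitwise operators on the integers 0 and 1 do the real work):

 # Each Boolean operator as a tiny Python function on the bits 0 and 1:
 # & is AND, | is OR, ^ is XOR, and 1 - x flips a bit, which is NOT.
 def AND(x, y): return x & y
 def OR(x, y):  return x | y
 def XOR(x, y): return x ^ y
 def NOT(x):    return 1 - x
 
 print("x y | AND OR XOR")
 for x in (0, 1):
     for y in (0, 1):
         print(f"{x} {y} |  {AND(x, y)}   {OR(x, y)}   {XOR(x, y)}")
 
 print("x | NOT")
 for x in (0, 1):
     print(f"{x} |  {NOT(x)}")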

Just like normal maths, we can write more complicated equations and work out their value. For example, 1∧((0∧1)∨¬(1∧¬1)) is 1: working from the inside out, 0∧1 = 0 and 1∧¬1 = 1∧0 = 0, so ¬(1∧¬1) = 1, then 0∨1 = 1, and finally 1∧1 = 1.
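
You can check this mechanically, too. Here's a one-line Python version of the same expression, writing ∧ as &, ∨ as |, and ¬x as 1 - x:

 # 1 ∧ ((0 ∧ 1) ∨ ¬(1 ∧ ¬1))
 print(1 & ((0 & 1) | (1 - (1 & (1 - 1)))))   # prints 1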

Like in normal algebra, we use letters to stand for things we don't know. So if we have a∧b we can't tell if it's 0 or 1 until we know whether a stands for 0 or 1, and whether b stands for 0 or 1. But we can tell that a∧0 is always 0, no matter what a stands for, and we can simplify equations like this. Even if a stands for another really complicated equation, we know that a∧0 is 0.

If we see ((a∧b)∨(c∧¬d∧0)), we know it's the same as just a∧b: the second part is always 0, because x∧0 is 0 no matter what x is, and then x∨0 is x no matter what x is.
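
Since each letter can only stand for 0 or 1, a claim like this can also be checked by brute force. Here's a short Python sketch that tries all 16 combinations of a, b, c and d and confirms that both sides always agree (again writing ¬x as 1 - x):

 from itertools import product
 
 # Check that ((a ∧ b) ∨ (c ∧ ¬d ∧ 0)) always equals a ∧ b.
 for a, b, c, d in product((0, 1), repeat=4):
     left  = (a & b) | (c & (1 - d) & 0)
     right = a & b
     assert left == right
 
 print("the simplification holds for all 16 input combinations")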

(TODO: include a list of simplifying rules without just copying Wikipedia?)

Digital logic lets us build machines that calculate Boolean equations. Each operator becomes a little circuit, which is called a *logic gate*, and we can wire them together to calculate complicated equations. We don't draw how each gate is made, but just which ones are used and how they're wired together. Each Boolean operator also has a logic gate symbol. We can make a logic circuit so that if we tell it what a, b, c and d stand for, it will tell us what ((a∧b)∨¬(¬a∧c∧¬d)) is.

(insert a diagram, or even better, an interactive simulation, of a circuit that does this)
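
Until a diagram is in place, a rough software stand-in can show the idea. In this Python sketch each gate becomes one bitwise operation, and the function returns what the circuit's output wire would carry for given values of a, b, c and d (this is only an illustration of the behaviour, not how real hardware is described):

 from itertools import product
 
 # ((a ∧ b) ∨ ¬(¬a ∧ c ∧ ¬d)) built from AND, OR and NOT steps.
 def circuit(a, b, c, d):
     and_gate  = a & b                   # a ∧ b
     inner_and = (1 - a) & c & (1 - d)   # ¬a ∧ c ∧ ¬d
     return and_gate | (1 - inner_and)   # OR of the AND gate and the inverted inner AND
 
 # Print the circuit's full truth table.
 for a, b, c, d in product((0, 1), repeat=4):
     print(a, b, c, d, "->", circuit(a, b, c, d))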

We can make circuits that compute useful things. We can make a simple calculator: if we have a number a and a number b, each of which is 0, 1 or 2, we can write a set of equations that calculates their sum, and make a machine that calculates them. Each number comes in on one wire per possible value, so a0 is 1 exactly when a is 0, a1 is 1 exactly when a is 1, and so on; the out wires work the same way.

out0 = a0 ∧ b0
out1 = (a1 ∧ b0) ∨ (a0 ∧ b1)
out2 = (a2 ∧ b0) ∨ (a1 ∧ b1) ∨ (a0 ∧ b2)
etc...

(insert a diagram, or even better, an interactive simulation, of a circuit that does this)
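
In the meantime, here's a Python sketch of the same machine. Each number arrives on three wires (a0, a1, a2), one per possible value; the out3 and out4 lines are my continuation of the pattern that the "etc" above hints at, since two numbers of at most 2 can sum to 4:

 from itertools import product
 
 # Encode a number that is 0, 1 or 2 as three wires: wire i is 1 exactly when the number is i.
 def encode(n):
     return tuple(1 if n == i else 0 for i in range(3))
 
 def add_circuit(a0, a1, a2, b0, b1, b2):
     out0 = a0 & b0
     out1 = (a1 & b0) | (a0 & b1)
     out2 = (a2 & b0) | (a1 & b1) | (a0 & b2)
     out3 = (a2 & b1) | (a1 & b2)   # continuing the pattern (assumed)
     out4 = a2 & b2                 # continuing the pattern (assumed)
     return (out0, out1, out2, out3, out4)
 
 for a, b in product(range(3), repeat=2):
     outs = add_circuit(*encode(a), *encode(b))
     # Exactly one output wire is 1; its position is the answer.
     print(a, "+", b, "=", outs.index(1))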

Of course, as you'll see later on, it's better to calculate with binary numbers.