In mathematics and mathematical logic, Boolean algebra is a branch of algebra. It differs from elementary algebra in two ways. First, the values of the variables are the truth values true and false, usually denoted 1 and 0, whereas in elementary algebra the values of the variables are numbers. Second, Boolean algebra uses logical operators such as conjunction (and) denoted as ∧, disjunction (or) denoted as ∨, and negation (not) denoted as ¬. Elementary algebra, on the other hand, uses arithmetic operators such as addition, multiplication, subtraction, and division. Boolean algebra is therefore a formal way of describing logical operations in the same way that elementary algebra describes numerical operations.
Boolean algebra was introduced by George Boole in his first book The Mathematical Analysis of Logic (1847),[1] and set forth more fully in his An Investigation of the Laws of Thought (1854).[2] According to Huntington, the term Boolean algebra was first suggested by Henry M. Sheffer in 1913,[3] although Charles Sanders Peirce gave the title "A Boolian [sic] Algebra with One Constant" to the first chapter of his "The Simplest Mathematics" in 1880.[4] Boolean algebra has been fundamental in the development of digital electronics, and is provided for in all modern programming languages. It is also used in set theory and statistics.[5]
History
A precursor of Boolean algebra was Gottfried Wilhelm Leibniz's algebra of concepts. The usage of binary in relation to the I Ching was central to Leibniz's characteristica universalis, and it eventually created the foundations for his algebra of concepts.[6] Leibniz's algebra of concepts is deductively equivalent to the Boolean algebra of sets.[7]
Boole's algebra predated the modern developments in abstract algebra and mathematical logic; it is however seen as connected to the origins of both fields.[8] In an abstract setting, Boolean algebra was perfected in the late 19th century by Jevons, Schröder, Huntington and others, until it reached the modern conception of an (abstract) mathematical structure.[8] For example, the empirical observation that one can manipulate expressions in the algebra of sets, by translating them into expressions in Boole's algebra, is explained in modern terms by saying that the algebra of sets is a Boolean algebra (note the indefinite article). In fact, M. H. Stone proved in 1936 that every Boolean algebra is isomorphic to a field of sets.
In the 1930s, while studying switching circuits, Claude Shannon observed that one could also apply the rules of Boole's algebra in this setting,[9] and he introduced switching algebra as a way to analyze and design circuits by algebraic means in terms of logic gates. Shannon already had at his disposal the abstract mathematical apparatus, thus he cast his switching algebra as the two-element Boolean algebra. In modern circuit engineering settings, there is little need to consider other Boolean algebras, thus "switching algebra" and "Boolean algebra" are often used interchangeably.[10][11][12]
Efficient implementation of Boolean functions is a fundamental problem in the design of combinational logic circuits. Modern electronic design automation tools for very-large-scale integration (VLSI) circuits often rely on an efficient representation of Boolean functions known as (reduced ordered) binary decision diagrams (BDD) for logic synthesis and formal verification.[13]
Logic sentences that can be expressed in classical propositional calculus have an equivalent expression in Boolean algebra. Thus, Boolean logic is sometimes used to denote propositional calculus performed in this way.[14][15][16] Boolean algebra is not sufficient to capture logic formulas using quantifiers, like those from first-order logic.
Although the development of mathematical logic did not follow Boole's program, the connection between his algebra and logic was later put on firm ground in the setting of algebraic logic, which also studies the algebraic systems of many other logics.[8] The problem of determining whether the variables of a given Boolean (propositional) formula can be assigned in such a way as to make the formula evaluate to true is called the Boolean satisfiability problem (SAT), and is of importance to theoretical computer science, being the first problem shown to be NP-complete. The closely related model of computation known as a Boolean circuit relates time complexity (of an algorithm) to circuit complexity.
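As a miniature illustration of the satisfiability problem, the sketch below (a purely illustrative Python example; the formula and names are not from any source cited here) checks a small propositional formula by brute force over all assignments, which is exactly the exponential search that makes SAT hard in general.

```python
from itertools import product

def satisfiable(formula, variables):
    """Return a satisfying assignment for the Boolean formula, or None.

    `formula` is any callable taking truth values; we try all 2**n assignments.
    """
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(**assignment):
            return assignment
    return None

# (x or y) and (not x or y) and (not y or x) -- an illustrative formula
print(satisfiable(lambda x, y: (x or y) and (not x or y) and (not y or x), ["x", "y"]))
```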
Values
Whereas expressions denote mainly numbers in elementary algebra, in Boolean algebra, they denote the truth values false and true. These values are represented with the bits 0 and 1. They do not behave like the integers 0 and 1, for which 1 + 1 = 2, but may be identified with the elements of the two-element field GF(2), that is, integer arithmetic modulo 2, for which 1 + 1 = 0. Addition and multiplication then play the Boolean roles of XOR (exclusive-or) and AND (conjunction), respectively, with disjunction x ∨ y (inclusive-or) definable as x + y − xy and negation ¬x as 1 − x. In GF(2), − may be replaced by +, since they denote the same operation; however, this way of writing Boolean operations allows applying the usual arithmetic operations of integers (this may be useful when using a programming language in which GF(2) is not implemented).
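The identification just described can be seen in a few lines of code; the following is an illustrative sketch (not part of the original text) in which multiplication plays the role of AND, addition modulo 2 the role of XOR, and x ∨ y and ¬x are written arithmetically.

```python
# Boolean operations written with ordinary integer arithmetic on {0, 1}.
def AND(x, y): return x * y            # conjunction
def XOR(x, y): return (x + y) % 2      # addition in GF(2)
def OR(x, y):  return x + y - x * y    # inclusive or
def NOT(x):    return 1 - x            # negation

for x in (0, 1):
    for y in (0, 1):
        print(x, y, AND(x, y), OR(x, y), XOR(x, y), NOT(x))
```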
Boolean algebra also deals with functions which have their values in the set {0, 1}. A sequence of bits is a commonly used example of such a function. Another common example is the totality of subsets of a set E: to a subset F of E, one can define the indicator function that takes the value 1 on F, and 0 outside F. The most general example is the set of elements of a Boolean algebra, with all of the foregoing being instances thereof.
As with elementary algebra, the purely equational part of the theory may be developed, without considering explicit values for the variables.[17]
Operations
Basic operations
While elementary algebra has four operations (addition, subtraction, multiplication, and division), Boolean algebra has only three basic operations: conjunction, disjunction, and negation, expressed with the corresponding binary operators AND (∧) and OR (∨) and the unary operator NOT (¬), collectively referred to as Boolean operators.[18] Variables in Boolean algebra that store the logical values 0 and 1 are called Boolean variables; they are used to store either true or false values.[19] The basic operations on Boolean variables x and y are defined as follows:
- x ∧ y = 1 if x = 1 and y = 1, and x ∧ y = 0 otherwise
- x ∨ y = 0 if x = 0 and y = 0, and x ∨ y = 1 otherwise
- ¬x = 1 if x = 0, and ¬x = 0 if x = 1
Alternatively, the values of x ∧ y, x ∨ y, and ¬x can be expressed by tabulating their values with truth tables as follows:[20]
x | y | x ∧ y | x ∨ y
0 | 0 | 0 | 0
1 | 0 | 0 | 1
0 | 1 | 0 | 1
1 | 1 | 1 | 1

x | ¬x
0 | 1
1 | 0
When used in expressions, the operators are applied according to precedence rules. As with elementary algebra, expressions in parentheses are evaluated first.[21]
If the truth values 0 and 1 are interpreted as integers, these operations may be expressed with the ordinary operations of arithmetic (where x + y uses addition and xy uses multiplication), or by the minimum/maximum functions: x ∧ y = xy = min(x, y), x ∨ y = x + y − xy = max(x, y), and ¬x = 1 − x.
One might consider that only negation and one of the two other operations are basic because of the following identities that allow one to define conjunction in terms of negation and the disjunction, and vice versa (De Morgan's laws):[22] x ∧ y = ¬(¬x ∨ ¬y) and x ∨ y = ¬(¬x ∧ ¬y).
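The following illustrative check (an assumption-free brute-force sketch, not part of the original article) confirms, by exhausting the four possible inputs, that conjunction can be defined from negation and disjunction and vice versa, and that ∧ and ∨ coincide with min and max on {0, 1}.

```python
def NOT(x): return 1 - x
def AND(x, y): return min(x, y)
def OR(x, y): return max(x, y)

for x in (0, 1):
    for y in (0, 1):
        # De Morgan-style definitions of each operation from the other
        assert AND(x, y) == NOT(OR(NOT(x), NOT(y)))
        assert OR(x, y) == NOT(AND(NOT(x), NOT(y)))
print("conjunction and disjunction are interdefinable via negation")
```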
Secondary operations
Operations composed from the basic operations include, among others, the following:
Material conditional: x → y = ¬x ∨ y
Material biconditional: x ≡ y = (x ∧ y) ∨ (¬x ∧ ¬y)
Exclusive OR (XOR): x ⊕ y = (x ∨ y) ∧ ¬(x ∧ y)
These definitions give rise to the following truth tables giving the values of these operations for all four possible inputs.
Secondary operations, Table 1:
x | y | x → y | x ⊕ y | x ≡ y
0 | 0 | 1 | 0 | 1
1 | 0 | 0 | 1 | 0
0 | 1 | 1 | 1 | 0
1 | 1 | 1 | 0 | 1
- Material conditional
- The first operation, x → y, or Cxy, is called material implication. If x is true, then the result of expression x → y is taken to be that of y (e.g. if x is true and y is false, then x → y is also false). But if x is false, then the value of y can be ignored; however, the operation must return some Boolean value and there are only two choices. So by definition, x → y is true when x is false. (Relevance logic suggests this definition, by viewing an implication with a false premise as something other than either true or false.)
- Exclusive OR (XOR)
- The second operation, x ⊕ y, or Jxy, is called exclusive or (often abbreviated as XOR) to distinguish it from disjunction as the inclusive kind. It excludes the possibility of both x and y being true (e.g. see table): if both are true then the result is false. Defined in terms of arithmetic, it is addition modulo 2, where 1 + 1 = 0.
- Logical equivalence
- The third operation, the complement of exclusive or, is equivalence or Boolean equality: x ≡ y, or Exy, is true just when x and y have the same value. Hence x ⊕ y as its complement can be understood as x ≠ y, being true just when x and y are different. Thus, its counterpart in arithmetic mod 2 is x + y. Equivalence's counterpart in arithmetic mod 2 is x + y + 1.
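A short illustrative sketch (the function names are ad hoc, not from the article) that tabulates the three secondary operations by computing them from the basic ones, reproducing the values in Table 1.

```python
def IMPLIES(x, y): return (1 - x) | y        # x -> y  =  (not x) or y
def XOR(x, y):     return x ^ y              # exclusive or
def EQUIV(x, y):   return 1 - (x ^ y)        # complement of XOR

print("x y  x->y  x XOR y  x EQUIV y")
for x in (0, 1):
    for y in (0, 1):
        print(x, y, IMPLIES(x, y), XOR(x, y), EQUIV(x, y))
```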
Laws
A law of Boolean algebra is an identity such as x ∨ (y ∨ z) = (x ∨ y) ∨ z between two Boolean terms, where a Boolean term is defined as an expression built up from variables and the constants 0 and 1 using the operations ∧, ∨, and ¬. The concept can be extended to terms involving other Boolean operations such as ⊕, →, and ≡, but such extensions are unnecessary for the purposes to which the laws are put. Such purposes include the definition of a Boolean algebra as any model of the Boolean laws, and as a means for deriving new laws from old as in the derivation of x ∨ (y ∧ z) = x ∨ (z ∧ y) from y ∧ z = z ∧ y (as treated in § Axiomatizing Boolean algebra).
Monotone laws
Boolean algebra satisfies many of the same laws as ordinary algebra when one matches up ∨ with addition and ∧ with multiplication. In particular the following laws are common to both kinds of algebra:[23][24]
Associativity of ∨: x ∨ (y ∨ z) = (x ∨ y) ∨ z
Associativity of ∧: x ∧ (y ∧ z) = (x ∧ y) ∧ z
Commutativity of ∨: x ∨ y = y ∨ x
Commutativity of ∧: x ∧ y = y ∧ x
Distributivity of ∧ over ∨: x ∧ (y ∨ z) = (x ∧ y) ∨ (x ∧ z)
Identity for ∨: x ∨ 0 = x
Identity for ∧: x ∧ 1 = x
Annihilator for ∧: x ∧ 0 = 0
The following laws hold in Boolean algebra, but not in ordinary algebra:
Annihilator for ∨: x ∨ 1 = 1
Idempotence of ∨: x ∨ x = x
Idempotence of ∧: x ∧ x = x
Absorption 1: x ∧ (x ∨ y) = x
Absorption 2: x ∨ (x ∧ y) = x
Distributivity of ∨ over ∧: x ∨ (y ∧ z) = (x ∨ y) ∧ (x ∨ z)
Taking x = 2 in the third law above shows that it is not an ordinary algebra law, since 2 × 2 = 4. The remaining five laws can be falsified in ordinary algebra by taking all variables to be 1. For example, in absorption law 1, the left hand side would be 1(1 + 1) = 2, while the right hand side would be 1 (and so on).
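Because the Boolean domain has only two elements, each law above can be verified by brute force over all assignments; the following is an illustrative sketch of such a check (the selection of laws is arbitrary, not exhaustive).

```python
from itertools import product

laws = {
    "associativity of or":            lambda x, y, z: (x | (y | z)) == ((x | y) | z),
    "distributivity of and over or":  lambda x, y, z: (x & (y | z)) == ((x & y) | (x & z)),
    "annihilator for or":             lambda x, y, z: (x | 1) == 1,
    "idempotence of and":             lambda x, y, z: (x & x) == x,
    "absorption 1":                   lambda x, y, z: (x & (x | y)) == x,
    "distributivity of or over and":  lambda x, y, z: (x | (y & z)) == ((x | y) & (x | z)),
}

for name, law in laws.items():
    # exhaust all eight assignments of 0/1 to the three variables
    assert all(law(x, y, z) for x, y, z in product((0, 1), repeat=3))
    print(name, "holds on {0, 1}")
```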
All of the laws treated thus far have been for conjunction and disjunction. These operations have the property that changing either argument either leaves the output unchanged, or the output changes in the same way as the input. Equivalently, changing any variable from 0 to 1 never results in the output changing from 1 to 0. Operations with this property are said to be monotone. Thus the axioms thus far have all been for monotonic Boolean logic. Nonmonotonicity enters via complement ¬ as follows.[5]
Nonmonotone laws
The complement operation is defined by the following two laws.
Complementation 1: x ∧ ¬x = 0
Complementation 2: x ∨ ¬x = 1
All properties of negation including the laws below follow from the above two laws alone.[5]
In both ordinary and Boolean algebra, negation works by exchanging pairs of elements, hence in both algebras it satisfies the double negation law (also called involution law): ¬(¬x) = x.
But whereas ordinary algebra satisfies the two laws (−x)(−y) = xy and (−x) + (−y) = −(x + y),
Boolean algebra satisfies De Morgan's laws: ¬x ∧ ¬y = ¬(x ∨ y) and ¬x ∨ ¬y = ¬(x ∧ y).
Completeness
The laws listed above define Boolean algebra, in the sense that they entail the rest of the subject. The laws complementation 1 and 2, together with the monotone laws, suffice for this purpose and can therefore be taken as one possible complete set of laws or axiomatization of Boolean algebra. Every law of Boolean algebra follows logically from these axioms. Furthermore, Boolean algebras can then be defined as the models of these axioms as treated in § Boolean algebras.
Writing down further laws of Boolean algebra cannot give rise to any new consequences of these axioms, nor can it rule out any model of them. In contrast, in a list of some but not all of the same laws, there could have been Boolean laws that did not follow from those on the list, and moreover there would have been models of the listed laws that were not Boolean algebras.
This axiomatization is by no means the only one, or even necessarily the most natural given that attention was not paid as to whether some of the axioms followed from others, but there was simply a choice to stop when enough laws had been noticed, treated further in § Axiomatizing Boolean algebra. Or the intermediate notion of axiom can be sidestepped altogether by defining a Boolean law directly as any tautology, understood as an equation that holds for all values of its variables over 0 and 1.[25][26] All these definitions of Boolean algebra can be shown to be equivalent.
Duality principle
Principle: If {X, R} is a partially ordered set, then {X, R(inverse)} is also a partially ordered set.
There is nothing special about the choice of symbols for the values of Boolean algebra. 0 and 1 could be renamed to α and β, and as long as it was done consistently throughout, it would still be Boolean algebra, albeit with some obvious cosmetic differences.
But suppose 0 and 1 were renamed 1 and 0 respectively. Then it would still be Boolean algebra, and moreover operating on the same values. However, it would not be identical to our original Boolean algebra because now ∨ behaves the way ∧ used to do and vice versa. So there are still some cosmetic differences to show that the notation has been changed, despite the fact that 0s and 1s are still being used.
But if in addition to interchanging the names of the values, the names of the two binary operations are also interchanged, now there is no trace of what was done. The end product is completely indistinguishable from what was started with. The columns for x ∧ y and x ∨ y in the truth tables have changed places, but that switch is immaterial.
When values and operations can be paired up in a way that leaves everything important unchanged when all pairs are switched simultaneously, the members of each pair are called dual to each other. Thus 0 and 1 are dual, and ∧ and ∨ are dual. The duality principle, also called De Morgan duality, asserts that Boolean algebra is unchanged when all dual pairs are interchanged.
One change that does not need to be made as part of this interchange is complementation. Complement is a self-dual operation. The identity or do-nothing operation x (copy the input to the output) is also self-dual. A more complicated example of a self-dual operation is (x ∧ y) ∨ (y ∧ z) ∨ (z ∧ x). There is no self-dual binary operation that depends on both its arguments. A composition of self-dual operations is a self-dual operation. For example, if f(x, y, z) = (x ∧ y) ∨ (y ∧ z) ∨ (z ∧ x), then f(f(x, y, z), x, t) is a self-dual operation of four arguments x, y, z, t.
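The self-duality test described above can be checked mechanically: an operation is self-dual when complementing all inputs and the output leaves it unchanged. The illustrative sketch below applies that test to the three-argument example and to the four-argument composition.

```python
from itertools import product

def maj(x, y, z):
    # the self-dual operation (x and y) or (y and z) or (z and x)
    return (x & y) | (y & z) | (z & x)

def is_self_dual(f, arity):
    # f is self-dual iff f(args) equals the complement of f applied to the complemented args
    return all(
        f(*args) == 1 - f(*(1 - a for a in args))
        for args in product((0, 1), repeat=arity)
    )

print(is_self_dual(maj, 3))                                          # True
print(is_self_dual(lambda x, y, z, t: maj(maj(x, y, z), x, t), 4))   # True, a composition
```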
The principle of duality can be explained from a group theory perspective by the fact that there are exactly four functions that are one-to-one mappings (automorphisms) of the set of Boolean polynomials back to itself: the identity function, the complement function, the dual function and the contradual function (complemented dual). These four functions form a group under function composition, isomorphic to the Klein four-group, acting on the set of Boolean polynomials. Walter Gottschalk remarked that consequently a more appropriate name for the phenomenon would be the principle (or square) of quaternality.[5]: 21–22
Diagrammatic representations
editVenn diagrams
A Venn diagram[27] can be used as a representation of a Boolean operation using shaded overlapping regions. There is one region for each variable, all circular in the examples here. The interior and exterior of region x corresponds respectively to the values 1 (true) and 0 (false) for variable x. The shading indicates the value of the operation for each combination of regions, with dark denoting 1 and light 0 (some authors use the opposite convention).
The three Venn diagrams in the figure below represent respectively conjunction x ∧ y, disjunction x ∨ y, and complement ¬x.
For conjunction, the region inside both circles is shaded to indicate that x ∧ y is 1 when both variables are 1. The other regions are left unshaded to indicate that x ∧ y is 0 for the other three combinations.
The second diagram represents disjunction x ∨ y by shading those regions that lie inside either or both circles. The third diagram represents complement ¬x by shading the region not inside the circle.
While we have not shown the Venn diagrams for the constants 0 and 1, they are trivial, being respectively a white box and a dark box, neither one containing a circle. However, we could put a circle for x in those boxes, in which case each would denote a function of one argument, x, which returns the same value independently of x, called a constant function. As far as their outputs are concerned, constants and constant functions are indistinguishable; the difference is that a constant takes no arguments, called a zeroary or nullary operation, while a constant function takes one argument, which it ignores, and is a unary operation.
Venn diagrams are helpful in visualizing laws. The commutativity laws for ∧ and ∨ can be seen from the symmetry of the diagrams: a binary operation that was not commutative would not have a symmetric diagram, because interchanging x and y would have the effect of reflecting the diagram horizontally, and any failure of commutativity would then appear as a failure of symmetry.
Idempotence of ∧ and ∨ can be visualized by sliding the two circles together and noting that the shaded area then becomes the whole circle, for both ∧ and ∨.
To see the first absorption law, x ∧ (x ∨ y) = x, start with the diagram in the middle for x ∨ y and note that the portion of the shaded area in common with the x circle is the whole of the x circle. For the second absorption law, x ∨ (x ∧ y) = x, start with the left diagram for x ∧ y and note that shading the whole of the x circle results in just the x circle being shaded, since the previous shading was inside the x circle.
The double negation law can be seen by complementing the shading in the third diagram for ¬x, which shades the x circle.
To visualize the first De Morgan's law, (¬x) ∧ (¬y) = ¬(x ∨ y), start with the middle diagram for x ∨ y and complement its shading so that only the region outside both circles is shaded, which is what the right hand side of the law describes. The result is the same as if we shaded that region which is both outside the x circle and outside the y circle, i.e. the conjunction of their exteriors, which is what the left hand side of the law describes.
The second De Morgan's law, (¬x) ∨ (¬y) = ¬(x ∧ y), works the same way with the two diagrams interchanged.
The first complement law, x ∧ ¬x = 0, says that the interior and exterior of the x circle have no overlap. The second complement law, x ∨ ¬x = 1, says that everything is either inside or outside the x circle.
Digital logic gates
Digital logic is the application of the Boolean algebra of 0 and 1 to electronic hardware consisting of logic gates connected to form a circuit diagram. Each gate implements a Boolean operation, and is depicted schematically by a shape indicating the operation. The shapes associated with the gates for conjunction (AND-gates), disjunction (OR-gates), and complement (inverters) are as follows:[28]
The lines on the left of each gate represent input wires or ports. The value of the input is represented by a voltage on the lead. For so-called "active-high" logic, 0 is represented by a voltage close to zero or "ground", while 1 is represented by a voltage close to the supply voltage; active-low reverses this. The line on the right of each gate represents the output port, which normally follows the same voltage conventions as the input ports.
Complement is implemented with an inverter gate. The triangle denotes the operation that simply copies the input to the output; the small circle on the output denotes the actual inversion complementing the input. The convention of putting such a circle on any port means that the signal passing through this port is complemented on the way through, whether it is an input or output port.
The duality principle, or De Morgan's laws, can be understood as asserting that complementing all three ports of an AND gate converts it to an OR gate and vice versa, as shown in Figure 4 below. Complementing both ports of an inverter however leaves the operation unchanged.
More generally, one may complement any of the eight subsets of the three ports of either an AND or OR gate. The resulting sixteen possibilities give rise to only eight Boolean operations, namely those with an odd number of 1s in their truth table. There are eight such because the "odd-bit-out" can be either 0 or 1 and can go in any of four positions in the truth table. There being sixteen binary Boolean operations, this must leave eight operations with an even number of 1s in their truth tables. Two of these are the constants 0 and 1 (as binary operations that ignore both their inputs); four are the operations that depend nontrivially on exactly one of their two inputs, namely x, y, ¬x, and ¬y; and the remaining two are x ⊕ y (XOR) and its complement x ≡ y.
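The counting argument above can be checked by enumerating all sixteen binary Boolean operations by their four-row truth tables, as in this illustrative sketch: eight tables have an odd number of 1s and eight an even number.

```python
from itertools import product

inputs = list(product((0, 1), repeat=2))        # (0,0), (0,1), (1,0), (1,1)
tables = list(product((0, 1), repeat=4))        # all 16 possible output columns

odd  = [t for t in tables if sum(t) % 2 == 1]
even = [t for t in tables if sum(t) % 2 == 0]
print(len(odd), len(even))                      # 8 8

# The even group contains the two constants, the four operations depending
# on only one input (x, y, not x, not y), XOR, and its complement.
xor = tuple(x ^ y for x, y in inputs)
print(xor in even)                              # True
```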
Boolean algebras
editThe term "algebra" denotes both a subject, namely the subject ofalgebra,and an object, namely analgebraic structure.Whereas the foregoing has addressed the subject of Boolean algebra, this section deals with mathematical objects called Boolean algebras, defined in full generality as any model of the Boolean laws. We begin with a special case of the notion definable without reference to the laws, namely concrete Boolean algebras, and then givethe formal definitionof the general notion.
Concrete Boolean algebras
A concrete Boolean algebra or field of sets is any nonempty set of subsets of a given set X closed under the set operations of union, intersection, and complement relative to X.[5]
(Historically X itself was required to be nonempty as well to exclude the degenerate or one-element Boolean algebra, which is the one exception to the rule that all Boolean algebras satisfy the same equations, since the degenerate algebra satisfies every equation. However, this exclusion conflicts with the preferred purely equational definition of "Boolean algebra", there being no way to rule out the one-element algebra using only equations; 0 ≠ 1 does not count, being a negated equation. Hence modern authors allow the degenerate Boolean algebra and let X be empty.)
Example 1. The power set 2^X of X, consisting of all subsets of X. Here X may be any set: empty, finite, infinite, or even uncountable.
Example 2. The empty set and X. This two-element algebra shows that a concrete Boolean algebra can be finite even when it consists of subsets of an infinite set. It can be seen that every field of subsets of X must contain the empty set and X. Hence no smaller example is possible, other than the degenerate algebra obtained by taking X to be empty so as to make the empty set and X coincide.
Example 3. The set of finite and cofinite sets of integers, where a cofinite set is one omitting only finitely many integers. This is clearly closed under complement, and is closed under union because the union of a cofinite set with any set is cofinite, while the union of two finite sets is finite. Intersection behaves like union with "finite" and "cofinite" interchanged. This example is countably infinite because there are only countably many finite sets of integers.
Example 4. For a less trivial example of the point made by example 2, consider a Venn diagram formed by n closed curves partitioning the diagram into 2^n regions, and let X be the (infinite) set of all points in the plane not on any curve but somewhere within the diagram. The interior of each region is thus an infinite subset of X, and every point in X is in exactly one region. Then the set of all 2^(2^n) possible unions of regions (including the empty set obtained as the union of the empty set of regions and X obtained as the union of all 2^n regions) is closed under union, intersection, and complement relative to X and therefore forms a concrete Boolean algebra. Again, there are finitely many subsets of an infinite set forming a concrete Boolean algebra, with example 2 arising as the case n = 0 of no curves.
Subsets as bit vectors
A subset Y of X can be identified with an indexed family of bits with index set X, with the bit indexed by x ∈ X being 1 or 0 according to whether or not x ∈ Y. (This is the so-called characteristic function notion of a subset.) For example, a 32-bit computer word consists of 32 bits indexed by the set {0, 1, 2, ..., 31}, with 0 and 31 indexing the low and high order bits respectively. For a smaller example, if X = {a, b, c} where a, b, c are viewed as bit positions in that order from left to right, the eight subsets {}, {c}, {b}, {b,c}, {a}, {a,c}, {a,b}, and {a,b,c} of X can be identified with the respective bit vectors 000, 001, 010, 011, 100, 101, 110, and 111. Bit vectors indexed by the set of natural numbers are infinite sequences of bits, while those indexed by the reals in the unit interval [0,1] are packed too densely to be able to write conventionally but nonetheless form well-defined indexed families (imagine coloring every point of the interval [0,1] either black or white independently; the black points then form an arbitrary subset of [0,1]).
From this bit vector viewpoint, a concrete Boolean algebra can be defined equivalently as a nonempty set of bit vectors all of the same length (more generally, indexed by the same set) and closed under the bit vector operations of bitwise ∧, ∨, and ¬, as in 1010 ∧ 0110 = 0010, 1010 ∨ 0110 = 1110, and ¬1010 = 0101, the bit vector realizations of intersection, union, and complement respectively.
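An illustrative sketch of this identification for X = {a, b, c} (the helper functions are made up for the example): set union, intersection, and complement become bitwise OR, AND, and NOT on three-bit integers.

```python
X = ["a", "b", "c"]                     # bit positions, left to right

def to_bits(subset):
    return sum(1 << (len(X) - 1 - i) for i, e in enumerate(X) if e in subset)

def to_set(bits):
    return {e for i, e in enumerate(X) if bits >> (len(X) - 1 - i) & 1}

A, B = to_bits({"a", "c"}), to_bits({"b", "c"})       # 101 and 011
print(format(A | B, "03b"), to_set(A | B))            # union        -> 111 {'a','b','c'}
print(format(A & B, "03b"), to_set(A & B))            # intersection -> 001 {'c'}
print(format(~A & 0b111, "03b"), to_set(~A & 0b111))  # complement   -> 010 {'b'}
```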
Prototypical Boolean algebra
The set {0, 1} and its Boolean operations as treated above can be understood as the special case of bit vectors of length one, which by the identification of bit vectors with subsets can also be understood as the two subsets of a one-element set. This is called the prototypical Boolean algebra, justified by the following observation.
- The laws satisfied by all nondegenerate concrete Boolean algebras coincide with those satisfied by the prototypical Boolean algebra.
This observation is proved as follows. Certainly any law satisfied by all concrete Boolean algebras is satisfied by the prototypical one since it is concrete. Conversely any law that fails for some concrete Boolean algebra must have failed at a particular bit position, in which case that position by itself furnishes a one-bit counterexample to that law. Nondegeneracy ensures the existence of at least one bit position because there is only one empty bit vector.
The final goal of the next section can be understood as eliminating "concrete" from the above observation. That goal is reached via the stronger observation that, up to isomorphism, all Boolean algebras are concrete.
Boolean algebras: the definition
The Boolean algebras so far have all been concrete, consisting of bit vectors or equivalently of subsets of some set. Such a Boolean algebra consists of a set and operations on that set which can be shown to satisfy the laws of Boolean algebra.
Instead of showing that the Boolean laws are satisfied, we can instead postulate a set X, two binary operations on X, and one unary operation, and require that those operations satisfy the laws of Boolean algebra. The elements of X need not be bit vectors or subsets but can be anything at all. This leads to the more general abstract definition.
- A Boolean algebra is any set with binary operations ∧ and ∨ and a unary operation ¬ thereon satisfying the Boolean laws.[29]
For the purposes of this definition it is irrelevant how the operations came to satisfy the laws, whether by fiat or proof. All concrete Boolean algebras satisfy the laws (by proof rather than fiat), whence every concrete Boolean algebra is a Boolean algebra according to our definitions. This axiomatic definition of a Boolean algebra as a set and certain operations satisfying certain laws or axioms by fiat is entirely analogous to the abstract definitions of group, ring, field etc. characteristic of modern or abstract algebra.
Given any complete axiomatization of Boolean algebra, such as the axioms for a complemented distributive lattice, a sufficient condition for an algebraic structure of this kind to satisfy all the Boolean laws is that it satisfy just those axioms. The following is therefore an equivalent definition.
- A Boolean algebra is a complemented distributive lattice.
The section on axiomatization lists other axiomatizations, any of which can be made the basis of an equivalent definition.
Representable Boolean algebras
Although every concrete Boolean algebra is a Boolean algebra, not every Boolean algebra need be concrete. Let n be a square-free positive integer, one not divisible by the square of an integer, for example 30 but not 12. The operations of greatest common divisor, least common multiple, and division into n (that is, ¬x = n/x) can be shown to satisfy all the Boolean laws when their arguments range over the positive divisors of n. Hence those divisors form a Boolean algebra. These divisors are not subsets of a set, making the divisors of n a Boolean algebra that is not concrete according to our definitions.
However, if each divisor of n is represented by the set of its prime factors, this nonconcrete Boolean algebra is isomorphic to the concrete Boolean algebra consisting of all sets of prime factors of n, with union corresponding to least common multiple, intersection to greatest common divisor, and complement to division into n. So this example, while not technically concrete, is at least "morally" concrete via this representation, called an isomorphism. This example is an instance of the following notion.
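A sketch of the divisor example for n = 30 (an illustrative check, not part of the original article): lcm, gcd, and division into 30 play the roles of ∨, ∧, and ¬ on the divisors, and mapping each divisor to its set of prime factors exhibits the representation by sets.

```python
from math import gcd
from itertools import product

n = 30
divisors = [d for d in range(1, n + 1) if n % d == 0]   # 1, 2, 3, 5, 6, 10, 15, 30

def join(x, y): return x * y // gcd(x, y)   # least common multiple plays "or"
def meet(x, y): return gcd(x, y)            # greatest common divisor plays "and"
def comp(x):    return n // x               # division into n plays "not"

# spot-check a few Boolean laws over all divisors
for x, y in product(divisors, repeat=2):
    assert meet(x, comp(x)) == 1 and join(x, comp(x)) == n   # complementation
    assert meet(x, join(x, y)) == x                          # absorption

prime_factors = lambda d: {p for p in (2, 3, 5) if d % p == 0}
print({d: prime_factors(d) for d in divisors})               # the representing sets
```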
- A Boolean algebra is called representable when it is isomorphic to a concrete Boolean algebra.
The next question is answered positively as follows.
- Every Boolean algebra is representable.
That is, up to isomorphism, abstract and concrete Boolean algebras are the same thing. This result depends on the Boolean prime ideal theorem, a choice principle slightly weaker than the axiom of choice. This strong relationship implies a weaker result strengthening the observation in the previous subsection to the following easy consequence of representability.
- The laws satisfied by all Boolean algebras coincide with those satisfied by the prototypical Boolean algebra.
It is weaker in the sense that it does not of itself imply representability. Boolean algebras are special here; for example, a relation algebra is a Boolean algebra with additional structure, but it is not the case that every relation algebra is representable in the sense appropriate to relation algebras.
Axiomatizing Boolean algebra
The above definition of an abstract Boolean algebra as a set together with operations satisfying "the" Boolean laws raises the question of what those laws are. A simplistic answer is "all Boolean laws", which can be defined as all equations that hold for the Boolean algebra of 0 and 1. However, since there are infinitely many such laws, this is not a satisfactory answer in practice, leading to the question of whether it suffices to require only finitely many laws to hold.
In the case of Boolean algebras, the answer is "yes": the finitely many equations listed above are sufficient. Thus, Boolean algebra is said to be finitely axiomatizable or finitely based.
Moreover, the number of equations needed can be further reduced. To begin with, some of the above laws are implied by some of the others. A sufficient subset of the above laws consists of the pairs of associativity, commutativity, and absorption laws, distributivity of ∧ over ∨ (or the other distributivity law; one suffices), and the two complement laws. In fact, this is the traditional axiomatization of Boolean algebra as a complemented distributive lattice.
By introducing additional laws not listed above, it becomes possible to shorten the list of needed equations yet further; for instance, with the vertical bar representing the Sheffer stroke operation, a single axiom is sufficient to completely axiomatize Boolean algebra. It is also possible to find longer single axioms using more conventional operations; see Minimal axioms for Boolean algebra.[30]
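A related illustrative sketch (not drawn from the cited axiomatizations): every basic Boolean operation is expressible with the Sheffer stroke (NAND) alone, which is part of why a single axiom in that one operation can suffice.

```python
def nand(x, y): return 1 - (x & y)      # the Sheffer stroke

def NOT(x):    return nand(x, x)
def AND(x, y): return nand(nand(x, y), nand(x, y))
def OR(x, y):  return nand(nand(x, x), nand(y, y))

for x in (0, 1):
    for y in (0, 1):
        assert AND(x, y) == (x & y) and OR(x, y) == (x | y) and NOT(x) == 1 - x
print("and, or, not are all definable from NAND alone")
```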
Propositional logic
Propositional logic is a logical system that is intimately connected to Boolean algebra.[5] Many syntactic concepts of Boolean algebra carry over to propositional logic with only minor changes in notation and terminology, while the semantics of propositional logic are defined via Boolean algebras in a way that the tautologies (theorems) of propositional logic correspond to equational theorems of Boolean algebra.
Syntactically, every Boolean term corresponds to a propositional formula of propositional logic. In this translation between Boolean algebra and propositional logic, Boolean variables x, y, ... become propositional variables (or atoms) P, Q, ...; Boolean terms such as x ∨ y become propositional formulas P ∨ Q; 0 becomes false or ⊥, and 1 becomes true or ⊤. It is convenient when referring to generic propositions to use Greek letters Φ, Ψ, ... as metavariables (variables outside the language of propositional calculus, used when talking about propositional calculus) to denote propositions.
The semantics of propositional logic rely on truth assignments. The essential idea of a truth assignment is that the propositional variables are mapped to elements of a fixed Boolean algebra, and then the truth value of a propositional formula using these letters is the element of the Boolean algebra that is obtained by computing the value of the Boolean term corresponding to the formula. In classical semantics, only the two-element Boolean algebra is used, while in Boolean-valued semantics arbitrary Boolean algebras are considered. A tautology is a propositional formula that is assigned truth value 1 by every truth assignment of its propositional variables to an arbitrary Boolean algebra (or, equivalently, every truth assignment to the two-element Boolean algebra).
These semantics permit a translation between tautologies of propositional logic and equational theorems of Boolean algebra. Every tautology Φ of propositional logic can be expressed as the Boolean equation Φ = 1, which will be a theorem of Boolean algebra. Conversely, every theorem Φ = Ψ of Boolean algebra corresponds to the tautologies (Φ ∨ ¬Ψ) ∧ (¬Φ ∨ Ψ) and (Φ ∧ Ψ) ∨ (¬Φ ∧ ¬Ψ). If → is in the language, these last tautologies can also be written as (Φ → Ψ) ∧ (Ψ → Φ), or as two separate theorems Φ → Ψ and Ψ → Φ; if ≡ is available, then the single tautology Φ ≡ Ψ can be used.
Applications
One motivating application of propositional calculus is the analysis of propositions and deductive arguments in natural language.[31] Whereas the proposition "if x = 3, then x + 1 = 4" depends on the meanings of such symbols as + and 1, the proposition "if x = 3, then x = 3" does not; it is true merely by virtue of its structure, and remains true whether "x = 3" is replaced by "x = 4" or "the moon is made of green cheese". The generic or abstract form of this tautology is "if P, then P", or in the language of Boolean algebra, P → P.
Replacing P by x = 3 or any other proposition is called instantiation of P by that proposition. The result of instantiating P in an abstract proposition is called an instance of the proposition. Thus, x = 3 → x = 3 is a tautology by virtue of being an instance of the abstract tautology P → P. All occurrences of the instantiated variable must be instantiated with the same proposition, to avoid such nonsense as P → x = 3 or x = 3 → x = 4.
Propositional calculus restricts attention to abstract propositions, those built up from propositional variables using Boolean operations. Instantiation is still possible within propositional calculus, but only by instantiating propositional variables by abstract propositions, such as instantiating Q by Q → P in P → (Q → P) to yield the instance P → ((Q → P) → P).
(The availability of instantiation as part of the machinery of propositional calculus avoids the need for metavariables within the language of propositional calculus, since ordinary propositional variables can be considered within the language to denote arbitrary propositions. The metavariables themselves are outside the reach of instantiation, not being part of the language of propositional calculus but rather part of the same language for talking about it that this sentence is written in, where there is a need to be able to distinguish propositional variables and their instantiations as being distinct syntactic entities.)
Deductive systems for propositional logic
An axiomatization of propositional calculus is a set of tautologies called axioms and one or more inference rules for producing new tautologies from old. A proof in an axiom system A is a finite nonempty sequence of propositions each of which is either an instance of an axiom of A or follows by some rule of A from propositions appearing earlier in the proof (thereby disallowing circular reasoning). The last proposition is the theorem proved by the proof. Every nonempty initial segment of a proof is itself a proof, whence every proposition in a proof is itself a theorem. An axiomatization is sound when every theorem is a tautology, and complete when every tautology is a theorem.[32]
Sequent calculus
Propositional calculus is commonly organized as a Hilbert system, whose operations are just those of Boolean algebra and whose theorems are Boolean tautologies, those Boolean terms equal to the Boolean constant 1. Another form is sequent calculus, which has two sorts, propositions as in ordinary propositional calculus, and pairs of lists of propositions called sequents, such as A ∨ B, A ∧ C, ... ⊢ A, B → C, .... The two halves of a sequent are called the antecedent and the succedent respectively. The customary metavariable denoting an antecedent or part thereof is Γ, and for a succedent Δ; thus Γ, A ⊢ Δ would denote a sequent whose succedent is a list Δ and whose antecedent is a list Γ with an additional proposition A appended after it. The antecedent is interpreted as the conjunction of its propositions, the succedent as the disjunction of its propositions, and the sequent itself as the entailment of the succedent by the antecedent.
Entailment differs from implication in that whereas the latter is a binary operation that returns a value in a Boolean algebra, the former is a binary relation which either holds or does not hold. In this sense, entailment is an external form of implication, meaning external to the Boolean algebra, thinking of the reader of the sequent as also being external and interpreting and comparing antecedents and succedents in some Boolean algebra. The natural interpretation of ⊢ is as ≤ in the partial order of the Boolean algebra defined by x ≤ y just when x ∨ y = y. This ability to mix external implication ⊢ and internal implication → in the one logic is among the essential differences between sequent calculus and propositional calculus.[33]
Applications
Boolean algebra as the calculus of two values is fundamental to computer circuits, computer programming, and mathematical logic, and is also used in other areas of mathematics such as set theory and statistics.[5]
Computers
In the early 20th century, several electrical engineers intuitively recognized that Boolean algebra was analogous to the behavior of certain types of electrical circuits. Claude Shannon formally proved that such behavior was logically equivalent to Boolean algebra in his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits.
Today, all modern general-purpose computers perform their functions using two-value Boolean logic; that is, their electrical circuits are a physical manifestation of two-value Boolean logic. They achieve this in various ways: as voltages on wires in high-speed circuits and capacitive storage devices, as orientations of a magnetic domain in ferromagnetic storage devices, as holes in punched cards or paper tape, and so on. (Some early computers used decimal circuits or mechanisms instead of two-valued logic circuits.)
Of course, it is possible to code more than two symbols in any given medium. For example, one might use respectively 0, 1, 2, and 3 volts to code a four-symbol alphabet on a wire, or holes of different sizes in a punched card. In practice, the tight constraints of high speed, small size, and low power combine to make noise a major factor. This makes it hard to distinguish between symbols when there are several possible symbols that could occur at a single site. Rather than attempting to distinguish between four voltages on one wire, digital designers have settled on two voltages per wire, high and low.
Computers use two-value Boolean circuits for the above reasons. The most common computer architectures use ordered sequences of Boolean values, called bits, of 32 or 64 values, e.g. 01101000110101100101010101001011. When programming in machine code, assembly language, and certain other programming languages, programmers work with the low-level digital structure of the data registers. These registers operate on voltages, where zero volts represents Boolean 0, and a reference voltage (often +5 V, +3.3 V, or +1.8 V) represents Boolean 1. Such languages support both numeric operations and logical operations. In this context, "numeric" means that the computer treats sequences of bits as binary numbers (base-two numbers) and executes arithmetic operations like add, subtract, multiply, or divide. "Logical" refers to the Boolean logical operations of disjunction, conjunction, and negation between two sequences of bits, in which each bit in one sequence is simply compared to its counterpart in the other sequence. Programmers therefore have the option of working in and applying the rules of either numeric algebra or Boolean algebra as needed. A core differentiating feature between these families of operations is the existence of the carry operation in the first but not the second.
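The contrast between numeric and logical operations on bit sequences can be seen directly in most programming languages; this illustrative sketch treats the same two bit patterns both ways, the carry being what distinguishes arithmetic addition from the bitwise operations.

```python
a = 0b0110  # 6
b = 0b0011  # 3

# logical (bitwise) operations: each output bit depends only on the
# corresponding pair of input bits, with no carry between positions
print(format(a & b, "04b"))   # 0010  conjunction
print(format(a | b, "04b"))   # 0111  disjunction
print(format(a ^ b, "04b"))   # 0101  exclusive or

# numeric operation: the bits are read as a base-two number, and the carry
# propagates information from one bit position to the next
print(format(a + b, "04b"))   # 1001  (6 + 3 = 9)
```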
Two-valued logic
Other areas where two values are a good choice are the law and mathematics. In everyday relaxed conversation, nuanced or complex answers such as "maybe" or "only on the weekend" are acceptable. In more focused situations such as a court of law or theorem-based mathematics, however, it is deemed advantageous to frame questions so as to admit a simple yes-or-no answer (is the defendant guilty or not guilty, is the proposition true or false) and to disallow any other answer. However limiting this might prove in practice for the respondent, the principle of the simple yes-or-no question has become a central feature of both judicial and mathematical logic, making two-valued logic deserving of organization and study in its own right.
A central concept of set theory is membership. An organization may permit multiple degrees of membership, such as novice, associate, and full. With sets, however, an element is either in or out. The candidates for membership in a set work just like the wires in a digital computer: each candidate is either a member or a nonmember, just as each wire is either high or low.
Algebra being a fundamental tool in any area amenable to mathematical treatment, these considerations combine to make the algebra of two values of fundamental importance to computer hardware, mathematical logic, and set theory.
Two-valued logic can be extended to multi-valued logic, notably by replacing the Boolean domain {0, 1} with the unit interval [0, 1], in which case rather than only taking values 0 or 1, any value between and including 0 and 1 can be assumed. Algebraically, negation (NOT) is replaced with 1 − x, conjunction (AND) is replaced with multiplication (xy), and disjunction (OR) is defined via De Morgan's law. Interpreting these values as logical truth values yields a multi-valued logic, which forms the basis for fuzzy logic and probabilistic logic. In these interpretations, a value is interpreted as the "degree" of truth: to what extent a proposition is true, or the probability that the proposition is true.
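An illustrative sketch of the [0, 1]-valued generalization just described: the same arithmetic formulas used on {0, 1}, namely 1 − x for NOT, xy for AND, and the De Morgan dual for OR, now accept intermediate degrees of truth (the sample values are arbitrary).

```python
def f_not(x):    return 1 - x
def f_and(x, y): return x * y
def f_or(x, y):  return f_not(f_and(f_not(x), f_not(y)))   # via De Morgan: x + y - xy

# degrees of truth rather than just 0 and 1
x, y = 0.8, 0.3
print(f_and(x, y))   # 0.24
print(f_or(x, y))    # 0.86
print(f_not(x))      # 0.2 (up to floating-point rounding)
```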
Boolean operations
The original application for Boolean operations was mathematical logic, where they combine the truth values, true or false, of individual formulas.
Natural language
Natural languages such as English have words for several Boolean operations, in particular conjunction (and), disjunction (or), negation (not), and implication (implies). But not is synonymous with and not. When used to combine situational assertions such as "the block is on the table" and "cats drink milk", which naïvely are either true or false, the meanings of these logical connectives often have the meaning of their logical counterparts. However, with descriptions of behavior such as "Jim walked through the door", one starts to notice differences such as failure of commutativity, for example, the conjunction of "Jim opened the door" with "Jim walked through the door" in that order is not equivalent to their conjunction in the other order, since and usually means and then in such cases. Questions can be similar: the order "Is the sky blue, and why is the sky blue?" makes more sense than the reverse order. Conjunctive commands about behavior are like behavioral assertions, as in get dressed and go to school. Disjunctive commands such as love me or leave me or fish or cut bait tend to be asymmetric via the implication that one alternative is less preferable. Conjoined nouns such as tea and milk generally describe aggregation as with set union, while tea or milk is a choice. However, context can reverse these senses, as in your choices are coffee and tea, which usually means the same as your choices are coffee or tea (alternatives). Double negation, as in "I don't not like milk", rarely means literally "I do like milk" but rather conveys some sort of hedging, as though to imply that there is a third possibility. "Not not P" can be loosely interpreted as "surely P", and although P necessarily implies "not not P", the converse is suspect in English, much as with intuitionistic logic. In view of the highly idiosyncratic usage of conjunctions in natural languages, Boolean algebra cannot be considered a reliable framework for interpreting them.
Digital logic
Boolean operations are used in digital logic to combine the bits carried on individual wires, thereby interpreting them over {0, 1}. When a vector of n identical binary gates is used to combine two bit vectors each of n bits, the individual bit operations can be understood collectively as a single operation on values from a Boolean algebra with 2^n elements.
Naive set theory
Naive set theory interprets Boolean operations as acting on subsets of a given set X. As we saw earlier, this behavior exactly parallels the coordinate-wise combinations of bit vectors, with the union of two sets corresponding to the disjunction of two bit vectors and so on.
Video cards
The 256-element free Boolean algebra on three generators is deployed in computer displays based on raster graphics, which use bit blit to manipulate whole regions consisting of pixels, relying on Boolean operations to specify how the source region should be combined with the destination, typically with the help of a third region called the mask. Modern video cards offer all 2^(2^3) = 256 ternary operations for this purpose, with the choice of operation being a one-byte (8-bit) parameter. The constants SRC = 0xaa or 0b10101010, DST = 0xcc or 0b11001100, and MSK = 0xf0 or 0b11110000 allow Boolean operations such as (SRC^DST)&MSK (meaning XOR the source and destination and then AND the result with the mask) to be written directly as a constant denoting a byte calculated at compile time: 0x60 in the (SRC^DST)&MSK example, 0x66 if just SRC^DST, etc. At run time the video card interprets the byte as the raster operation indicated by the original expression in a uniform way that requires remarkably little hardware and which takes time completely independent of the complexity of the expression.
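The compile-time constants just mentioned can be recomputed in a few lines; this is an illustrative recomputation from the stated constants, not vendor code.

```python
SRC, DST, MSK = 0xAA, 0xCC, 0xF0   # 0b10101010, 0b11001100, 0b11110000

op1 = (SRC ^ DST) & MSK            # XOR source and destination, then AND with the mask
op2 = SRC ^ DST

print(hex(op1), format(op1, "08b"))   # 0x60 01100000
print(hex(op2), format(op2, "08b"))   # 0x66 01100110
```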
Modeling and CAD
Solid modeling systems for computer aided design offer a variety of methods for building objects from other objects, combination by Boolean operations being one of them. In this method the space in which objects exist is understood as a set S of voxels (the three-dimensional analogue of pixels in two-dimensional graphics) and shapes are defined as subsets of S, allowing objects to be combined as sets via union, intersection, etc. One obvious use is in building a complex shape from simple shapes simply as the union of the latter. Another use is in sculpting understood as removal of material: any grinding, milling, routing, or drilling operation that can be performed with physical machinery on physical materials can be simulated on the computer with the Boolean operation x ∧ ¬y or x − y, which in set theory is set difference: remove the elements of y from those of x. Thus given two shapes, one to be machined and the other the material to be removed, the result of machining the former to remove the latter is described simply as their set difference.
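A toy sketch of machining as set difference on voxels; the shapes and coordinates below are made up purely for illustration.

```python
# voxels as integer coordinate triples; shapes as subsets of the voxel space
block = {(x, y, z) for x in range(4) for y in range(4) for z in range(2)}
drill = {(1, 1, z) for z in range(2)} | {(2, 2, z) for z in range(2)}

machined = block - drill        # x AND NOT y: remove the drilled material
print(len(block), len(drill), len(machined))   # 32 4 28
```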
Boolean searches
Search engine queries also employ Boolean logic. For this application, each web page on the Internet may be considered to be an "element" of a "set". The following examples use a syntax supported by Google.[NB 1]
- Doublequotes are used to combine whitespace-separated words into a single search term.[NB 2]
- Whitespace is used to specify logical AND, as it is the default operator for joining search terms:
"Search term 1" "Search term 2"
- The OR keyword is used for logical OR:
"Search term 1" OR "Search term 2"
- A prefixed minus sign is used for logical NOT:
"Search term 1" − "Search term 2"
See also
Notes
- ^ Not all search engines support the same query syntax. Additionally, some organizations (such as Google) provide "specialized" search engines that support alternate or extended syntax. (See Syntax cheatsheet.) The now-defunct Google code search used to support regular expressions but no longer exists.
- ^ Doublequote-delimited search terms are called "exact phrase" searches in the Google documentation.
References
- ^ Boole, George (2011-07-28). The Mathematical Analysis of Logic - Being an Essay Towards a Calculus of Deductive Reasoning.
- ^ Boole, George (2003) [1854]. An Investigation of the Laws of Thought. Prometheus Books. ISBN 978-1-59102-089-9.
- ^ "The name Boolean algebra (or Boolean 'algebras') for the calculus originated by Boole, extended by Schröder, and perfected by Whitehead seems to have been first suggested by Sheffer, in 1913." Edward Vermilye Huntington, "New sets of independent postulates for the algebra of logic, with special reference to Whitehead and Russell's Principia mathematica", in Transactions of the American Mathematical Society 35 (1933), 274-304; footnote, page 278.
- ^ Peirce, Charles S. (1931). Collected Papers. Vol. 3. Harvard University Press. p. 13. ISBN 978-0-674-13801-8.
- ^ Givant, Steven R.; Halmos, Paul Richard (2009). Introduction to Boolean Algebras. Undergraduate Texts in Mathematics, Springer. pp. 21–22. ISBN 978-0-387-40293-2.
- ^ Nelson, Eric S. (2011). "The Yijing and Philosophy: From Leibniz to Derrida". Journal of Chinese Philosophy. 38 (3): 377–396. doi:10.1111/j.1540-6253.2011.01661.x.
- ^ Lenzen, Wolfgang. "Leibniz: Logic". Internet Encyclopedia of Philosophy.
- ^ Dunn, J. Michael; Hardegree, Gary M. (2001). Algebraic Methods in Philosophical Logic. Oxford University Press. p. 2. ISBN 978-0-19-853192-0.
- ^ Weisstein, Eric W. "Boolean Algebra". mathworld.wolfram.com. Retrieved 2020-09-02.
- ^ Balabanian, Norman; Carlson, Bradley (2001). Digital Logic Design Principles. John Wiley. pp. 39–40. ISBN 978-0-471-29351-4. Online sample.
- ^ Rajaraman; Radhakrishnan (2008-03-01). Introduction to Digital Computer Design. PHI Learning Pvt. Ltd. p. 65. ISBN 978-81-203-3409-0.
- ^ Camara, John A. (2010). Electrical and Electronics Reference Manual for the Electrical and Computer PE Exam. ppi2pass. p. 41. ISBN 978-1-59126-166-7.
- ^ Minato, Shin-ichi; Muroga, Saburo (2007). "Chapter 29: Binary Decision Diagrams". In Chen, Wai-Kai (ed.). The VLSI Handbook (2 ed.). CRC Press. ISBN 978-0-8493-4199-1.
- ^ Parkes, Alan (2002). Introduction to Languages, Machines and Logic: Computable Languages, Abstract Machines and Formal Logic. Springer. p. 276. ISBN 978-1-85233-464-2.
- ^ Barwise, Jon; Etchemendy, John; Allwein, Gerard; Barker-Plummer, Dave; Liu, Albert (1999). Language, Proof, and Logic. CSLI Publications. ISBN 978-1-889119-08-3.
- ^ Goertzel, Ben (1994). Chaotic Logic: Language, Thought, and Reality from the Perspective of Complex Systems Science. Springer. p. 48. ISBN 978-0-306-44690-0.
- ^ Halmos, Paul Richard (1963). Lectures on Boolean Algebras. van Nostrand.
- ^ Bacon, Jason W. (2011). "Computer Science 315 Lecture Notes". Archived from the original on 2021-10-02. Retrieved 2021-10-01.
- ^ "Boolean Algebra - Expression, Rules, Theorems, and Examples". GeeksforGeeks. 2021-09-24. Retrieved 2024-06-03.
- ^ "Boolean Logical Operations" (PDF).
- ^ "Boolean Algebra Operations". bob.cs.sonoma.edu. Retrieved 2024-06-03.
- ^ "Boolean Algebra" (PDF).
- ^ O'Regan, Gerard (2008). A Brief History of Computing. Springer. p. 33. ISBN 978-1-84800-083-4.
- ^ "Elements of Boolean Algebra". ee.surrey.ac.uk. Retrieved 2020-09-02.
- ^ McGee, Vann. Sentential Calculus Revisited: Boolean Algebra (PDF).
- ^ Goodstein, Reuben Louis (2012). "Chapter 4: Sentence Logic". Boolean Algebra. Courier Dover Publications. ISBN 978-0-48615497-8.
- ^ Venn, John (July 1880). "I. On the Diagrammatic and Mechanical Representation of Propositions and Reasonings" (PDF). The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science. 5. 10 (59): 1–18. doi:10.1080/14786448008626877. Archived (PDF) from the original on 2017-05-16.
- ^ Shannon, Claude (1949). "The Synthesis of Two-Terminal Switching Circuits". Bell System Technical Journal. 28: 59–98. doi:10.1002/j.1538-7305.1949.tb03624.x.
- ^ Koppelberg, Sabine (1989). "General Theory of Boolean Algebras". Handbook of Boolean Algebras, Vol. 1 (ed. J. Donald Monk with Robert Bonnet). Amsterdam, Netherlands: North Holland. ISBN 978-0-444-70261-6.
- ^ McCune, William; Veroff, Robert; Fitelson, Branden; Harris, Kenneth; Feist, Andrew; Wos, Larry (2002). "Short single axioms for Boolean algebra". Journal of Automated Reasoning. 29 (1): 1–16. doi:10.1023/A:1020542009983. MR 1940227. S2CID 207582048.
- ^ Allwood, Jens; Andersson, Lars-Gunnar; Dahl, Östen (1977-09-15). Logic in Linguistics. Cambridge University Press. ISBN 978-0-521-29174-3.
- ^ Hausman, Alan; Kahane, Howard; Tidman, Paul (2010) [2007]. Logic and Philosophy: A Modern Introduction. Wadsworth Cengage Learning. ISBN 978-0-495-60158-6.
- ^ Girard, Jean-Yves; Taylor, Paul; Lafont, Yves (1990) [1989]. Proofs and Types. Cambridge University Press (Cambridge Tracts in Theoretical Computer Science, 7). ISBN 978-0-521-37181-0.
Further reading
- Mano, Morris; Ciletti, Michael D. (2013). Digital Design. Pearson. ISBN 978-0-13-277420-8.
- Whitesitt, J. Eldon (1995). Boolean Algebra and Its Applications. Courier Dover Publications. ISBN 978-0-486-68483-3.
- Dwinger, Philip (1971). Introduction to Boolean Algebras. Würzburg, Germany: Physica Verlag.
- Sikorski, Roman (1969). Boolean Algebras (3 ed.). Berlin, Germany: Springer-Verlag. ISBN 978-0-387-04469-9.
- Bocheński, Józef Maria (1959). A Précis of Mathematical Logic. Translated from the French and German editions by Otto Bird. Dordrecht, South Holland: D. Reidel.
Historical perspective
- Boole, George (1848). "The Calculus of Logic". Cambridge and Dublin Mathematical Journal. III: 183–198.
- Hailperin, Theodore (1986). Boole's Logic and Probability: A Critical Exposition from the Standpoint of Contemporary Algebra, Logic, and Probability Theory (2 ed.). Elsevier. ISBN 978-0-444-87952-3.
- Gabbay, Dov M.; Woods, John, eds. (2004). The Rise of Modern Logic: From Leibniz to Frege. Handbook of the History of Logic. Vol. 3. Elsevier. ISBN 978-0-444-51611-4. Several relevant chapters by Hailperin, Valencia, and Grattan-Guinness.
- Badesa, Calixto (2004). "Chapter 1. Algebra of Classes and Propositional Calculus". The Birth of Model Theory: Löwenheim's Theorem in the Frame of the Theory of Relatives. Princeton University Press. ISBN 978-0-691-05853-5.
- Stanković, Radomir S.; Astola, Jaakko Tapio (2011). Written at Niš, Serbia & Tampere, Finland. From Boolean Logic to Switching Circuits and Automata: Towards Modern Information Technology. Studies in Computational Intelligence. Vol. 335 (1 ed.). Berlin & Heidelberg, Germany: Springer-Verlag. pp. xviii + 212. doi:10.1007/978-3-642-11682-7. ISBN 978-3-642-11681-0. ISSN 1860-949X. LCCN 2011921126. Retrieved 2022-10-25.
- "The Algebra of Logic Tradition", entry by Burris, Stanley, in the Stanford Encyclopedia of Philosophy, 21 February 2012.