"There is no truth; if there were, it could not be comprehended; and if comprehended, it could not be communicated." - a character in a Plato dialogue
As I mentioned in my previous post about celestial mechanics and chaos theory, chaos theory shows that defining modern science as collecting data and plotting it in Cartesian coordinates is wrong; but that doesn't mean mathematics is wrong. Similarly, mathematicians talk about the foundations of mathematics. But these foundations are not about the origin and nature of mathematics; they are about issues that come up after mathematics is created. The mathematicians' concern over the foundations of mathematics is about which logic is the foundation of mathematics. Well, some argue over whether logic is the foundation at all, as against, say, set-theoretic foundations. But most of these foundation wars are about which sets of logical axioms form a foundation of mathematics. There's Bertrand Russell's logicism. There's David Hilbert, who had his axioms stripped of all meaning. The intuitionists/constructivists are not strictly against logical axioms, but question whether certain axioms of the Russell/Whitehead "Principia" are valid.
Bertrand Russell and Alfred North Whitehead produced a massive three-volume (I do believe) tome on the foundation of mathematics. They tried to show that one could derive all of mathematics from symbolic logic itself. They derived much of symbolic logic, then set theory, and then numbers in terms of all this symbolic logic. Mathematicians love to exhibit the Russell/Whitehead passage proving that 1+1=2.
But nowhere in Bertrand Russell and Alfred North Whitehead's Principia Mathematica, nor in any of the foundational schools, is there any discussion of abstraction or idealization, much less Jacob Bronowski's vast generalization of how abstraction works in mathematical knowledge. There is Susanne K. Langer's "Introduction to Symbolic Logic," however. She does show how symbolic logic can be derived from language and abstraction; but, once again, she was not cognizant of Jacob Bronowski's "The Origins of Knowledge and Imagination."
Still, mathematicians have created a massive amount of theory of logic; some would call it mathematical logic. This great outpouring of logic actually goes back to Boole in the mid-1800s. But the 1900s were no less exciting, as I hope to show.
The path I'm going to show actually goes back, I would say, to Grassmann and Dedekind. John Stillwell has been championing this throughout all his books. I would include the name of Poincaré in this as well. As E.T. Bell mentions in his "Development of Mathematics," Poincaré suggested that mathematical induction can provide a foundation for all of mathematics. E.T. Bell's refutation wasn't much. Grassmann and Dedekind showed that numbers, and what were previously considered axioms of numbers and arithmetic (commutativity and associativity), can be derived by mathematical induction. This suggestion was passed over by mathematicians for a long time. As far as I can tell, John Stillwell is the real champion here.
John Stillwell has tried to write a series of books that relate ancient mathematics to modern mathematics. In so doing, he tries to show what mathematics of the past is worth learning. I disagree a little bit. Sir Thomas Heath made editions of Euclid's Elements where he puts in, as notes, the findings of the commentary of Proclus (5th century A.D.; the last mathematical light till the dark ages were over). The connections, and hence the importance, of the various theorems throughout Euclid's Elements to the Pythagoreans and Thales are just so great that I cannot agree that Mr. Stillwell's sifting should forever shut all the rest of Greek mathematics in the past. For instance, Heath shows there are connections between the great theorem of Thales, that every angle in a semicircle is a right angle, and the theorem that every triangle's angles sum to 180 degrees. He further shows that Book VI, theorem 11 I do believe, is the geometric interpretation of finding the square root of a number. That theorem uses Thales' theorem that every angle in a semicircle is a right angle. It further shows that the great Theaetetus book about making a vast generalization of roots in terms of the three kinds of proportion (arithmetic/geometric/harmonic) is a valid one. The Elements goes on, in the books about the Pythagorean solids, to use this mathematics to calculate the angles of the Pythagorean solids mathematically and not empirically! This is pretty much the big accomplishment of Euclid's Elements. There are others, like Eudoxus' Book V, which is about proportions and essentially creates a geometric version of the real number system. I mean to suggest here the great vision of Greek mathematics. Like the alien arithmetic of the Egyptians and Babylonians, certainly antiquated today but a tremendously creative effort back then, Greek mathematics is certainly antiquated compared with today's more general accounts; but it works and is a great creative effort.
John Stillwell is also hoping to teach advanced mathematics through understanding ancient Greek mathematics; I definitely disagree there. Modern mathematics solves ancient mathematics; modern mathematics sheds light on ancient mathematics, and not the other way around.
Getting back to John Stillwell . . . he shows in almost all his books this mathematical-induction derivation of numbers. In the 1800s, mathematicians found they could explain numbers like never before in terms of set theory. This should be familiar to you: a set of two oranges and a set of two apples are concrete representations of the number two. But the logician/mathematician Peano showed that there are number systems defined by the usual axioms of commutativity, associativity, and the distributive law, and that we can distinguish the natural numbers from other number systems by a further set of axioms called the Peano postulates. Off the top of my head, these say: every number has a successor, and each successor leads to a new successor. John Stillwell (Grassmann/Dedekind before him) shows that this successor function defines mathematical induction.
- If 0 is in S, and if n+1 is in S whenever n is in S, then N (the symbol for the natural numbers) is a subset of S. There's a mathematical symbol for "is an element of" that I can't duplicate here! Also, mathematicians will point out that one must distinguish between "is" and "is a"; see Susanne K. Langer's "Introduction to Symbolic Logic" for a great account of this and much else besides! . . . Two more forms of mathematical induction: if 0 is in S, and if n+1 is in S whenever 0, 1, . . ., n are all in S, then N is a subset of S. And the third rule: if T is a nonempty subset of N, then T has a least member.
John Stillwell shows that these three can be proved equivalent. Once again, proving things opens up more mathematics: proving one way leads to certain mathematics; proving another way leads to other mathematics. In algebra, Galois' proof that a closed-form solution, an analogue of the quadratic formula, is impossible for the general fifth-degree equation leads to group and field theory. Abel's proof leads to elliptic functions and topology (a revelation of the 20th century). John Stillwell later shows (not important here) that these three rules have importance at different times. Let's get to defining addition by mathematical induction.
m+0=m for all m 'is a' N . . . this is the 0 base induction step
m+(k+1)=(m+k)+1 for all m, k 'is a' N . . . this is of course the n+1 induction step. So we went from just m to m+k. Addition is defined.
Associativity proof, base step n=0: l+(m+0)=l+m, since m+0=m by the definition of addition above . . . if you zero the third term, you get the same thing, (l+m)+0, lol!
The induction step for the associativity proof, n=k+1: l+(m+(k+1)) = l+((m+k)+1) . . . by the definition of + above,
= (l+m)+(k+1) . . . notice that k+1, which we started by substituting for n, shows up again!
Commutativity: if you zero out n (base step), you get m=m! Induction step: m+(k+1) = (m+k)+1, by the definition of addition; then = (k+m)+1, by the induction hypothesis (the assumption is substituted); then = (k+1)+m, by associativity.
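The definition and proofs above can be spot-checked in code. A minimal sketch in Python (my own illustration, not from Stillwell's books): addition built from nothing but a successor function, mirroring the two induction clauses, followed by a finite sanity check of the two laws (a finite check is of course no substitute for the induction proofs).

```python
def succ(n):
    """The Peano successor function: the only primitive we allow ourselves."""
    return n + 1

def add(m, n):
    """Addition defined by induction on the second argument."""
    if n == 0:
        return m                    # base step: m + 0 = m
    return succ(add(m, n - 1))      # induction step: m + (k+1) = (m + k) + 1

# Finite sanity check of associativity and commutativity.
for l in range(8):
    for m in range(8):
        assert add(l, m) == add(m, l)                      # commutativity
        for n in range(8):
            assert add(l, add(m, n)) == add(add(l, m), n)  # associativity

print(add(2, 3))  # → 5
```

Notice the code never uses built-in addition of two general numbers, only the step to a successor.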
- And so numbers and their basic properties can be proved by mathematical induction. This is all fine, but why take this over, say, set theory? Mathematicians through the twentieth century have developed mathematical logic and generalizations of logic in terms of mathematical induction. But this is getting ahead of our story.
While George Boole's algebraic logic led to Bertrand Russell and Alfred North Whitehead's "Principia," and to much other development in symbolic logic (like Post's three-valued and many-valued logics), Kurt Gödel used a diagonal argument (mentioned in my previous post about celestial mechanics) to prove some incompleteness and consistency theorems. Gödel proved that a finite set of consistent axioms cannot prove an infinity of truths (the goal of Bertrand Russell and Alfred North Whitehead). Gödel's proofs showed that the relation between logic and infinity is harder than mathematicians had hoped. One could instantly remark that "of course" the relation between logical proof and infinity has not been explored enough. But I'd like to note that mathematicians would point out that the necessity for proof is to deal with cases where one can't check every case, whether the set of cases is just very large or infinite. But proof does more than this.
Logical proof questions assumptions, and in the process reveals hidden structure. Right off the top of my head: in Euclid's Elements, there's a series of some four theorems where the Greeks were proving that a triangle's area is 1/2 times base times height. But in proving it, they found more phenomena. They found that areas stay the same no matter how inclined an angle of the triangle, or even of a parallelogram. The easy proof of the area of a triangle is to divide a square or a parallelogram and say: see! the triangle is half the square, and the square's area is base times height. The Greeks found four cases of triangles under the same base and height, and parallelograms under the same base and height, or equivalent bases. I'd like to point out some new thoughts I've had about my "Nature and Origins of Mathematical Knowledge" article.
Susanne K. Langer points out in her "Introduction to Symbolic Logic" how abstraction works: by way of structures. A structure is a relation between elements (two or more, but usually just two). A structure can have its relation held the same while the elements change; or one could change the relations between the elements. The first case leads to abstraction. She calls the relations that make up these structures/abstractions/concepts "constituent relations." People will say that to prove anything, one needs some concepts formed first; that there's some kind of creative process before logical proof. Maybe, and here we have it: constituent relations. But then, what about relations between propositions? She suggests these are your logical relations. This point about constituent and logical relations could generalize Jacob Bronowski's ideas about inferred units. As Jacob Bronowski himself says, inferred units could be had by either specialization (his word for idealization) or generalization (which leads to abstraction). Why shouldn't logical proof lead to new ideas? An idea is expressed as a proposition. Logical proof leads to ways of transforming from one way of viewing things to another (kind of like being able to see that our current perspective suggests a flat earth, but that this leads to problems like the retrograde motion of planets), or of re-expressing one idea in terms of another, and that's an act of creativity. The Greek discovery of irrational numbers is of course the famous example of discovery by logical proof.
Another example of creativity by logical proof, and of how new processes of mathematical discovery reveal themselves as mathematics evolves, is the non-Euclidean geometries of the early 1800s due to Gauss, Bolyai, and Lobachevsky. This type of creativity is about changing one axiom, and sometimes more, of the Euclidean axioms of geometry. In particular, they changed the parallel axiom.
So, even though the issue of infinity and logic may not be thoroughly explored, logical proof has its place and its relation with mathematics, a relation that has been revealed more and more through the history of mathematics. But let's explore infinity and its effect on mathematics as a whole.
Infinity really goes back to the beginning of numbers. Whoever first realized that numbers can seemingly be extended forever must have freaked out and believed in god thereafter! As I've pointed out at the beginning of my blog, the Egyptians and Babylonians came up with some creative mathematics to deal with the infinities that numbers lead to. The Greeks started to try to deal with geometric shapes that were smooth instead of easy straight lines, and then calculus first came into the heads of Isaac Newton and Leibniz. With the creation of the calculus, algebraic, trigonometric, and logarithmic functions were unified and handled much more easily than before. Galileo, as noted in my previous blog entry about celestial mechanics, found that one could match up, say, all the even numbers with all the natural numbers (even and odd combined). That's where matters stood for centuries, till Georg Cantor in the 1800s.
I kind of need to get back to the set theory definition of numbers! The set theory definition of numbers uses the properties (inferred units) of cardinality and ordinality. Two sets have the same cardinal number if they can be matched element for element. At which point you say, why bother with the ordinal property? Well, ordinals take on more significance with transfinite numbers. Some have suggested that the ancients were confused between the cardinal and ordinal properties of numbers. They point to the linguistics of numbers, suggesting that numbers were used sometimes as "same size" and sometimes as place value, like the sixth place. This is the meaning of ordinal numbers. But Georg Cantor got into infinite sets by examining Fourier series, which represent functions by infinite trigonometric sums. Cantor wanted to cut out the sets of exceptional points that didn't need to be there; he counted them by ordinals, and found that he was counting past infinity.
Georg Cantor then examined these infinite sets apart from the analysis. I don't know if someone pointed out the Galileo one-to-one correspondence to him, but he started with that. He then found a diagonal argument (walking the diagonals of a grid of fractions) to make a one-to-one correspondence between the natural numbers and the rational numbers. He found that it can be done! Hence, even the fractions are the same size of infinity as the natural numbers. How about the real numbers (all the numbers: natural, rational, and irrational combined)? One way of seeing this is a table.
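First, though, the pairing with the fractions can be sketched in a few lines. This is my own illustrative version (the function name is made up): walk the grid of fractions p/q along its diagonals (p + q constant), skipping duplicates like 2/2 = 1/1.

```python
from fractions import Fraction

def enumerate_rationals(count):
    """List the first `count` positive rationals by walking the p/q grid
    diagonal by diagonal (p + q constant), skipping repeats like 2/2 = 1/1."""
    seen, out = set(), []
    s = 2  # p + q along the current diagonal
    while len(out) < count:
        for p in range(1, s):
            f = Fraction(p, s - p)
            if f not in seen:
                seen.add(f)
                out.append(f)
                if len(out) == count:
                    break
        s += 1
    return out

print([str(f) for f in enumerate_rationals(5)])  # → ['1', '1/2', '2', '1/3', '3']
```

Every positive fraction eventually appears at some finite position, which is exactly the one-to-one match with the natural numbers. With the fractions dispatched, on to that table for the real numbers.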
Imagine something like a multiplication table, only we're not doing multiplication. On the left side, going down, are subsets subscripted S1, S2, . . . on down. On the top row of the table are the natural numbers, off to infinity. The body of the table codes up each subset as 1s and 0s, the 1 or 0 indicating whether each natural number is included in that subset. Now imagine an S row at the bottom. If you fill in this S row with the values of the table's diagonal switched, 1 to 0 and 0 to 1, then this S row describes a subset that differs from every subset listed in the table. No list of subsets of the natural numbers can ever be complete. The subsets of the natural numbers (which can be coded as real numbers) form a higher infinity.
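The table construction can be coded directly. A small sketch of my own (any finite portion of the infinite table behaves the same way): flip the diagonal, and the resulting row is guaranteed to differ from row i at column i.

```python
def diagonal_flip(table):
    """Given rows coding subsets of the naturals as 0/1 entries,
    return the row S with S[i] = 1 - table[i][i]; it differs from
    the i-th row in at least position i, so it's not in the table."""
    return [1 - table[i][i] for i in range(len(table))]

table = [
    [0, 1, 0],  # S1: contains 2 only
    [1, 1, 0],  # S2: contains 1 and 2
    [0, 0, 1],  # S3: contains 3 only
]
s = diagonal_flip(table)
print(s)  # → [1, 0, 0]
assert all(s != row for row in table)  # S is missing from the list
```

However many rows you list, the flipped diagonal escapes the list; that is the whole diagonal argument in miniature.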
Georg Cantor symbolized his transfinite numbers by the Hebrew letter aleph. He went on to show that aleph plus aleph, or even aleph times aleph, equals the same size of infinity! But aleph raised to the power (exponent) aleph creates a new size of infinity!
There's history and mathematics of transcendental numbers that could be described but isn't essential, such as e, the exponential number, and pi, the ratio of a circle's circumference to its diameter. These were proved to be transcendental in the 1800s. Pi of course is a problem that goes back to Archimedes. He certainly found a way of calculating it (a mathematical story I describe in my entry about the Mechanical Universe episode about circles; I'm hoping to redo some of that soon enough!); but a major point is that this number is not a solution of any polynomial equation with integer coefficients, hence the definition of transcendental numbers. More of the amazing network of ideas Georg Cantor was working in when he came up with his transfinite numbers: Cantor introduced the height of an algebraic number. An algebraic number is generally expressed as a root of a polynomial in general form (I can't exactly represent even this much with the typing tool I have); just imagine a polynomial with all its terms. Cantor built the height of each polynomial out of its degree and the absolute values of its coefficients; this puts the algebraic numbers into classes. There are only finitely many numbers in each height class, hence they can all be listed, height by height, in one sequence; hence the algebraic numbers equal the natural numbers' infinity, and not the continuum of the real numbers. The transcendental numbers fit more densely on the continuum than the algebraic numbers!
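Cantor's height argument can be sketched in code. One common version of the height (the exact formula varies by author; this is an assumption of the sketch) is the polynomial's degree plus the sum of the absolute values of its coefficients. The point is that each height class is finite, so the polynomials, and with them the algebraic numbers they define, can be listed class by class.

```python
from itertools import product

def polys_of_height(h):
    """All integer polynomials (as coefficient tuples, constant term first,
    nonzero leading coefficient) with degree + sum(|coefficients|) == h.
    Each such class is finite, so listing them height by height
    enumerates the defining polynomial of every algebraic number."""
    out = []
    for deg in range(h + 1):
        budget = h - deg  # what the coefficients' absolute values must sum to
        for coeffs in product(range(-budget, budget + 1), repeat=deg + 1):
            if coeffs[-1] != 0 and sum(abs(c) for c in coeffs) == budget:
                out.append(coeffs)
    return out

# Height 2 gives the constant polynomials -2 and 2 and the linear ones -x and x.
print(polys_of_height(2))  # → [(-2,), (2,), (0, -1), (0, 1)]
```

Since each class is a finite list and the classes themselves are numbered 1, 2, 3, . . ., stringing them together is a counting of all algebraic numbers.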
The question becomes: what's the relation between the ordinal and cardinal transfinite numbers? A growth relation can be defined between functions: if the ratio G(x)/F(x) goes to infinity, then G(x) grows faster than F(x). A discovery of Du Bois-Reymond. Further, Cantor showed that infinite ordinal numbers are least upper bounds: w is the least upper bound of the finite ordinals. Each ordinal has a successor: w+1, w+2 . . . Beyond them is w·2, beyond w·2 is w^2 (w to the exponent 2), and beyond w^2 is of course w^w. The ordering of infinity gets mesmerizing! Georg Cantor defined w by growth functions as defined above! He also saw that the diagonal argument, applied to growth functions, defines the first w.
The ordinals, finite and infinite, are well ordered. A simple idea of the proof: take two elements and compare; either way, one is bigger than the other. Here we see that cardinals can be measured by infinite ordinals. The question is: how can we pick an ordinal out of these infinities? This leads to the infamous Axiom of Choice. Using the Axiom of Choice or not has its advantages and disadvantages. Some mathematicians have recently suggested they've solved this problem; hence why I'm hoping to write this article up here. Axiom of Choice: for any collection X of non-empty sets S, there is a function choose(S) such that choose(S) is an element of S for each set S in X. Zermelo, in 1904, proved that the axiom of choice implies that every set can be well ordered.
We know that transfinite numbers can be measured by infinite ordinals, but do they equal the continuum? This is the continuum hypothesis. The continuum hypothesis bears on three statements: 1) can the current process for generating ordinal numbers (taking successors and least upper bounds) be continued uncountably far? 2) If so, can we "exhibit" an uncountable set of positive integer functions, ordered by increasing growth rates? 3) Do the corresponding growth rates eventually exceed that of any given positive integer function? For number two, it is impossible to explicitly exhibit such an uncountable set of positive integer functions; but we can prove one exists from the continuum hypothesis. It creates a scale, or sequence, of functions.
If we generalize induction by least upper bounds, we make an ordinal generalization of the induction process. And remember, we, or rather Du Bois-Reymond above, found connections between the finite and the infinite through infinite ordinals (this is the importance of ordinals over cardinals). This least-upper-bound/ordinal generalization of the mathematical induction process defines w, w+1, w·2 and so on more rigorously.
Infinite ordinals stretch the mathematician's ability to write down mathematical ideas in symbols! There's a number, a Greek epsilon with a subscript 0 (epsilon-nought), and all infinite ordinals less than it can be expressed in these Cantor normal forms (via the least-upper-bound mathematical induction process described a little above). The Cantor normal forms look like a three-way inequality of infinite ordinals, like w^w < a < w^w^w. Mathematicians are hoping that this epsilon-nought measures the complexity of arithmetic.
A generalization of the Cantor normal forms is Goodstein's theorem. A number can be represented as sums of powers of a base, with the exponents themselves represented the same way (hereditary base notation). Goodstein noted that the ordinals of a Cantor normal form can be matched to these exponents. The process of re-expressing a number in this notation, substituting a bigger base, subtracting one, and re-expressing again must terminate, because the ordinals, as described above, are well ordered!
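A Goodstein sequence can be computed directly. This sketch is my own (the function names are made up): write the number in hereditary base-b notation, replace every b by b+1 (that's the ordinal substitution), subtract one, and repeat with the next base. The well-ordering of the ordinals guarantees the sequence hits zero, even though for most starting numbers the values balloon astronomically first.

```python
def bump_base(n, b):
    """Write n in hereditary base-b notation (exponents recursively in
    base b too), then replace every occurrence of b with b + 1 and evaluate."""
    if n == 0:
        return 0
    total, exponent = 0, 0
    while n > 0:
        digit = n % b
        if digit:
            total += digit * (b + 1) ** bump_base(exponent, b)
        n //= b
        exponent += 1
    return total

def goodstein(n, max_steps):
    """First terms of the Goodstein sequence starting at n in base 2."""
    seq, b = [n], 2
    while seq[-1] > 0 and len(seq) <= max_steps:
        seq.append(bump_base(seq[-1], b) - 1)
        b += 1
    return seq

print(goodstein(3, 10))  # → [3, 3, 3, 2, 1, 0]
```

Starting at 3, the sequence dies quickly; starting at 4, it already takes on the order of 10^121210694 steps, which is exactly why finite axioms for arithmetic can't prove the theorem.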
Why are all these infinite (transfinite) numbers important? Because in 1931, Kurt Gödel published some theorems about logical systems. He said: if a finite set of axioms is consistent, it cannot prove an infinity of truths. To prove those, you'd have to add another axiom, and another. Some mathematicians (and philosophers of mathematics like me) are fine with that. Others are not. And it's a huge fight for the rights to the mathematical heavens! I just point to my Jacob Bronowskian ideas and say none of this is the real point of the nature and origin of mathematics, period! Getting back to Kurt Gödel . . . Gödel proved his theorems by means of "Gödel numbering" and some previous theorems that are even harder than Gödel's theorem. He Gödel-numbers everything: 1, 2, 3, the logical symbols for and/or/inference. Then, through the factoring of these numbers, he constructs statements like "this statement is not provable" and shows they cannot be proved without contradiction. This seems pretty artificial, and maybe it is. I maybe should say that Gödel's theorem is a more elaborate version of the Cretan liar's paradox. If you try to decide whether "this statement is false" is true, you get a contradiction whether you call it true or false. If it's true, then the statement says it's false! If you say the statement is false, then you're saying that the statement that it is false is false! One more interesting tidbit that I've found in my biblical explorations (for those who have followed me, or are willing to follow some more other things if this is the first article you read from me) is in a New Testament epistle, Titus I do believe. Yes, 1:12: "One of themselves, even a prophet of their own, said, The Cretians are alway liars, evil beasts, slow bellies."
Ope, in looking up this Titus (as in Titus, son of the Roman emperor Vespasian; the Roman emperor that 'Flavius' Josephus says is the coming Messianic savior of the Jews at the end of his Jewish War book; see the first post of my blog, "The Gospel of Truth"), I found another, found by none other than Saint Jerome (5th century A.D.), in Psalms! "I said in my alarm, 'Every man is a liar!'" (Psalms 116:11). Is David telling the truth or is he lying? "If it is true that every man is a liar, and David's statement, 'Every man is a liar,' is true, then David also is lying; he, too, is a man. But if he, too, is lying, his statement, 'Every man is a liar,' consequently is not true. Whatever way you turn the proposition, the conclusion is a contradiction. Since David himself is a man, it follows that he also is lying; but if he is lying because every man is a liar, his lying is of a different sort." - Saint Jerome (reference: St. Jerome, Homily on Psalm 115 (116B), translated by Sr. Marie Liguori Ewald, IHM, in The Homilies of Saint Jerome, Volume I (1-59 On the Psalms), The Fathers of the Church 48 (Washington, D.C.: The Catholic University of America Press, 1964), 294). Well, Gödel's proof is a bit like the Cretan liar's paradox; we won't be doing that here. There's a kind of simpler proof; but, as usual, I will only describe it here.
Emil Post came up with a simpler proof; he distinguished formal and normal systems. A formal system is a list of axioms and rules of inference (which bugs the intuitionist philosophers of the mathematical foundations wars; see my points about constituent and logical relations above!). A normal system is simply a subset of the formal system in question! Emil Post applied the diagonal argument to this conception. Getting back to epsilon-nought measuring complexity: by Gödel's second theorem, a finite set of axioms cannot prove its own consistency (ouch!). But a major point here is that Gerhard Gentzen showed, in 1936, that the complexity of a formal system (arithmetic) is measured by this epsilon-nought. As I'm going through my notes, I see the letter isn't a backwards capital E after all; the symbol as used by mathematicians is a more rounded E, the Greek epsilon.
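The Gödel-numbering trick mentioned above is easy to sketch, even though the self-referential construction (the hard part of Gödel's proof) is not shown here. Assign every symbol a code number, then pack a whole formula into a single number using prime exponents; unique factorization guarantees the formula can be recovered. The codes below are made-up stand-ins, not Gödel's actual assignments.

```python
def first_primes(n):
    """The first n primes, by trial division (fine for short formulas)."""
    ps, c = [], 2
    while len(ps) < n:
        if all(c % p for p in ps):
            ps.append(c)
        c += 1
    return ps

def godel_number(codes):
    """Pack a sequence of symbol codes into one number: 2^c1 * 3^c2 * 5^c3 ..."""
    g = 1
    for p, c in zip(first_primes(len(codes)), codes):
        g *= p ** c
    return g

def decode(g, length):
    """Recover the symbol codes by counting each prime's factors."""
    out = []
    for p in first_primes(length):
        e = 0
        while g % p == 0:
            g //= p
            e += 1
        out.append(e)
    return out

codes = [3, 1, 4]           # hypothetical codes for a tiny three-symbol formula
print(godel_number(codes))  # → 15000 (= 2^3 * 3^1 * 5^4)
print(decode(15000, 3))     # → [3, 1, 4]
```

The payoff is that statements about formulas become statements about numbers, so arithmetic can talk about its own sentences.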