Friday, May 30, 2014

astro picture for the day

Image Credit: Hubble Legacy Archive, NASA, ESA - Processing & Licence: Judy Schmidt

- I posted Arthur C. Clarke's "Riddle of the Stones" from his Mysterious World video series; it features La Grange and of course Stonehenge. I've found another great piece of Neolithic architecture from the Britain/Ireland region,

La Hougue Bie.

Here's a British humor YouTube video about La Hougue Bie.  La Hougue Bie is not on the island of Great Britain or Ireland; it's on Jersey, a smaller island off the coast of France.

- some dna-nanotech news,

A New York-based team led by Nadrian Seeman reports patterning of proteins.

"Fibrils are remarkably strong and, as such, are a good barometer for this method's ability to form two-dimensional structures," observes Seeman. "If we can manipulate the orientations of fibrils, we can do the same with other linear materials in the future." I would think this means a general ability to make nanomechanical parts out of proteins/peptides(a kind of artificial protein that can do more things than proteins can).

- SpaceX news,

I'm a little late here; but, Elon Musk hosted the unveiling of his Dragon V2 spacecraft.  The V2 will be able to carry humans to space, and he even suggests Lunar missions.  He says the first flights for Dragon V2 are around late 2016; so, Lunar missions a little bit after that?  I don't know; but, I'd guess that's the plan.  Here's the video,

What's really exciting about the Dragon V2 is that its engines are 3D printed!  Really, this 3D printing just shows a faint hint of coming abilities, both technological in general and Space access in particular!  As he says, the thrusters on Dragon V1 produce something like 100 pounds of thrust; the engines on the V2 Dragon produce 16,000 pounds of thrust!

astro picture for the day / C.H. Chapman quote

Image Credit & Copyright: Subaru Telescope (NAOJ), Hubble Space Telescope;
Processing: Robert Gendler & Roberto Colombari

"there is probably no other science which presents such different appearances to one who cultivates it and one who does not, as mathematics. To[the later] it is ancient, venerable, and complete; a body of dry, irrafutable, unambiguous reasoning. To the mathematician, on the other hand, his science is yet in the purple bloom of vigorous youth." - C.H. Chapman

astro picture for the day / Lao Tzu quote

Image credit: NASA/JPL-Caltech/2MASS (the Spitzer space-based infrared telescope)

"Knowing others is intelligence;
knowing yourself is true wisdom.
Mastering others is strength;
mastering yourself is true power."
— Lao Tzu

Monday, May 26, 2014

astronomy picture of the day

ESA/NASA Hubble Space Telescope image

- Science/Tech news for the day,

Hao Yan from Arizona State University (a protégé of DNA nanotech founder Ned Seeman) reports using DNA nanotechnology to make enzymes work outside of a living cell. While this work is for energy purposes, as he says, it could be generalized to more general-purpose nanomanufacturing,

""An even loftier and more valuable goal is to engineer highly programmed cascading enzyme pathways on DNA nanostructure platforms with control of input and output sequences. Achieving this goal would not only allow researchers to mimic the elegant enzyme cascades found in nature and attempt to understand their underlying mechanisms of action, but would facilitate the construction of artificial cascades that do not exist in nature,"

Multi-enzyme complexes on DNA scaffolds capable of substrate channelling with an artificial swinging arm

Enzymes are proteins, and protein folding is still not solved in a general way.  But, molecular biologists have made enough inroads to be able to reliably design and make some proteins at will. The protein folding problem is considered the quantum gravity problem of biology (there's also the role of dynamical systems/chaos theory; some of that, in my opinion, has been addressed by Stuart Kauffman).  So, if this breakthrough can indeed be generalized to an initial biological nanomanufacturing system, it's a tremendous accomplishment before even diamondoid nanomanufacturing!

- Molecular biologists mapped the DNA of a representative human genome from 1990 to 2003.  DNA science has advanced so much that I got my own DNA read for a hundred dollars a month or so ago. I expected some interesting things.  I grew up hearing that I have American Indian blood; we have pictures of our Native American ancestors on our walls!  But, the DNA reading reveals I have none!  What happened? I haven't bothered to get a straight answer out of my grandmother; but, it seems clear that she lied!  Well, she probably did; she was probably infertile; she got my father through adoption and the rest is history!  Nobody asks questions in my family about it; and, when everyone meets, we talk about whatever as if nothing had happened. In other news from the DNA reading of my own personal genome . . . I'm 2.3 percent Neanderthal!  And, I'm somehow related to Thomas Jefferson on one side of my family and Petrarch on the other. So, I guess that's kind of cool! Getting to why I point out the Human Genome Project,

When the human genome project was finished, molecular biologists knew immediately that they were far from done; they needed to map out the proteins.  This latest article suggests we're pretty close.

There are approximately 18,000 proteins.  What this study has really revealed is the DNA/RNA/protein interactions. Some of what's revealed is that RNA has encoded in it how many times a protein is made in a given cell tissue.

About 2,000 proteins appear to be missing.  The molecular biologists suggest that some of the missing proteins are due to evolving out things that are not needed anymore, such as some olfactory receptors (the sense of smell).

- I'd like to point out an idea that has recently come to me in terms of anthropology. I've thought that what makes humans unique compared to other life is that we can act like anything.  Biological mimicry is rampant in the natural world: bugs that look like twigs (I saw one of these just a few months ago), flies that look like wasps, and so on.  But, each is still just that one mimicry.  We are able to mimic more than one thing; and, we're able to think about these things.  Seems to me that this mental ability mostly took hold in humans about thirty thousand years ago.  Humans around then started following animals around; they started domesticating animals.  How did they do this?  They mimicked. And of course, the cave paintings show that the dominant concern of theirs was animals.

The history of what animals and plants were domesticated when could perhaps be considered a history of early technology for humans. Dogs go back to 30,000 years ago. Sheep - 11,000 years ago.  Bees might have been domesticated 13,000 years ago. Cats were domesticated around 7,500 years ago. Chickens were domesticated around 6,000 years ago (the first bird), and rabbits around 600 A.D.

Tuesday, May 20, 2014

thought for the day/the relation between the finite and the infinite

"there is no truth, if there were it could not be comprehended-and if comprehended, it could not be communicated" - character in a Plato dialog

As I mentioned in my previous post about Celestial Mechanics and chaos theory, chaos theory shows that defining modern science as collecting data and putting it into Cartesian coordinates is wrong; but, that doesn't mean mathematics is wrong.  Similarly, mathematicians talk about foundations of mathematics.  But, these foundations of mathematics are not about the origin and nature of mathematics, but about issues that come after mathematics is created.  The mathematicians' concern over the foundations of mathematics is about what logic is the foundation of mathematics. Well, some argue over whether logic is the foundation at all, favoring, say, set-theoretic foundations.  But, most of these foundation wars are about which sets of logical axioms are a foundation of mathematics.  There's Bertrand Russell's logicism.  David Hilbert had his axioms stripped of all meaning.  The intuitionists/constructivists are not strictly against logical axioms, but question whether certain axioms of the Russell/Whitehead "Principia" are valid.

Bertrand Russell and Alfred North Whitehead produced a massive three-volume (I do believe) tome on the foundation of mathematics.  They tried to show that one could derive all mathematics from symbolic logic itself.  They derived much of symbolic logic, then set theory, and then numbers in terms of all this symbolic logic. Mathematicians love to exhibit the Russell/Whitehead passage proving that 1+1=2,

But nowhere in Bertrand Russell and Alfred North Whitehead's Principia Mathematica, or in any of the foundational schools, is there any discussion of abstraction, idealization, much less Jacob Bronowski's vast generalization of how abstraction works in mathematical knowledge.  There is Susanne K. Langer's "Introduction to Symbolic Logic", however.  She does show how symbolic logic can be derived from language and abstraction; but, once again, she is not cognizant of Jacob Bronowski's "Origin of Knowledge and Imagination."

Still, mathematicians have created a massive amount of theory of logic; some would call it mathematical logic.  This great outpouring of logic actually goes back to Boole in the early 1800s.  But, the 1900s were no less exciting, as I hope to show.

The path I'm going to show actually goes back, I would say, to Grassmann and Dedekind.  John Stillwell has been championing this throughout all his books.  I would include the name of Poincare in this as well.  As E.T. Bell mentions in his "Development of Mathematics", Poincare suggested that mathematical induction can provide a foundation of all of mathematics.  E.T. Bell's refutation wasn't much. Grassmann and Dedekind showed that numbers, and some of what were previously considered axioms of numbers and arithmetic (commutativity and associativity), can be derived by mathematical induction. This suggestion was passed over by mathematicians for a long time.  As far as I can tell, John Stillwell is the real champion here.

John Stillwell has tried to write a series of books that relates ancient mathematics to modern mathematics.  In so doing, he tries to show what mathematics of the past is worth learning. I disagree a little bit. Sir Thomas Heath made editions of Euclid's Elements where he puts as notes the findings of Proclus's commentary (5th century A.D.; the last mathematical light till the dark ages were over).  The connections, and hence importance, of the various theorems throughout Euclid's Elements to the Pythagoreans and Thales are just so great, I cannot agree that Mr. Stillwell's sifting should forever shut all the rest of Greek mathematics in the past.  For instance, Heath shows there are connections between the great Thales theorem, that every angle inscribed in a semi-circle is a right angle, and the fact that every triangle's angles sum to 180 degrees.  He further shows that book six, theorem 11 I do believe, is the geometric interpretation of finding the square root of a number.  That theorem uses Thales' theorem that every angle in a semi-circle is a right angle. This further shows that the great Theaetetus's book 11, about making a vast generalization of roots in terms of infinite series (the three kinds: arithmetic, geometric, harmonic), is a valid one. Eudoxus goes on in the books about the Pythagorean solids to use this mathematics to calculate the angles of the Pythagorean solids mathematically and not empirically! This is pretty much the big accomplishment of Euclid's Elements. There are others, like Eudoxus's book five, which is about proportions and essentially creates a geometric version of the real number system. I mean to suggest here the great vision of Greek mathematics.  Like the alien arithmetic that is certainly antiquated today but was a tremendously creative effort back then, Greek mathematics is certainly antiquated compared to today's more general accounts; but, it works and is a great creative effort.

John Stillwell is also hoping to be able to teach advanced mathematics through understanding ancient Greek mathematics; I definitely disagree there. Modern mathematics solves ancient mathematics; modern mathematics sheds light on ancient mathematics, and not the other way around.

Getting back to John Stillwell . . . he shows in almost all his books this mathematical induction derivation of numbers. In the 1800s, mathematicians found they could explain numbers like never before in terms of set theory. This should be familiar to you.  A set of two oranges and a set of two apples are concrete representations of the number two. But, the logician/mathematician Peano showed that there are number systems defined by the usual axioms of commutativity, associativity, and the distributive law, and that we can distinguish the natural numbers from other number systems by another set of axioms called the Peano postulates. Off the top of my head: every number has a successor, and each successor leads to a new successor. John Stillwell (Grassmann/Dedekind before him) shows that this successor function defines mathematical induction.

- If 0 is in S, and if n+1 is in S whenever n is in S, then N (the symbol for the natural numbers) is a subset of S.  There's a mathematical symbol for "is in" (set membership, ∈) that I can't easily duplicate here!  Also, mathematicians will point out that one must distinguish between "is" and "is a"; see Susanne K. Langer's "Introduction to Symbolic Logic" for a great account of this and much else besides! . . . two more forms of mathematical induction . . . If 0 is in S, and if n+1 is in S whenever 0, 1, . . ., n are all in S, then N is a subset of S; and the third rule: if T is a nonempty subset of N, then T has a least member.

John Stillwell proves that these three are equivalent.  Once again, proving things opens up more mathematics.  Proving one way leads to certain mathematics; proving another way leads to other mathematics.  In algebra, Galois's proof that a closed-form solution (an analog of the quadratic formula) for the general fifth-degree equation is impossible leads to group and field theory.  Abel's proof leads to elliptic functions and topology (a revelation of the 20th century). John Stillwell later shows (not important here) that these three rules have importance at different times.  Let's get to defining addition by mathematical induction.

m+0 = m for all m 'in' N . . . this is the 0 base step
m+(k+1) = (m+k)+1 for all m, k 'in' N . . . this is of course the n+1 induction step.  So we went from just m to m+k.  Addition is defined.

Associativity proof, base step n=0: l+(m+0) = l+m = (l+m)+0, since m+0 = m by the definition of addition above . . . if you zero the third term, both sides come out the same.
The induction step for the associativity proof, n=k+1: l+(m+(k+1)) = l+((m+k)+1) by the definition of + above,
= (l+(m+k))+1 = ((l+m)+k)+1 by the induction hypothesis, = (l+m)+(k+1) . . . notice k+1, which we started by substituting for n, shows up again!

Commutativity: if you zero out n (base step), you get m = m! For the induction step, m+(k+1) = (m+k)+1, by the definition of addition; then = (k+m)+1 by the assumption (the induction hypothesis is substituted).  Then = (k+1)+m by associativity, together with the little lemma that m+1 = 1+m.
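The inductive definitions above can be spot-checked with a little code.  Here's a minimal sketch, assuming we model a natural number as nested tuples (zero as the empty tuple, successor as wrapping in a tuple); the names succ, add, and so on are just my own for illustration.

```python
# Sketch of Peano arithmetic: naturals as nested tuples, zero = (), succ(n) = (n,).
ZERO = ()

def succ(n):
    return (n,)

def add(m, n):
    if n == ZERO:                # base step: m + 0 = m
        return m
    return succ(add(m, n[0]))    # induction step: m + (k+1) = (m + k) + 1

def from_int(i):
    return ZERO if i == 0 else succ(from_int(i - 1))

def to_int(n):
    return 0 if n == ZERO else 1 + to_int(n[0])

# spot-check that this addition agrees with ordinary addition and commutes
for a in range(4):
    for b in range(4):
        assert to_int(add(from_int(a), from_int(b))) == a + b
        assert add(from_int(a), from_int(b)) == add(from_int(b), from_int(a))
```

Of course a finite check is no substitute for the induction proofs; the point is only to see the base step and induction step as the two branches of the definition.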

- And so numbers and their basic properties can be proved by mathematical induction. This is all fine, but why take this over, say, set theory? Mathematicians through the twentieth century developed mathematical logic and generalizations of logic in terms of mathematical induction. But, this is getting ahead of our story.

While George Boole's algebraic logic led to Bertrand Russell and Alfred North Whitehead's "Principia", and many other symbolic logic developments (like Post's three- and even more-valued logics), Kurt Gödel used a diagonal argument (mentioned in my previous post about Celestial Mechanics) to prove some incompleteness and consistency theorems.  Gödel proved that a finite set of consistent axioms cannot prove an infinity of truths (the goal of Bertrand Russell and Alfred North Whitehead). Gödel's proofs showed the relation between logic and infinity is harder than mathematicians had hoped. One could instantly remark that 'of course' the relation between logical proof and infinity has not been explored enough.  But, I'd like to note that mathematicians would point out that the necessity for proof is to deal with cases where one can't check every case, whether the set of cases is just very large or infinite.  But, proof does more than this.

Logical proof questions assumptions, and in the process reveals hidden structure. Right off the top of my head: in Euclid's Elements, there's a series of about four theorems.  The Greeks there were proving that a triangle's area is 1/2 times base times height.  But, in proving it, they found some more phenomena.  They found that areas stay the same no matter how inclined an angle of the triangle, or even of a parallelogram.  The easy proof of the area of a triangle is to divide a square or a parallelogram and say, see!  The triangle is half the square, and the square's area is base times height. The Greeks found four cases of triangles under the same base and height, and parallelograms under the same base and height, or equivalent bases. I'd like to point out some new thoughts I've had about my "Nature and Origins of Mathematical Knowledge" article.

Susanne K. Langer points out in her "Introduction to Symbolic Logic" how abstraction works: in terms of structures.  A structure is a relation between elements (two or more, but usually just two).  A structure can have its relation held the same while the elements change.  Or, one could change the relations between the elements.  The first case leads to abstraction.  She calls the relations that make up these structures/abstractions/concepts constituent relations.  People will say that to prove anything, one needs some concepts formed first; that there's some kind of creative process before logical proof.  Maybe, and here we have it - constituent relations. But then, what about relations between these propositions?  She suggests these are your logical relations. This point about constituent and logical relations could generalize Jacob Bronowski's ideas about inferred units.  As Jacob Bronowski himself says, inferred units could be had by either specialization (his word for idealization) or generalization (which leads to abstraction). Why not have logical proof lead to new ideas? An idea is expressed as a proposition.  Logical proof leads to ways of transforming from one way of viewing things to another (kind of like being able to see that our current perspective leads to a flat earth, but this leads to problems like the retrograde motion of planets), or reexpressing one idea in terms of another, and that's an act of creativity.  The Greek discovery of irrational numbers is of course the famous example of discovery by logical proof.

Another example of creativity by logical proof, and of how new processes of mathematical discovery reveal themselves as mathematics evolves, is the non-Euclidean geometries of the early 1800s due to Gauss, Bolyai, and Lobachevsky. This type of creativity is about changing one axiom, and sometimes more, of the Euclidean axioms of geometry.  In particular, they changed the parallel axiom.

So, even though the issue of infinity and logic may not be thoroughly explored, logical proof has its place and relation with mathematics.  A relation that has been revealed more and more through the history of mathematics. But, let's explore infinity and its effect on mathematics as a whole.

Infinity really goes back to the beginning of numbers.  Whoever first realized that numbers can seemingly be extended forever must have freaked out and believed in God thereafter! As I've pointed out at the beginning of my blog, the Egyptians and Babylonians came up with some creative mathematics to deal with the infinities that numbers lead to. The Greeks started to try to deal with geometric shapes that were smooth instead of easy straight lines, and then Calculus first came together in the heads of Isaac Newton and Leibniz.  With the creation of the Calculus, algebraic, trigonometric, and logarithmic functions were unified and handled much more easily than they were before. Galileo, as noted in my previous blog entry about Celestial Mechanics, found that one could match up, say, all the even numbers with all the natural numbers (even and odd combined).  That's where matters stood for centuries, till Georg Cantor in the 1800s.
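Galileo's matching can be sketched in a couple of lines of code; the dictionary here is just my own illustration of the pairing on a finite fragment.

```python
# Pair each natural n with the even number 2n: every natural gets exactly
# one even number, and every even number (in range) is hit exactly once.
pairing = {n: 2 * n for n in range(10)}

assert sorted(pairing.values()) == list(range(0, 20, 2))
assert len(set(pairing.values())) == len(pairing)   # no two naturals collide
```

The surprise, of course, is that this pairing never runs out on either side, even though the evens look like "half" of the naturals.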

I kind of need to get back to the set theory definition of numbers! The set theory definition of numbers uses the properties (inferred units) of cardinality and ordinality.  Cardinal numbers measure the size of a set.  At which point you say: why bother with the ordinal property?  Well, ordinals take on more significance with transfinite numbers.  Some have suggested that the ancients were confused between the cardinal and ordinal properties of numbers.  They point out linguistics of numbers suggesting that numbers were used sometimes as 'same size', and sometimes as place value, like the sixth place.  This is the meaning of ordinal numbers. But, Georg Cantor got into infinite sets by examining Fourier series, which represent functions by infinite series. Georg Cantor wanted to cut out all the sets of points of an infinite series that didn't need to be there; he counted them by ordinals, and found that he was counting past infinity.

Georg Cantor then examined these infinite sets apart from the analysis. I don't know if someone pointed out the Galileo one-to-one correspondence to him, but he started with that.  He then found a zig-zag argument to make a one-to-one correspondence between the natural numbers and the rational numbers. He found that it can be done! Hence!  Even the fractions are the same size infinity as the natural numbers.  How about the real numbers (all the numbers: natural, rational, and irrational combined)?  One way of seeing this is a table,
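The zig-zag listing of the fractions can be sketched as a little generator.  Walking the diagonals (numerator plus denominator constant along each one) and skipping duplicates assigns every positive rational a natural-number position; this is my own sketch of the standard argument, not Cantor's notation.

```python
from fractions import Fraction
from itertools import islice

def rationals():
    """Enumerate the positive rationals diagonal by diagonal,
    skipping duplicates like 2/4 (already seen as 1/2)."""
    seen = set()
    s = 2  # numerator + denominator along the current diagonal
    while True:
        for p in range(1, s):
            r = Fraction(p, s - p)
            if r not in seen:
                seen.add(r)
                yield r
        s += 1

# the first few rationals, each now paired with a position 0, 1, 2, ...
first_six = list(islice(rationals(), 6))
assert first_six == [Fraction(1), Fraction(1, 2), Fraction(2),
                     Fraction(1, 3), Fraction(3), Fraction(1, 4)]
```

Every fraction eventually appears on some diagonal, so every fraction gets a natural-number label; that's the one-to-one correspondence.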

Imagine something like a multiplication table, only we're not doing multiplication. On the left side, going down, are subsets subscripted s1, s2, . . . on down.  On the top row of the table are the natural numbers off to infinity.  The body of the table codes up each subset as 1s and 0s, indicating whether each natural number is included in that subset.  Now, imagine an S row at the bottom.  If you fill this S row with the values from the diagonal of the table, each switched from 1 to 0 or from 0 to 1, then S is a subset that differs from every subset in the list.  So the subsets of the natural numbers can't all be listed; and, since the real numbers correspond to these subsets, the reals are a higher infinity.
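The table's diagonal trick can be sketched on a finite fragment.  Each row below stands for one subset's 0/1 membership codes; the particular rows are made up for illustration.

```python
# Each row codes one subset of {0, 1, 2, 3}: a 1 in column j means j is in the subset.
rows = [
    [1, 0, 1, 0],
    [0, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
]

# The S row: flip each diagonal entry.  S then disagrees with row i
# in position i, so it cannot equal any row in the list.
S = [1 - rows[i][i] for i in range(len(rows))]
for i, row in enumerate(rows):
    assert S[i] != row[i]
```

However long you make the list, the flipped diagonal always escapes it; that's the whole argument, scaled down.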

Georg Cantor symbolized his transfinite numbers by the letter aleph; it's a Hebrew letter. He went on to show that aleph plus aleph, or even aleph times aleph, equals the same size infinity!  But!  Aleph to the power (exponent) aleph creates a new size of infinity!

There's history and mathematics of transcendental numbers that can be described but is not essential here - such as e, the exponential number, and pi, the ratio of a circle's circumference to its diameter.  These were proved to be transcendental in the 1800s.  Pi of course is a problem that goes back to Archimedes. He certainly found a way of calculating it (a mathematical story I describe in my entry about the Mechanical Universe episode about circles; I'm hoping to redo some of that soon enough!); but, a major point is that this number is not a solution of any algebraic equation - hence the definition of transcendental numbers. More of the amazing network of ideas Georg Cantor was working in when he came up with his transfinite numbers is how Cantor introduced the height of an algebraic number. An algebraic number is a root of a polynomial with integer coefficients (I can't exactly represent even this much with the typing tool I have). Just imagine a general polynomial in all its terms.  Now, Cantor combined a polynomial's degree and the absolute values of its coefficients into a 'height'; this puts the polynomials into classes. Each height yields only a finite set of numbers; hence they can be listed, height by height, against the natural numbers; hence, they equal the natural numbers' infinity and not the continuum of the real numbers.  The transcendental numbers fit more densely on the continuum than the algebraic numbers!
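The height idea can be sketched in code.  Assuming one common version of the height (degree plus the sum of the absolute values of the coefficients; the exact formula varies by presentation), each height picks out only finitely many integer polynomials:

```python
from itertools import product

def polynomials_of_height(N):
    """Integer polynomials (coefficient tuples, constant term first,
    nonzero leading coefficient) with degree + sum of |coefficients| = N."""
    polys = []
    for degree in range(1, N):
        budget = N - degree
        for coeffs in product(range(-budget, budget + 1), repeat=degree + 1):
            if coeffs[-1] != 0 and sum(abs(c) for c in coeffs) == budget:
                polys.append(coeffs)
    return polys

# each height yields finitely many polynomials, each with finitely many
# roots, so the algebraic numbers can be listed height by height
assert len(polynomials_of_height(2)) == 2   # just x and -x
assert len(polynomials_of_height(3)) == 8
```

Listing the roots height by height is what puts the algebraic numbers in one-to-one correspondence with the naturals.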

The question becomes, "what's the relation between the ordinal and cardinal transfinite numbers?" A growth relation between functions can be defined: G grows faster than F if G(x)/F(x) goes to infinity. A discovery of du Bois-Reymond.  Further, Cantor showed that the first infinite ordinal, w, is the least upper bound of the finite ordinals.  Each ordinal has a successor: w+1, w+2 . . .  Beyond them is w·2, and beyond w·2 is w^2 (w to the exponent 2), and beyond w^2 is of course w^w.  The ordering of infinity gets mesmerizing! Georg Cantor defined w by growth functions as defined above!  He also saw that the growth functions are defined by the diagonal argument, which defines the first w.

Infinite ordinals can be well ordered. A simple intuition: take any two elements and compare; one is bigger than the other. Here we see that cardinals can be measured by infinite ordinals.  The question is, "how can we pick an element out of each of these infinite sets?" This leads to the infamous "Axiom of Choice." Using the Axiom of Choice or not each has its advantages and disadvantages.  There have been some mathematicians recently suggesting they've solved this problem. Hence why I'm hoping to write this article up here. Axiom of Choice: for any collection X of non-empty sets S, there is a function choose(S) such that choose(S) is an element of S for each set S in X. Zermelo in 1904 proved that the axiom of choice implies that every set can be well ordered.

We know that transfinite cardinals can be measured by infinite ordinals, but do they equal the continuum?  This is the continuum hypothesis. The continuum hypothesis relates three statements: 1) can the current process for generating ordinal numbers (taking successors and least upper bounds) be continued uncountably far? 2) If so, can we "exhibit" an uncountable set of positive integer functions, ordered by increasing growth rates? 3) Do the corresponding growth rates eventually exceed that of any given positive integer function?  For number two, it is impossible to outright exhibit an uncountable set of positive integer functions; but, we can prove one exists from the continuum hypothesis. It creates a scale, or sequence, of functions.

If we generalize induction by least upper bounds, we make an ordinal generalization of the induction process.  And remember, du Bois-Reymond above found connections between the finite and the infinite through infinite ordinals (this is the importance of ordinals over cardinals). This least upper bound/ordinal generalization of the mathematical induction process defines w, w+1, w·2 and so on more rigorously.

Infinite ordinals stretch the mathematician's ability to write mathematical ideas down in symbols! There's a number, epsilon-zero (a Greek epsilon with a subscript 0), and all infinite ordinals less than it can be expressed in Cantor normal form (via the least upper bound mathematical induction process described a little bit above). The Cantor normal forms look like a three-term inequality of infinite ordinals, like w^w < a < w^w^w. Mathematicians hope that this epsilon-zero measures the complexity of arithmetic.

A generalization of the Cantor normal forms is Goodstein's theorem.  A number can be represented in hereditary base notation: as a sum of powers of a base, with the exponents themselves written in the same base notation. Goodstein noted that the ordinals of a Cantor normal form can be matched with these exponents. The process of reexpressing a number in hereditary base form, bumping the base up, subtracting one, and reexpressing again always terminates, because the ordinals, as described above, are well ordered!
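The Goodstein process is concrete enough to sketch in code; the function names here are my own, and the theorem (that every such sequence hits 0) is exactly what Peano arithmetic cannot prove but the ordinals can.

```python
def hereditary(n, base, new_base):
    """Write n in hereditary base-`base` notation (exponents rewritten
    the same way), replace every base by `new_base`, and evaluate."""
    if n == 0:
        return 0
    total, exp = 0, 0
    while n > 0:
        digit = n % base
        if digit:
            total += digit * new_base ** hereditary(exp, base, new_base)
        n //= base
        exp += 1
    return total

def goodstein(n, steps):
    """First `steps` terms of the Goodstein sequence starting at n:
    bump the base, subtract one, repeat."""
    seq, base = [n], 2
    for _ in range(steps - 1):
        if n == 0:
            break
        n = hereditary(n, base, base + 1) - 1
        base += 1
        seq.append(n)
    return seq

assert goodstein(3, 7) == [3, 3, 3, 2, 1, 0]   # starting at 3 dies quickly
assert goodstein(4, 3) == [4, 26, 41]          # starting at 4 takes astronomically long
```

The sequences look like they explode, yet the ordinal bookkeeping (each step strictly decreases a Cantor normal form) forces them all to zero eventually.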

Why are all these infinite (transfinite) numbers important?  Because in 1931, Kurt Gödel published some theorems about logical systems.  He showed, roughly, that if a finite set of axioms is consistent, it cannot prove an infinity of truths.  To prove those, you'd have to add another axiom, and another.  Some mathematicians (and philosophers of mathematics like me) are fine with that.  Others are not.  And, it's a huge fight for the rights to the mathematical heavens! I just point to my Jacob Bronowskian ideas and say none of this is the real point of the nature and origin of mathematics, period! Getting back to Kurt Gödel . . . Gödel proved his theorems by means of 'Gödel numbering' and some previous theorems that are even harder than Gödel's theorem. He 'Gödel numbers' everything: 1, 2, 3, the logical symbols for and/or/inference.  Then, through the unique factorization of these numbers, he constructs a statement that in effect says "this statement is unprovable." This seems pretty artificial, and maybe it is. I maybe should say that Gödel's theorem is a more elaborate version of the Cretan liar paradox. If you try to decide whether "this statement is false" is true, you get a contradiction whether you call it true or false.  If it's true, then the statement says it's false! If you say the statement is false, then you're saying that the statement that it is false, is false! One more interesting tidbit that I've found in my biblical explorations (for those who have followed me, or are willing to follow some more other things if this is the first article you read from me) is in a New Testament epistle, Titus I do believe. Yes, 1:12, "One of themselves, even a prophet of their own, said, The Cretians are alway liars, evil beasts, slow bellies."
Ope, in looking for this Titus (as in Titus, son of Roman emperor Vespasian, the Roman Emperor that 'Flavius' Josephus says is the coming Messianic savior of the Jews at the end of his Jewish War book; see the first post of my blog, "The Gospel of Truth"), I found another, noted by none other than Saint Jerome (5th century A.D.), in Psalms! "I said in my alarm, 'Every man is a liar!'" (Psalms 116:11). "Is David telling the truth or is he lying? If it is true that every man is a liar, and David's statement, "Every man is a liar," is true, then David also is lying; he, too, is a man. But if he, too, is lying, his statement: "Every man is a liar," consequently is not true. Whatever way you turn the proposition, the conclusion is a contradiction. Since David himself is a man, it follows that he also is lying; but if he is lying because every man is a liar, his lying is of a different sort." - Saint Jerome (reference: "St. Jerome, Homily on Psalm 115 (116B), translated by Sr. Marie Liguori Ewald, IHM, in The Homilies of Saint Jerome, Volume I (1-59 On the Psalms), The Fathers of the Church 48 (Washington, D.C.: The Catholic University of America Press, 1964), 294").  Well, Gödel's proof is a bit like the Cretan liar paradox; we won't be doing that here. There's a kind of simpler proof; but, as usual, I will only describe it here.
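A toy version of the Gödel numbering itself can be sketched in code.  The six-symbol alphabet here is made up for illustration; the point is only that unique prime factorization lets you encode a formula as one number and recover it again.

```python
# Symbol i of the formula contributes the i-th prime raised to that
# symbol's code; unique factorization makes the encoding reversible.
PRIMES = [2, 3, 5, 7, 11, 13]
CODES = {'0': 1, 'S': 2, '=': 3, '+': 4, '(': 5, ')': 6}
SYMBOLS = {v: k for k, v in CODES.items()}

def godel_number(formula):
    n = 1
    for i, symbol in enumerate(formula):
        n *= PRIMES[i] ** CODES[symbol]
    return n

def decode(n):
    # peel off each prime's exponent to recover the symbols in order
    out = []
    for p in PRIMES:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        if e == 0:
            break
        out.append(SYMBOLS[e])
    return ''.join(out)

assert decode(godel_number('S0=S0')) == 'S0=S0'   # "1 = 1" round-trips
```

Once formulas are numbers, statements *about* formulas become statements about numbers, which is what lets Gödel's sentence talk about itself.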

Emil Post came up with a simpler proof; he distinguished formal and normal systems.  A formal system is a list of axioms and rules of inference (which bugs the intuitionist philosophers of the mathematical foundations wars; see my points about constituent and logical relations above!). A normal system is simply a subset of the formal system in question!  Emil Post applied the diagonal argument to this conception. Getting back to epsilon-zero measuring complexity: in terms of Gödel's second theorem, a finite set of axioms cannot prove its own consistency (ouch!).  But, a major point here is that Gerhard Gentzen showed, in 1936, that the complexity of a formal system like arithmetic is measured by this epsilon-zero. As I go through my notes: the symbol is not a backwards capital E; the epsilon as used by mathematicians is a more rounded E.

Friday, May 16, 2014

thought for the day/Celestial Mechanics and Mathematics

This is the famous 'Mandelbrot set'.  It's a fractal that combines many 'Julia sets' (fractals) into one; it is of course infinite, and there are many videos of zooming in and out; any place leads to an infinity of other fractal forms, including copies of the whole Mandelbrot set that you start out with in the top picture.
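The membership test behind all those zoom videos is short enough to sketch: a point c counts as 'in' the set if iterating z → z² + c from zero stays within radius 2 (the iteration cap here is my own choice; real renderers color pixels by how fast the orbit escapes).

```python
def in_mandelbrot(c, max_iter=100):
    """Iterate z -> z*z + c from 0; treat c as a member if the orbit
    stays within radius 2 for max_iter steps (a standard escape test)."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

assert in_mandelbrot(0)            # the center never escapes
assert in_mandelbrot(-1)           # settles into a period-2 cycle
assert not in_mandelbrot(1)        # 0, 1, 2, 5, ... escapes
assert not in_mandelbrot(0.5 + 0.5j)
```

Each pixel of a Mandelbrot image is just this test run at one complex number, which is why zooming anywhere keeps producing new structure.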

"In our haste to press on with new applications and insights, let us not forget from where we came . . . we should reflect on the fact that ideas and properties that appear to have purely physical bases, such as stability and chaos  itself, demand precise  mathematical definitions if they are to be usefully applied." - Florin Diacu and Philip Holmes

Mathematicians have singled out the Navier-Stokes equation as one of the major unsolved mathematical problems of today. Just a few months ago, a mathematician announced a claimed general proof; a little later, Terence Tao pointed out that he had made some related progress of his own. I can't claim to know what's true; it takes mathematicians a year or two to confirm the biggest new mathematical breakthroughs.

The Navier-Stokes equation is about the dynamics of incompressible fluids. Solve it, and you perhaps open up a lot of materials science: not just chemistry, but the properties of materials composed of millions of atoms. The mathematics that has been thrown at it comes from celestial mechanics, and that mathematics goes back at least to the Greek Aristarchus. I'll be giving my notes from reading Diacu and Holmes' "Celestial Encounters."
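For reference, here is a standard textbook form of the incompressible Navier-Stokes equations, with u the velocity field, p the pressure, ρ the density, and ν the viscosity:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  \;=\; -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0 .
```

The Clay Millennium problem asks whether smooth solutions of these equations always exist in three dimensions; that is the existence/smoothness question discussed further below.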

Isaac Newton's mechanics of course goes back to the motions of planetary bodies. The mathematics and science there goes back to Kepler, Galileo, and Copernicus. Copernicus suggested a sun-centered universe, but he still had perfect circles and Ptolemaic epicycles (which really go back to Eudoxus, the 4th-century B.C. Greek mathematician whom Plato partly admired and helped out, and of whom he was a little jealous here and there as well!). The problem Copernicus had was not knowing that the planets speed up and slow down at different points of their orbits (actually, if you read Koestler's "Sleepwalkers," I think he mentions that Copernicus did notice something, but didn't know how seriously to take it . . . something very easy to do in his time). Tycho Brahe of course built enormous sighting instruments to track the movements and take the most accurate data to date. Kepler interpreted that data using the conics mathematics of Apollonius. This was all done before Galileo found his law of falling bodies, that bodies of different weight fall at the same rate in a vacuum. So Kepler's mathematical achievement here is tremendous for the time! And of course, Isaac Newton came up with his inverse-square law and actually derived Kepler's laws from it; this is called the Kepler problem.
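One of Kepler's laws can even be checked numerically: his third law says the square of the orbital period is proportional to the cube of the orbit's size, which follows from Newton's inverse-square law. Here's a rough sketch (my own toy integrator, in units where GM = 1) that times two circular orbits and compares their periods:

```python
import math

# Velocity-Verlet integration of a body in an inverse-square field (GM = 1).
# For a circular orbit of radius a, Kepler's third law predicts the period
# T = 2*pi*a**1.5, i.e. T**2 proportional to a**3.
def orbital_period(a, dt=1e-3):
    x, y = a, 0.0
    vx, vy = 0.0, math.sqrt(1.0 / a)    # circular-orbit speed for GM = 1

    def accel(px, py):
        r3 = (px * px + py * py) ** 1.5
        return -px / r3, -py / r3

    ax, ay = accel(x, y)
    t, prev_y = 0.0, y
    while True:
        x += vx * dt + 0.5 * ax * dt * dt
        y += vy * dt + 0.5 * ay * dt * dt
        nax, nay = accel(x, y)
        vx += 0.5 * (ax + nax) * dt
        vy += 0.5 * (ay + nay) * dt
        ax, ay = nax, nay
        t += dt
        if prev_y < 0.0 <= y:           # upward crossing of y = 0: one revolution
            return t
        prev_y = y

T1, T2 = orbital_period(1.0), orbital_period(4.0)
print(T2 / T1)   # Kepler's third law predicts (4/1)**1.5 = 8
```

Quadrupling the orbit's radius makes the period roughly eight times longer, exactly the 3/2-power relation Kepler extracted from Brahe's data.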

Here's a Mechanical Universe episode on the Kepler problem. I couldn't make the embedding work, so . . .

Isaac Newton also started the idea of deriving what an astronomical body is made of just by calculating its motions, or the motions of moons around it. After Isaac Newton, a hundred years of differential equations and calculus of variations, from people like Leonhard Euler, the Bernoulli family, and of course Laplace and Lagrange, led to Laplace's five-volume, nearly-thousand-page 'Celestial Mechanics.' It was outdated shortly after by Hamilton (who also invented quaternions), and then Poincaré (who systematized topology and came up with automorphic functions, which combine elliptic functions with group theory and other mathematics, in order to attack celestial mechanics).

Mathematical physicists after Newton's time came up with conservation laws to calculate problems involving more than two bodies. Newton also apparently looked for global solutions; this was a vectorial flow field. I've done the Kepler problem from a calculus book in a way different from what's done in the Mechanical Universe video above. When checking out Newton's Principia, I noted that the Kepler problem is solved in Book One, and then he does a lot of fluid dynamics and a not exactly rigorous treatment of determining the composition of astronomical bodies from their motions. I knew this, so I just read the first and last books and called it good; now I see I should probably check out some more. I've also learned that some of the other theorems proved in Book One have further significance. First the theorem: Lemma 28 of Book One has it that "No oval figure exists whose area, cut off by straight lines at will, can in general be found by means of equations finite in the number of the terms and dimensions." This is one way of showing a quantity is transcendental; the numbers pi and e were proven transcendental hundreds of years later, in the 1800s, and here the idea comes out of celestial mechanics considerations. This just gives some more appreciation for Newton's 'Principia.' Some of the other major items of the Principia are the calculation of the precession of the equinoxes and the center-point theorems. Newton found that to apply his mathematics to the motions of the planets, he needed to prove that all he had to take into account was the center point of a celestial body. He did the sphere, but then he also solved the oval-shaped figure as well! There's also the classification of conics, and the use of a curve other than the conics for clockmaking; it's kind of parabolic, but not quite a parabola.

Isaac Newton came up with much more mathematics. A twentieth-century scholar (D.T. Whiteside) wrote it all up in eight large, thousand-page volumes. I've only seen the first volume; it's this huge book, like two feet long! His mathematics, and Leibniz's, on the calculus opened up a floodgate of mathematics for the mathematicians of the 1700s. The main mathematicians have been mentioned already: the Bernoulli family (there were eight Bernoullis in this mathematical family), who learned from Leibniz and taught Leonhard Euler. I posted a pretty good video speech about Leonhard Euler earlier in this blog. Then there was Lagrange, who did lots of good number theory (so did Leonhard Euler) after Fermat, and then Laplace. These and some others expanded the amount of what's called analysis. Analysis is the mathematician's name for anything calculus: differential equations, calculus of variations, complex analysis, and later, mostly in the 1800s, real analysis. Mathematicians from the birth of the calculus hoped to find closed-form solutions to differential equations just as they had for algebraic equations (the famous quadratic formula, which goes back to the Babylonians of almost 2000 B.C.). There has been some headway in this direction in the work of Picard in the late 1800s and of Sophus Lie; part of the inspiration for Sophus Lie to make his Lie algebras was to do exactly this. There's been work done even beyond Picard's first success. I'll get back to what's pointed out in "Celestial Encounters" now.

I think it was Poincaré who came up with 'phase space.' I like to think of phase space as a kind of polar form. It's kind of like a Cartesian plane, but it's a global picture of the energy/position possibilities. Poincaré, I'm thinking, showed that he could solve differential equations qualitatively from a phase-space diagram of the Newtonian vector flows.
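The pendulum is the classic phase-space picture: plot angle against angular velocity, and each swinging motion traces a closed curve labeled by its energy. Here's a small sketch (my own RK4 stepper, nothing canonical) that follows one pendulum trajectory and checks that the energy, which is what picks out the phase-space curve, stays constant:

```python
import math

# One trajectory of the pendulum theta'' = -sin(theta), traced through phase
# space (theta, omega).  The energy E = omega**2/2 - cos(theta) is constant
# along each trajectory, which is why closed curves = periodic orbits.
def rk4_step(theta, omega, dt):
    def f(th, om):
        return om, -math.sin(th)    # (theta', omega')
    k1 = f(theta, omega)
    k2 = f(theta + 0.5 * dt * k1[0], omega + 0.5 * dt * k1[1])
    k3 = f(theta + 0.5 * dt * k2[0], omega + 0.5 * dt * k2[1])
    k4 = f(theta + dt * k3[0], omega + dt * k3[1])
    theta += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
    omega += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
    return theta, omega

def energy(theta, omega):
    return 0.5 * omega * omega - math.cos(theta)

theta, omega = 1.0, 0.0             # released from rest at 1 radian
E0 = energy(theta, omega)
for _ in range(10000):
    theta, omega = rk4_step(theta, omega, 0.01)
print(abs(energy(theta, omega) - E0))
```

The energy drift stays tiny over the whole run, and the angle never exceeds its starting amplitude: the orbit is pinned to one closed curve in phase space, exactly the global picture described above.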

Now we get into some more purely mathematical considerations: existence and uniqueness proofs. Mathematicians tried to prove these for dynamical systems, and from a logical-proof perspective they made distinctions between existence and uniqueness of solutions. They came up with examples of dynamical systems that violate existence and others that violate uniqueness; if two solution curves pass through the same point, uniqueness fails. Proving existence and uniqueness (and smoothness) of solutions for one particular dynamical system is what the Navier-Stokes problem asks.
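A textbook example of uniqueness failing is the equation y' = √y with y(0) = 0: both y(t) = 0 and y(t) = t²/4 pass through the same starting point. This toy sketch (the helper name is mine) checks both candidates against the equation:

```python
import math

# y' = sqrt(y) with y(0) = 0 has (at least) two solutions through the same
# initial point: y1(t) = 0 and y2(t) = t**2 / 4.  Two solution curves meeting
# at a point is exactly the failure of uniqueness described above.
def residual(y, dy, t):
    """How far y'(t) - sqrt(y(t)) is from zero; zero means y solves the ODE."""
    return abs(dy(t) - math.sqrt(y(t)))

y1, dy1 = (lambda t: 0.0), (lambda t: 0.0)
y2, dy2 = (lambda t: t * t / 4.0), (lambda t: t / 2.0)

for t in [0.0, 0.5, 1.0, 2.0]:
    print(residual(y1, dy1, t), residual(y2, dy2, t))   # both residuals vanish
```

Both residuals are zero at every sample time, yet the two solutions disagree everywhere after t = 0; the square root fails the Lipschitz condition at y = 0, which is the hypothesis the standard uniqueness theorem needs.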

In studying the existence/uniqueness proofs, subtle 'inferred units' emerge (see my origin-of-mathematical-knowledge article, third to first article of this blog). There's global and local existence, and continuity. Continuity is generally defined 'locally'; the corresponding global notion is 'stability.'

Differentiable manifolds are a generalization of the manifolds that come from complex analysis. Manifolds, and topology, come out of complex analysis in Jacob Bronowski 'inferred units' style; it's one of the biggest such mathematical events of recent mathematical history. A differentiable manifold is roughly a surface that can be locally approximated by a plane. Wildly different surfaces can then be considered equivalent locally, a torus and a sphere for instance. Differentiable manifolds can reduce the number of variables, or dimensions, of a differential equation, hence helping to solve it; this process is related to conservation laws.

What Poincaré was trying to solve, and what led to much of the above, was the N-body problem. The N-body problem is the most general statement of the three-body problem, which Newton of course could not solve; it asks for a global solution for three and then N bodies. Differential calculus is more or less local; integrals are more or less global. Poincaré found integral invariants in trying to solve the N-body problem, and his exploration of the three- and larger-body problems led to what's called 'chaos theory.'

James Gleick remarks (I think it's more of a quote) in his "Chaos" book that quantum mechanics eats at Newtonian mechanics on the smallest scales, and Einstein's theories of relativity modify Newton's mechanics at the largest scales. Chaos theory modifies Newtonian mechanics at the scales mathematical scientists thought were well established: the scale of the universe humans experience every day. Chaos happens in between stable 'Newtonian' periods of a dynamical system. In phase space, a stable or equilibrium point is represented by a single point; set the system in motion, and chaos potentially can happen, and then it dies down to an equilibrium point.
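The everyday-scale surprise is sensitive dependence on initial conditions, and the standard one-line demonstration is the logistic map x → 4x(1 - x) (this is my illustrative sketch, not anything from Gleick's book): start two orbits a hair's breadth apart and watch them separate.

```python
# Sensitive dependence on initial conditions in the logistic map x -> 4x(1-x):
# two starting points differing by 1e-10 end up macroscopically far apart.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-10
max_sep = 0.0
for n in range(60):
    a, b = logistic(a), logistic(b)
    max_sep = max(max_sep, abs(a - b))
print(max_sep)
```

Within a few dozen iterations the separation has grown from one part in ten billion to order one; this is why long-range prediction fails for chaotic systems even though the rule itself is perfectly deterministic and Newtonian in spirit.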

There are all kinds of inferred units that Poincaré came up with in studying this chaos (the term 'strange attractor' came much later): homoclinic points, and intersecting curves. There's a famous theorem of Poincaré's, that there are two fixed points on a ribbon where part of it is going in one direction and the other part is going in another. This led to the famous (amongst mathematicians and chaos theorists) Smale horseshoe map. Smale imagined folding surfaces onto one another; this folding creates a horseshoe pattern. Starting with two nearby points, the two points chaotically drift from one another, though they often get close together again as well. Getting a little off topic for a moment: there's the prime distribution problem. Leonhard Euler came up with an equality between an infinite series and an infinite product that somehow seems to encode the distribution of primes; there's much more to say about this. There are twin primes, and I've often felt that the distribution of primes is fractal, similar to what you see in this horseshoe map (which is fractal). Getting back to celestial mechanics and all . . . Smale made an equivalent expression of his horseshoe map in terms of Cantor dust.
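Euler's series-equals-product identity can be checked directly. At s = 2 the sum is the famous π²/6, and truncating both sides gives nearly the same number (the sieve helper and cutoffs below are my own choices for the sketch):

```python
import math

# Euler's identity between an infinite sum and an infinite product over primes:
#   sum over n of 1/n**s  =  product over primes p of 1/(1 - p**-s)
# Here s = 2, where the sum famously equals pi**2 / 6.
def primes_up_to(n):
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            for j in range(i * i, n + 1, i):
                sieve[j] = False
    return [i for i, is_p in enumerate(sieve) if is_p]

s = 2
partial_sum = sum(1.0 / n ** s for n in range(1, 100001))
partial_product = 1.0
for p in primes_up_to(1000):
    partial_product *= 1.0 / (1.0 - p ** -s)

print(partial_sum, partial_product, math.pi ** 2 / 6)
```

That a sum over all integers equals a product over only the primes is exactly why this identity 'sees' the distribution of primes: it restates unique factorization analytically, and it is the starting point of the Riemann zeta function.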

Cantor dust goes back to Georg Cantor. Cantor came up with transfinite numbers, or actually infinite sets of different sizes. This goes back to Galileo, actually. Galileo, in his "Two New Sciences," found that one could map the even numbers one-to-one onto all the natural numbers, and likewise the odd numbers. Cantor found ways to do this for the rational numbers and for all the integers, negative ones included, by enumerating them along diagonals. Then, by his famous diagonal argument, Cantor found that this cannot be done for the real numbers: the set of real numbers is a larger infinity than the rationals or the naturals. And Cantor found this process can be continued to infinity, and then you can start over and go to infinity again!
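The diagonal enumeration of the rationals is concrete enough to run. Walk the grid of fractions p/q along its diagonals, skipping the non-reduced ones, and every positive rational appears exactly once (the function below is my own sketch of the standard zig-zag):

```python
from fractions import Fraction
from math import gcd

# Cantor-style enumeration of the positive rationals: walk the diagonals of
# the p/q grid (constant p + q) and skip non-reduced fractions, so every
# positive rational appears exactly once.  This is the one-to-one pairing
# with the natural numbers that makes the rationals countable.
def rationals(count):
    out = []
    d = 2                       # d = p + q indexes the diagonal
    while len(out) < count:
        for p in range(1, d):
            q = d - p
            if gcd(p, q) == 1:
                out.append(Fraction(p, q))
                if len(out) == count:
                    break
        d += 1
    return out

print(rationals(10))
# the values listed: 1, 1/2, 2, 1/3, 3, 1/4, 2/3, 3/2, 4, 1/5
```

The diagonal *argument* for the reals is the opposite move: given any such list of reals, build a number differing from the nth entry in its nth digit, so no list can be complete.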

Cantor dust has to do with removing the middle third of a line segment, then the middle third of each remaining piece, and so on to infinity. If you sum the lengths of the segments removed, they equal 1! Since the original segment has length 1, the Cantor dust has measure 0! Going off on another tangent for a moment, this leads to measure theory, Lebesgue integrals, and 'real analysis.' Mathematicians were eventually able to derive the fundamental theorem of the calculus from this new, higher analysis, just as Newton derived the Kepler laws described above, and just as Newton's laws are derived from Einstein's general relativity. Cantor dust is a fractal.
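The removed-length claim is a geometric series you can sum directly: at stage n you delete 2^(n-1) middle thirds, each of length 3^(-n). A few lines confirm the partial sums approach 1:

```python
# Total length removed in building the Cantor set: at stage n we delete
# 2**(n-1) middle thirds, each of length 3**-n, so the total removed is
#   sum_{n>=1} 2**(n-1) / 3**n  =  (1/3) * sum_{k>=0} (2/3)**k  =  1.
total = 0.0
for n in range(1, 60):
    total += 2 ** (n - 1) / 3 ** n
print(total)   # partial sums approach 1.0
```

So everything of positive length gets removed, yet uncountably many points (the dust) remain; that mismatch between "length zero" and "uncountably many points" is precisely what measure theory was built to handle.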

In this way, Smale related the solution of differential equations to fractals. Things get far more technical: Smale relates his horseshoe map to Poincaré's homoclinic points (intersecting curves). This is, roughly, the celestial-mechanics origin of chaos theory. There is of course more.

Andrei Kolmogorov studied the chaos theory of Hamiltonian systems. Hamiltonian mechanics is a generalization of Lagrange's dynamical-systems theory, which is itself a reformulation of Newtonian mechanics in terms of the calculus of variations. I'm certainly not explaining Hamilton's equations here; but Kolmogorov studied the chaos theory of Hamiltonian systems by cutting out their manifolds of quasi-periodic motion, which look like tori. There are integrable and non-integrable Hamiltonians: the motions of integrable Hamiltonians lie on the tori, while in non-integrable Hamiltonians the tori are broken up.

There are a variety of things I didn't fit into the above account of the celestial-mechanics origin of chaos theory and its relation to the Navier-Stokes problem. Vladimir Arnold solved Hilbert's 13th problem and is the A in KAM theory (alongside the work of Kolmogorov and Moser); Euler found special collinear solutions to the three-body problem; bifurcation theory owes much to Pontryagin; there are Lyapunov exponents; and Zhihong Xia settled Painlevé's conjecture about singularities. These singularities are about whether systems collide or not; they work out geometric diagrams that you would not believe for what sounds like something totally trivial.
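Lyapunov exponents, at least, are easy to illustrate: the exponent is the average logarithmic stretching rate along an orbit, and a positive value is the standard quantitative signature of chaos. Here's a sketch for the logistic map at r = 4, where the exact value is known to be ln 2 (the function and parameter choices are mine):

```python
import math

# Lyapunov exponent of the logistic map x -> r*x*(1-x): the orbit average of
# log|f'(x)| = log|r*(1 - 2x)|.  Positive exponent = nearby orbits separate
# exponentially fast = chaos.  At r = 4 the exact value is ln 2.
def lyapunov(r, x0=0.2, n_transient=1000, n=100000):
    x = x0
    for _ in range(n_transient):    # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return acc / n

print(lyapunov(4.0), math.log(2))
```

The numerical average lands close to ln 2 ≈ 0.693; the same recipe at a non-chaotic parameter (say r = 3.2) would give a negative exponent, since nearby orbits there converge.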

A philosophical point brought up by Diacu and Holmes needs to be addressed; I noticed it while reading James Gleick's "Chaos" as well. They both suggest that chaos theory leads to a non-mathematical science, that mathematics is somehow the wrong approach. It certainly changes some of how scientists make graphs of data, and how they judge whether a theory is confirmed by those graphs. But my point is that chaos is about strange attractors, and strange attractors are abstractions just like the number two. Yes, we have a new abstraction; chaos is a new science in the same way that quantum mechanics and general relativity are new sciences. The goal becomes relating chaos theory to Newtonian mechanics, just as Newtonian mechanics is derived from general relativity and Kepler's laws are derived from Newton's inverse-square law. And the major problem of physics today is how quantum mechanics and general relativity are derived from one another: which derives the other?

On the technology side, chaos theory has led to technologies for steering between stable and chaotic states; a strange attractor is the whole of what could be many different stable patterns, and chaos theorists have been able to switch from one stable state to another within a given strange attractor, in any system, whether electrical, mechanical, or chemical! This technology is generally not talked about as much as nanotechnologies and quantum computers, but there is already a rather large literature and capability.
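The core trick in that literature (the OGY method of Ott, Grebogi, and Yorke) is to wait until the chaotic orbit wanders near the unstable state you want, then pin it there with tiny parameter nudges. This toy sketch (my own choice of map, window, and numbers) stabilizes the unstable fixed point of a chaotic logistic map:

```python
# OGY-flavored control sketch on the logistic map x -> r*x*(1-x) at r0 = 3.9
# (a chaotic parameter): wait until the orbit wanders near the unstable fixed
# point x* = 1 - 1/r0, then apply tiny corrections dr computed from the
# linearization  e_next ~ (2 - r0)*e + x*(1 - x*)*dr,  where e = x - x*.
r0 = 3.9
x_star = 1.0 - 1.0 / r0
slope = 2.0 - r0                  # d/dx of the map at the fixed point
gain = x_star * (1.0 - x_star)    # d/dr of the map at the fixed point

x = 0.3
for n in range(20000):
    e = x - x_star
    dr = 0.0
    if abs(e) < 0.01:             # only nudge inside a small window
        dr = -slope * e / gain    # choose dr so the linearized error vanishes
    r = r0 + dr
    x = r * x * (1.0 - x)
print(abs(x - x_star))            # the orbit ends up pinned at x*
```

Nothing about the map was changed except hair-thin, occasional tweaks to r, yet the orbit that would otherwise wander chaotically forever sits parked on a state that is unstable without control; that is the state-switching capability described above.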

And then of course, if we solve the Navier-Stokes equation, we open up a materials science as great as what nanotechnology can do for chemistry!

Monday, May 12, 2014

astro picture for the day / Boris Pasternak quote

ESA/NASA Hubble Space Telescope image

"What is laid down, ordered, factual, is never enough to embrace the whole truth: life always spills over the rim of every cup." - Boris Pasternak

- Science/Tech news extra,

Biologists have found a way of dealing with bacteria without drugs; bacteria have been evolving resistance to antibacterial drugs.

Chemists design molecules for controlling bacterial behavior

Sunday, May 11, 2014

astro picture for the day / Hermann Hankel quote

Image Credit: Optical: DSS; Infrared: NASA/JPL-Caltech;
X-ray: NASA/CXC/PSU/ K.Getman, E.Feigelson, M.Kuhn & the MYStIX team

"In most sciences one generation tears down what another has built, and what one has established another undoes. In mathematics alone each generation adds a new story to the old structure." - Hermann Hankel

Wednesday, May 7, 2014

astro picture for the day

Credit: X-ray: NASA/CXC/PSU/K.Getman, E.Feigelson, M.Kuhn & the MYStIX team; Infrared:NASA/JPL-Caltech

Quote for the day,

"philosophy is questions that may never be answered
religion is answers that may never be questioned" - wreck ship(well, I got this from a football messageboard!)

- Nanotechnology extras,

Synthetic biologists have made bacteria that can manufacture synthetic proteins

Here are three dna-nanotechnology videos, thirty minutes or more each. William Shih shows and mentions just about all the major dna-nanotechnology breakthroughs so far. The third lecture covers medical applications (about all that dna-nanotechnology can do right now).

Saturday, May 3, 2014

astro picture for the day / T.S. Eliot quote

Image Credit & Copyright: Bill Snyder (at Sierra Remote Observatories)

"We shall not cease from exploration, and the end of all our exploring will be to arrive where we started and know the place for the first time." - T.S. Eliot

- Science/tech news extra,

Neuro-chips are advancing well. These will be little more than faster computers for now. These neuro-chips could be manufactured and sold for four hundred dollars in the near future.