[[[Former Title: Mathematical Philosophy — Philosophy Done Mathematically]]]

Metaphysics, Metamathematics and Metabiology
A mathematical analysis of the scientific method, the axiomatic method, and Darwin's theory of evolution

G. J. Chaitin, IBM Research

In Memoriam Jacob T. "Jack" Schwartz (1930-2009)

[Outline of a course given August 2009 in the Programa de História das Ciências e das Técnicas e Epistemologia (HCTE) at the Universidade Federal do Rio de Janeiro (UFRJ), to be published as a book in Spanish by Midas (Valparaíso). This was a three-week course, with two two-hour lectures each week. There are links to handouts and also additional reading given for each topic.]

Sans les mathématiques on ne pénètre point au fond de la philosophie.
Sans la philosophie on ne pénètre point au fond des mathématiques.
Sans les deux on ne pénètre au fond de rien. — Leibniz

[Without mathematics we cannot penetrate deeply into philosophy.
Without philosophy we cannot penetrate deeply into mathematics.
Without both we cannot penetrate deeply into anything.]

First Week: Metaphysics

Handout #1, Handout #2, Handout #3, Handout #4

Lecture 1 — Epistemology: What is a Scientific Theory?

Lecture 2 — Ontology: The Labyrinth of the Continuum

Second Week: Metamathematics

Continue using Handout #1, Handout #2, Handout #3, Handout #4

Lecture 3 — Limitations of Formal Axiomatic Theories

Lecture 4 — The Halting Probability Ω

Third Week: Metabiology

Handout #5, Handout #6, Handout #7, Handout #8

Lecture 5 — Problems with Darwinism that may be solved by a Digital Software Organism Toy Model

Lecture 6 — Evolution of Randomly Mutating Software, Random Walks in Software Space

[Additional Things to Think About]

One can compare the evolvability of two programming languages A and B, with the same source of mutations, using the largest-positive-integer fitness function: language A has greater evolvability than language B if, with high probability, random mutations make the calculated positive integer grow at a faster rate when using language A than when using language B.
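This comparison can be sketched with a toy model (my assumption, not the mutation model of Handout #6): two hypothetical "languages" interpret the same bit string as a positive integer, one in binary and one in unary, and a hill-climbing random walk flips one bit at a time, keeping only mutations that increase the integer named.

```python
import random

def fitness_binary(bits):
    # "Language A": read the bit string as a binary numeral.
    return int("".join(map(str, bits)), 2)

def fitness_unary(bits):
    # "Language B": the integer named is just the count of 1 bits.
    return sum(bits)

def random_walk(fitness, steps, n=16, seed=0):
    # Hill climb in software space: flip one random bit and keep
    # the mutant only if the integer it names strictly increases.
    rng = random.Random(seed)
    organism = [0] * n
    best = fitness(organism)
    for _ in range(steps):
        mutant = organism[:]
        mutant[rng.randrange(n)] ^= 1
        f = fitness(mutant)
        if f > best:
            organism, best = mutant, f
    return best

a = random_walk(fitness_binary, 1000)  # climbs toward 2**16 - 1
b = random_walk(fitness_unary, 1000)   # climbs toward 16
```

Under the same stream of mutations, the binary interpretation reaches far larger integers, so in this toy sense language A is more evolvable than language B.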

A key point in evolutionary theory is the existence of paths leading from low-complexity to high-complexity organisms, passing through a series of small evolutionary steps each of which is slightly advantageous. In other words, there must be a chain of plausible intermediate forms.

Example of an interesting path in software space: organism ON consists of a prefix followed by the first N bits of Ω (N = 1, 2, 3, ...) [reversed]. The prefix computes how long it takes the computable lower bounds on Ω to get the first N bits right. This is the smallest K such that (the sum of 1/2 raised to the size in bits of each program ≤ K bits in size that halts in ≤ K seconds) reproduces the N bits of Ω that we are given. These organisms ON are a very small mutation distance from each other [md(ON, ON+1) is small], and the integer each one calculates grows faster than any computable function of N. [Here we are using the programming language and mutation model in Handout #6.]
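The step-bounded lower bounds on Ω can be written out directly. The sketch below is parameterized by a halting predicate; since no real universal machine is available here, the demonstration supplies a toy prefix-free machine (my assumption, not Chaitin's U) in which "0" halts at time 2 and "10" halts at time 5, so the K-th lower bound is exact and nondecreasing in K.

```python
from fractions import Fraction

def omega_lower_bound(K, halts_within):
    # Sum 2^-|p| over every program p of at most K bits that
    # halts in at most K steps, per the supplied predicate.
    total = Fraction(0)
    for length in range(1, K + 1):
        for code in range(2 ** length):
            p = format(code, "0{}b".format(length))
            if halts_within(p, K):
                total += Fraction(1, 2 ** length)
    return total

# Toy prefix-free machine: "0" halts at time 2, "10" at time 5,
# every other program runs forever.  (A stand-in, not the real U.)
HALT_TIME = {"0": 2, "10": 5}

def toy_halts_within(p, steps):
    return p in HALT_TIME and HALT_TIME[p] <= steps
```

For this toy machine Ω = 1/2 + 1/4 = 3/4 = 0.11... in binary; the lower bound is 0 at K = 1, 1/2 at K = 2, and first reproduces the two given bits of Ω at K = 5, which is the K that the prefix of ON would output.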

[[[Here is a better version of this. As before, organism ON consists of a prefix followed by the first N bits of Ω (N = 1, 2, 3, ...) [reversed]. This time the prefix π uses these bits of Ω in the standard way to determine each self-delimiting program p with |p| ≤ N that halts, and then the greatest positive integer K with program-size complexity H(K) ≤ N, which grows faster than any computable function of N. If we can protect this prefix π from mutations, there is a fixed-size mutation-distance neighborhood {X: md(X, ON) ≤ C} of each of these organisms in which the fittest organism will be another organism ON+C' in this sequence that calculates a bigger positive integer. Assuming we have an oracle for the halting problem, this gives us a deterministic evolutionary algorithm instead of a random-walk model as in Lecture 6 above. In effect, we are calculating Ω bit by bit.

This alternative model exhibits historicity, i.e., organisms contain their evolutionary history. But it is unsatisfactory from a biological point of view because there are no random mutations and because our software organisms do not exhibit the deeply-nested hierarchical structure that is such a conspicuous feature of biological organisms. Whether organisms can shield part of their DNA from mutations is also a relevant issue, although allowing this as an option is certainly possible when we are dealing with artificial software organisms. If DNA can be shielded when we have random mutations, then we cannot expect to prove that evolution occurs with probability one, only that it occurs with positive probability. [The random walk model in Lecture 6 evolves with probability one.]

Perhaps this example shows that evolution is too easy if we have an oracle for the halting problem? How about a version of my random-walk model for evolving programs which name large positive integers that works without an oracle? Maybe then we will get the desired deeply-nested hierarchical structure? Is there any connection with primitive recursive functions and subrecursive hierarchies? With work by Rózsa Péter? Must think about this.]]]
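The "standard way" the prefix π uses the given bits of Ω can be sketched as follows: dovetail through all computations, adding 2^-|p| as each program p halts, until the halting-probability lower bound reaches the truncation 0.b1...bN; at that point every program of at most N bits that has not yet halted never will. The toy prefix-free machine below (my assumption, not a real universal machine) has Ω = 3/4 = 0.11..., so two bits of Ω suffice to decide its entire ≤ 2-bit halting problem.

```python
from fractions import Fraction

# Toy prefix-free machine (an assumption, not a real universal U):
# "0" halts at time 2, "10" halts at time 5, nothing else halts.
HALT_TIME = {"0": 2, "10": 5}

def halters_from_omega_bits(bits):
    # bits = [b1, ..., bN], assumed to be a true prefix of this
    # machine's Omega.  Dovetail until the lower bound on the
    # halting probability reaches 0.b1...bN; any program of <= N
    # bits that has not halted by then never halts.
    target = sum(Fraction(b, 2 ** (i + 1)) for i, b in enumerate(bits))
    halted, bound, t = set(), Fraction(0), 0
    while bound < target:
        t += 1
        for p, time in HALT_TIME.items():
            if time == t:
                halted.add(p)
                bound += Fraction(1, 2 ** len(p))
    return {p for p in halted if len(p) <= len(bits)}
```

With one bit of Ω this decides halting for all 1-bit programs; with two bits, for all programs of up to two bits.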

Call an infinite sequence of software organisms OK (K = 1, 2, 3, ...) an evolutionary history if the fitness function (the positive integer that is output) grows and the mutation distance between successive organisms, md(OK, OK+1), is bounded. Note that the optimal universal Turing machines used in AIT have the property that evolutionary histories are preserved when a fixed prefix πC is added at the beginning of each organism OK in the history, because the outputs are the same and the mutation distances are still bounded. More precisely, a universal Turing machine U has the most evolutionary histories, because an evolutionary history for any Turing machine C is also an evolutionary history for U. And any two universal Turing machines U and U' have exactly the same set of evolutionary histories. [Again, here we are using the programming language and mutation model in Handout #6.]
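The definition can be checked mechanically on a finite initial segment. The sketch below uses a toy mutation distance (a crude stand-in for Handout #6's definition, which is not reproduced here): the number of differing positions plus the difference in length. A sequence of unary organisms then forms an evolutionary history with bound 1.

```python
def is_evolutionary_history(orgs, fitness, md, bound):
    # Evolutionary history: fitness strictly grows, and the
    # mutation distance between successive organisms is bounded.
    fits = [fitness(o) for o in orgs]
    growing = all(x < y for x, y in zip(fits, fits[1:]))
    close = all(md(a, b) <= bound for a, b in zip(orgs, orgs[1:]))
    return growing and close

def toy_md(a, b):
    # Hypothetical stand-in for Handout #6's mutation distance:
    # differing positions plus the difference in length.
    return sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))

# Each organism names, in unary, the integer that is its fitness.
orgs = ["1", "11", "111", "1111"]
ok = is_evolutionary_history(orgs, len, toy_md, bound=1)
```

Prepending a fixed prefix to every organism leaves both conditions intact in this toy distance as well, since the shared prefix contributes nothing to any successive distance.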

[20 August 2009]