## A Simple Refutation of Gödel's Theorem

## August 2001

This paper is a simple refutation of Gödel's Theorem, using as its basis the Web article *A Simple Exposition of Gödel's Theorem* by J.R. Lucas. In order to be clear as to which words are whose, I have written hereunder Mr. Lucas's own words in black Times font, and indented, while my words are in blue Arial font, and not indented.

Mr. Lucas writes (and I have deleted some of the initial and somewhat irrelevant parts of his article):

Gödel's argument is self-referential. It is a development of the Epimenides paradox, known to St Paul. Epimenides was a Cretan and said that all Cretans were liars. A modern version would be a blackboard on which was written:

    This statement is false

In this crude form we should not take it very seriously, any more than the ancients did Epimenides. One obvious flaw, pointed out by Gilbert Ryle, is that the `this' fails to refer properly. `This statement': which statement? It refers to a statement that is still in course of being made. We don't know what the statement is that is being referred to until the statement has been completed, but it cannot be completed until the reference is made out properly. If we try to make sense of it, we get into a loop: `This statement, namely `This statement, namely `This statement, namely `This statement, namely . . . .''''''

Gödel circumvented Ryle's objection. He found a way of referring to well-formed formulae of formal logic which was independent of token-reflexive (or indexical) terms, such as `this'. Formal logic has relatively few symbols, and to each of these Gödel assigned a number. He then could code a string of symbols by taking the odd prime numbers in order, and raising each to the power of the corresponding symbol. Thus if we want to refer to

    ( p V p ) --- p

and the numbers assigned are

    4 7 6 7 5 8 7

the number for the string is

    3^4 . 5^7 . 7^6 . 11^7 . 13^5 . 17^8 . 19^7

This is an enormously big number, but it is just a number, and in principle we could refer to a well-formed formula by a single number. Instead of working out 3^4 . 5^7 . 7^6 . 11^7 . 13^5 . 17^8 . 19^7, let me pretend that it comes to 1729. Then it might be possible to write down, in a Ryle-proof manner

    Well-formed formula no. 1729 is false

or, equivalently

    Well-formed formula no. 1729 is not true
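Mr. Lucas's prime-power coding is mechanical enough to reproduce directly. Here is a minimal Python sketch (the function names are my own; it assumes his example numbering, in which the seven symbols of `( p V p ) --- p` carry the numbers 4 7 6 7 5 8 7):

```python
# Sketch of the prime-power coding described above: take the odd primes
# in order (3, 5, 7, 11, ...) and raise each to the power of the number
# assigned to the corresponding symbol.

def odd_primes(n):
    """Return the first n odd primes: 3, 5, 7, 11, ..."""
    primes, candidate = [], 3
    while len(primes) < n:
        if all(candidate % p for p in primes):
            primes.append(candidate)
        candidate += 2
    return primes

def godel_number(symbol_numbers):
    """Code a string of symbol-numbers as 3^s1 * 5^s2 * 7^s3 * ..."""
    code = 1
    for prime, s in zip(odd_primes(len(symbol_numbers)), symbol_numbers):
        code *= prime ** s
    return code

# The string ( p V p ) --- p, with symbol numbers 4 7 6 7 5 8 7:
n = godel_number([4, 7, 6, 7, 5, 8, 7])
```

The result is exactly the product 3^4 . 5^7 . 7^6 . 11^7 . 13^5 . 17^8 . 19^7 -- an enormously big number, but, as Lucas says, just a number.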

I am sorry,
Mr. Lucas, but this too will not suffice for self-reference.

First of
all, and as you rightly say, formal logic has relatively few
symbols, and to each of these Gödel has assigned a number.
One of these, as you probably know already, is the symbol
"successor of", often written as **ƒ.** The basic signs,
or symbols, of formal logic do not include the digits from 1
to 9 (inclusive); to write any number down, formal logic
uses just the symbol **0** (pronounced "nought")
preceded by a number of **ƒ** symbols. Thus for example,
the number 3 is represented in formal logic by the string of
symbols **ƒƒƒ0** -- meaning "The successor of the
successor of the successor of nought".

Thus to write down any number *x*, formal logic uses a number of symbols equal to *x* + 1: namely, *x* symbols **ƒ** followed by one symbol **0**.

Now consider: In order to achieve self-reference *within mathematics itself,* well-formed formula no. 1729 must *contain* the number 1729, written in the symbols of the formal language! This can only be done as **ƒƒƒ...ƒ0**, where there are 1729 **ƒ**'s and one **0** -- a total of 1730 symbols of the formal language.
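The successor notation is mechanical enough to check directly. A minimal sketch (the function name is mine, and I write the successor sign as a plain `f`):

```python
# A numeral in the formal language: x copies of the successor sign "f"
# followed by the sign "0", so writing the number x takes x + 1 symbols.

def numeral(x):
    """Write the number x in successor notation, e.g. 3 -> 'fff0'."""
    return "f" * x + "0"

assert numeral(3) == "fff0"          # "successor of the successor of the successor of nought"
assert len(numeral(1729)) == 1730    # 1729 f's plus one 0
```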

And consider how Gödel codes the string of symbols which says, in effect, "Well-formed formula no. 1729 is not true": to each of the **ƒ**'s, as well as to the final **0**, he has assigned a number equal to or greater than 1; he then raises different prime numbers to the powers represented by these numbers, and multiplies together those prime powers. Any number assigned by this method to that string of symbols *must necessarily be greater* than 1729 ... !

Why, the
string of symbols representing the number 1729 *itself*
must be assigned a number greater than 1729. (And this is
not even taking into consideration the rest of the symbols
comprising well-formed formula no. 1729.)

The number assigned by Gödel to the string of symbols representing the number 1729 must in fact be *much, much* greater than 1729: for if, as Gödel actually explains in his 1931 paper, the symbol **ƒ** is assigned the natural number 3, and the symbol **0** is assigned the natural number 1, then the string of symbols representing the number 1729 must be assigned the number

    3^3 . 5^3 . 7^3 . 11^3 . 13^3 . 17^3 . 19^3 . . .

all the way up to the 1729-th prime number (if we regard 3 as the first prime number), each raised to the power of 3 -- and this product then finally multiplied by the 1730-th prime number raised to the power of 1 (the number assigned to **0**).

In fact, the product of the first two factors *alone* of this huge product of prime powers -- namely 3^3 . 5^3 -- is equal to 3375, which is *itself* much greater than 1729. Consider then how gargantuan the number assigned to the string of symbols representing the number 1729 *alone* must be.

Thus the
string of symbols which says, in effect, "Well-formed
formula no. 1729 is not true" *cannot possibly*
be assigned the number 1729.

And it
doesn't matter *which* number is assigned to a
formula: *no* number assigned, using Gödel's method,
to a formula of *any* formal system can possibly be
contained *in* that very formula, when that number is
expressed in the symbols of the formal language itself.

**Indeed, every number assigned to
a formula according to Gödel's method must be greater
than any number contained in that
formula itself.**
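The claim in bold is easy to spot-check numerically. A sketch, assuming (as in the discussion of Gödel's 1931 paper above) that **ƒ** is assigned the number 3, **0** the number 1, and that the coding uses the odd primes in order:

```python
# Spot-check: the code of the numeral for x always dwarfs x itself.

def odd_primes(n):
    """Return the first n odd primes: 3, 5, 7, 11, ..."""
    primes, candidate = [], 3
    while len(primes) < n:
        if all(candidate % p for p in primes):
            primes.append(candidate)
        candidate += 2
    return primes

def code_of_numeral(x):
    """Gödel number of the numeral ff...f0 for x: x exponents of 3, then one exponent of 1."""
    exponents = [3] * x + [1]   # x signs 'f' (assigned 3), one sign '0' (assigned 1)
    code = 1
    for prime, e in zip(odd_primes(len(exponents)), exponents):
        code *= prime ** e
    return code

# The first two factors alone for x >= 2 give 3^3 * 5^3 = 3375 > 1729:
assert 3**3 * 5**3 == 3375

# The code of a numeral exceeds the number it denotes, for every x tried:
for x in range(50):
    assert code_of_numeral(x) > x
```

This is only a finite spot-check of small cases, of course, not a proof; but it shows concretely how fast the assigned numbers outrun the numbers they encode.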

So even Gödel's method of assigning numbers to symbols, to strings of symbols, and to strings of strings of symbols, of a formal language, does not overcome the fundamental logical objection to the Liar Paradox: which is, essentially, that self-reference is logically quite impossible.

And if self-reference is impossible, then Gödel's so-called "undecidable formula", which must also be self-referential, cannot possibly exist in mathematics.

This takes care of the main thrust of my argument. The rest of my argument now becomes essentially superfluous; but just for the sake of being thorough, I shall complete my critique of your Web article.

You go on to
say in it:

These two manoeuvres enable Gödel to refer to well-formed formulae by numbers, and to represent provability as a property of numbers definable in terms of the simple arithmetical operations of addition and multiplication, though the definitions are themselves very complicated. It remains to achieve *self*-reference, to find some Gödel number, 1729 as we have supposed for the sake of brevity, where wff no. 1729 turns out to *be* the wff

    --(Ex)Pr(x,1729)

How can Gödel achieve this? He does it by means of a ``diagonalization operation'', like those used by Cantor to prove the non-denumerability of the continuum.

Cantor argues by *Reductio ad Absurdum*. Suppose we could arrange all the real numbers between 0 and 1 in a denumerable list. The list would then look like

    0.a_{11}a_{12}a_{13}a_{14}a_{15} ...
    0.a_{21}a_{22}a_{23}a_{24}a_{25} ...
    0.a_{31}a_{32}a_{33}a_{34}a_{35} ...
    0.a_{41}a_{42}a_{43}a_{44}a_{45} ...
    0.a_{51}a_{52}a_{53}a_{54}a_{55} ...
    . . .

Cantor then shows that there is a real number between 0 and 1 that has been left out, contrary to hypothesis.

Let b_{nn} differ from a_{nn}. Consider the number

    0.b_{11}b_{22}b_{33}b_{44}b_{55} ...

This number differs from the n-th number in the list at the n-th decimal place, and so cannot occur anywhere in the list.
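Lucas's diagonal construction can be illustrated on a finite fragment of such a list. A sketch (the particular digit-flipping rule is my own illustrative choice; any rule that changes each diagonal digit would do):

```python
# Diagonal construction on a finite list of decimal-digit rows:
# build a sequence that differs from row n at position n.

rows = [
    [1, 4, 1, 5, 9],
    [7, 1, 8, 2, 8],
    [1, 7, 3, 2, 0],
    [6, 1, 8, 0, 3],
    [3, 0, 2, 0, 2],
]

# b_nn = 5 unless the diagonal digit is already 5, in which case 6:
diagonal = [5 if rows[n][n] != 5 else 6 for n in range(len(rows))]

# The constructed sequence disagrees with every row at its diagonal place:
for n, row in enumerate(rows):
    assert diagonal[n] != row[n]
```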

My *dear* Mr Lucas. Cantor cannot *possibly* show that there is any real number which has been left out of the list, because the list, being infinitely long, is not complete, and can *never* be so! Thus there can always be one more row added to the list, in which we can place the newly-constructed real number 0.b_{11}b_{22}b_{33}b_{44}b_{55}b_{66}b_{77}b_{88} ... .

(For a more
complete argument in this regard, see my *Simple Refutation of
Cantor's Diagonal Procedure*).

But this is
almost a separate issue: for as we showed above, it is *impossible*
to use Gödel's method of assigning numbers to symbols, to
strings of symbols, and to strings of strings of symbols, of
a formal language, in order to write a formula of that
formal language which can achieve self-reference.

Then you go
on to add:

Let me wave my hand over the enormous amount of careful working needed to achieve this in a water-tight fashion, and claim that we have achieved self-reference.

As we have seen above with respect to the string of symbols which in effect says "Well-formed formula no. 1729 is not true", this other string of symbols, which says, in effect, "wff no. 1729 is unprovable in the system", *also* cannot possibly be assigned the number 1729, because the number 1729 is *contained within that string,* written as **ƒƒƒ...ƒ0**, where there are 1729 **ƒ**'s and one **0** -- a total of 1730 symbols, each of the **ƒ**'s being assigned by Gödel the number 3, and the **0** being assigned by him the number 1.

Thus the string of symbols which says, in effect, "wff no. 1729 is unprovable in the system" can only be assigned a number **greater than** 1729.

So you cannot possibly conclude as you do, *viz:*

... for there can be no such self-referential well-formed formula in the system -- *i.e.,* in mathematics itself.

*(By the way:* It may not be argued that the so-called "undecidable formula" need not *itself* contain its own Gödel-number, but may refer to itself *indirectly,* by saying, in effect, "That formula which is obtained by substituting the free variable in formula number so-and-so by its own Gödel-number is not provable". As Gödel himself writes -- in footnote No. 20 and Definition No. 31 of his 1931 paper entitled "On Formally Undecidable Propositions of Principia Mathematica and Related Systems", wherein he tries to prove his celebrated Theorem -- such a formula, containing as it must the sign "Subst" or "Sb" (standing for the operation of substituting a variable in a formula with a number), belongs to *meta*mathematics, not to *mathematics.* Thus if Gödel *can* in fact prove that such an undecidable formula exists, he can only prove thereby that mathematics by itself cannot decide a *meta*mathematical formula ... to which we should retort, as any school-boy justifiably might: *"Big deal!")*

As a consequence, Gödel's Theorem stands thoroughly disproved.

There is a much fuller, but still comprehensible, account in my book *Critique of Gödel's 1931 Paper Entitled "On Formally Undecidable Propositions of Principia Mathematica and Related Systems"*, which I wrote a few years ago in collaboration with Ferdinand Romero, and which is available for download in *.pdf* format from my Home Page.

And there is
a short and sweet account of the ideas contained above on my
Home Page too, entitled -- of course -- *A Short and
Sweet Refutation of Gödel's Theorem.*

If anyone
reading this article has any comments, contact me!

*PS:* Of course Gödel-numbers *themselves* belong to metamathematics, and not to mathematics, and may not validly be used *in* any mathematical formula. Any formula in which a Gödel-number appears must belong to metamathematics, not to mathematics; and thus if Gödel *can* actually prove that there *is* such a formula and that it *is* indeed undecidable, all he can possibly prove thereby is that it is his *meta*mathematics that is incomplete ... leaving mathematics itself as complete as ever it was!

*PPS:*
One can hardly argue that mathematics and metamathematics
are essentially the same thing: for if they were, it should
be possible to derive *all* of metamathematics from
the axioms of mathematics *alone* (such as the Peano
axioms, or the axioms of Zermelo and Fraenkel, later
extended by John von Neumann). But no one, not even Gödel,
has ever been able to lay claim to having performed such a
superhuman feat.

*PPPS:* Just explain one thing to me: how
did you geniuses at Oxford, Cambridge, Harvard, Princeton
and Stanford -- not to mention the Sorbonne, Heidelberg
and To-dai -- miss these simple points, especially the
ones in the *PS* and *PPS* above ... for *seventy
years?*