Distortion Wizard

Dumbass Things I Learned About Numbers and Machines

Do you know what diagonalization is? Let me teach it to you all simple-like – one sweet, well-meaning dumbass to another.

So I was contemplating and reflecting on the P =? NP problem, like you do, if you're a disturbed person like me. I discovered these cool other problem classes called DLIN and NLIN, for which DLIN != NLIN is proven using diagonalization. There are also notions like algebrization and natural proofs, barriers which show that whole families of proof techniques sadly can't settle P =? NP, but I won't be talking about any of those in this post.

Diagonalization is a contradiction technique where you list all the stuff that's out there, and then show that the diagonal isn't on the list. You give that diagonal thingy a weird property and then show a contradiction. This results in minds blown. Or it should.

Let's list all the statements in some formal system: S1, S2, S3, ..., and also build a giant table that tells you which statements are provable by that system, whether via some math formula or via another statement (think the C programming language, think pointers):

Statement    Provable
S1           yes
S2           yes
S3           no
S4           yes
...          ...
Presume it's a consistent system. Everything is dandy and internally consistent. Now, let's introduce a new statement, G, that states "I'm not provable by this system (the one whose exhaustive application produces all the statements S)". So okay, let's see: if the system proves G, then G is false, because G says precisely that it isn't provable, and so the system proves a falsehood and is no longer internally consistent. A logic error. On the other hand, if the system doesn't prove G, then G is a true statement that's missing from the table of provable things, and the list is incomplete.

And so, any internally consistent system (strong enough to do basic arithmetic) is always incomplete. The length of the list doesn't matter.
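
If code helps, here's a minimal sketch of that case analysis, in Rust since we're already thinking in programs. Everything in it is hypothetical: provable is a toy stand-in for the system's proof-searching machinery, and G's ID is pulled out of thin air.

    // Hypothetical, made-up ID for the statement "provable(G) is false".
    const G: usize = 42;

    // Toy stand-in for the provability oracle: a consistent system
    // never proves G, so this demo just answers "false" for G's ID.
    fn provable(s: usize) -> bool {
        s != G
    }

    fn main() {
        if provable(G) {
            // The system proves G, but G says it's unprovable: inconsistent.
            println!("logic error: inconsistent");
        } else {
            // G really is unprovable, so G is true; the system is incomplete.
            println!("true but unprovable: incomplete");
        }
    }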

That's cool. But consider the following:

Let's list all the real numbers. But let's list them all weird-like. Let's write them all down in binary representation, repr, just like programming!

ni       repr
n1       [0.000...0000001]
n2       [0.000...0000011]
...      ...
n100     [0.000...0100011]
...      ...
n1000    [0.100...0000000]
...      ...
nk       [11111...101.000]
where each repr has some length k and all the numbers ni are different (presume a unique binary encoding). So, let's do something funky now. Let's construct a new number D by taking, for each ni, bit i of its repr, flipping it, and concatenating the results. That is, we take !repr[1] from n1, !repr[2] from n2, and so on, all the way up to and including the last number, from which we take !repr[k]. In other words, we flip the diagonal, we diagonalize.

And look what happens: the new number, D, cannot be on the list, because it differs from every number on the list in at least one bit! And so, you can't count the reals. The length of the list, k, doesn't matter: it could be infinite.
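
Here's that construction as actual, runnable code, under the simplifying assumption that the list is finite and every repr is a fixed-length bit string:

    // Flip the diagonal: bit i of the result is the negation of
    // bit i of the i-th number's representation.
    fn diagonalize(list: &[Vec<bool>]) -> Vec<bool> {
        list.iter()
            .enumerate()
            .map(|(i, repr)| !repr[i])
            .collect()
    }

    fn main() {
        let list = vec![
            vec![false, false, true],
            vec![false, true, true],
            vec![true, false, false],
        ];
        let d = diagonalize(&list);
        // D differs from list[i] at bit i, so it can't be on the list.
        assert!(list.iter().all(|n| *n != d));
        println!("{:?}", d); // [true, false, true]
    }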

Right. Let's continue our completely unhinged, nutty delirium up to its crazed conclusions.

Let's go ahead and list all the computer programs that ever existed: P1, P2, P3, ...

Pi    Halts
P1    yes
P2    no
P3    yes
...   ...
where Halts means the program will finish. Some of those programs take other programs as input, some don't. Remember, it's all the programs, none are left out. Again, funky time. Let's create a new program D that takes any program Pi on the list (including itself), runs Pi on itself, and does the opposite: D runs forever if Pi(Pi) halts, and halts if Pi(Pi) runs forever. We know the result of those Halts because either we have nerves of steel or we have precomputed every answer ;) For the sake of argument, you see. Now feed D its own description: D(D) halts exactly when D(D) doesn't halt. Whoops.

Yeah okay, that one's a dumb one. But anyway: there's always some program for which the other, pre-existing programs can't decide whether it halts or not. The length of the list doesn't matter.
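
As a sketch in code, assuming a magical halts oracle, which is, of course, exactly the thing that cannot exist:

    // Hypothetical oracle: true iff program p halts when run on `input`.
    // This is the precomputed table from above; no real implementation
    // is possible, which is the whole point.
    fn halts(_p: usize, _input: usize) -> bool {
        unimplemented!("nerves of steel required")
    }

    // D, by index: run P_i on itself and do the opposite.
    fn d(i: usize) {
        if halts(i, i) {
            loop {} // P_i(i) halts, so we spin forever
        }
        // P_i(i) runs forever, so we just return (halt)
    }

    fn main() {
        // Suppose D itself appears on the list at some (hypothetical) index.
        // Then d(D_INDEX) halts iff halts(D_INDEX, D_INDEX) is false,
        // i.e. iff d(D_INDEX) does not halt. Contradiction.
        const D_INDEX: usize = 7;
        d(D_INDEX);
    }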

Finally, the time hierarchy. We'll list all (deterministic) Turing machines that halt in O(t(n)) time: M1, M2, ...

Machine    Input
M1         0010111001...
M2         1100101010...
M3         some bits
...        ...
where the inputs are all length n and there are n machines too (because that's the worst case); t is some time-constructible function, like n. So, create a new machine D that's slow and runs in O(t(n)^2) time instead. Define D so that, on input i, it simulates machine Mi on input i for up to t(n) steps, and flips the diagonal. So: if Mi(i) accepts within that budget, then reject. If Mi(i) rejects, or is still running when the budget runs out, then accept.

Okay. Now it gets funky: we know for sure our new machine D runs in O(t(n)^2) time, because we only ever simulate each machine Mi for up to t(n) steps, and the simulation overhead fits comfortably inside the square. But let's presume D is in the O(t(n)) class. We also know we already listed all the machines that run in O(t(n)) time, so D must be some Mi on the list. Yet D disagrees with every Mi on at least one input, namely input i! That's a contradiction. And so, D is in O(t(n)^2) but not in O(t(n)). Ergo, with more time you only get more problems.
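
Sketched as code, with a hypothetical step-bounded simulate standing in for a universal Turing machine with a step counter:

    // What a bounded simulation can report.
    enum Outcome { Accept, Reject, OutOfTime }

    // Hypothetical: run machine m on `input` for at most `budget` steps.
    fn simulate(_m: usize, _input: usize, _budget: usize) -> Outcome {
        unimplemented!("universal machine with a step counter goes here")
    }

    // Some time-constructible bound, e.g. t(n) = n.
    fn t(n: usize) -> usize { n }

    // D flips the diagonal: accept exactly when M_i fails to accept
    // input i within the time budget.
    fn d(i: usize, n: usize) -> bool {
        match simulate(i, i, t(n)) {
            Outcome::Accept => false,
            Outcome::Reject | Outcome::OutOfTime => true,
        }
    }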

So, wait a minute. There's a pattern emerging, isn't there? Notice that the very numbers themselves are just kinda like the results of some number-generating program. For the integers you just go +1, but for the reals you need a more complex, more arbitrary program. So it isn't that the numbers are, like, the numbers, woo-woo; they're just results that arise from computation, from discernment between an input and an output, which is what a program does. The class of programs that halt and output something meaningful is relatively very small.
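
For the integers, that generating program really is just the +1 loop. A toy version (hedge: it uses u64, so it stops at overflow instead of running forever):

    // The integers as outputs of the dumbest number-generating program:
    // start at 0 and just go +1.
    fn naturals() -> impl Iterator<Item = u64> {
        std::iter::successors(Some(0u64), |n| n.checked_add(1))
    }

    fn main() {
        for n in naturals().take(5) {
            println!("{}", n); // 0 1 2 3 4
        }
    }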

Or, if you don't like to think of the numbers as outputs, you could think of it as giving every program a unique ID.
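
A poor man's version of that ID assignment, assuming a program is just its source text: hex-encode the bytes. The encoding is injective, so distinct programs get distinct IDs.

    // Give a program a unique ID by reading its source bytes as one
    // big base-16 number. Different sources -> different IDs.
    fn program_id(source: &str) -> String {
        source.bytes().map(|b| format!("{:02x}", b)).collect()
    }

    fn main() {
        println!("{}", program_id("fn main() {}"));
        // prints "666e206d61696e2829207b7d"
    }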

At this point, it so doesn't matter to me whether we're speaking of mathematical functions or relations. Or whether there's any sense in side-effects. Because it's all a matter of adjusting some program. The world is so relaxed, you see. So free.

So then you of course ask: does every program also have a unique time signature? I don't mean the time complexity class. I mean the exact number of steps it takes to do each thing, such that it counts as a unique program.

Are we all speaking the same language?

What's a mapping, fool?

let mindblow: Vec<_> = the_world
        .into_iter()
        .map(|human| numbah(human))
        .collect();