Distortion Wizard

Ambition

I recently watched an hour-long interview with the Godfather, Geoffrey Hinton. I was interested, mostly because he's a Nobelist. In the interview he discusses AI, obviously, and how things might go wrong. According to him, there are several ways that could happen. And I suppose my weakness is precisely that particular sort of presumed honesty – but then, it could be that he's merely doing favors for podcasters. I honestly can't tell. Of course, he's a bit older than some of the contemporary CEOs of big American companies, whatever that means.

But Hinton's warnings keep reminding me about something broader than AI that's intimately connected to it. Thing is, I know people hijack philosophies, religions, technology, governments and ideas for their selfish interests. You don't have to be an expert in history to realize that. It's all about cognitive biases and avoiding public backlash. The idea being that you wangle and position yourself so that everything flows through you or your clique, requiring your presence as part of the equation. That's the principle behind private ownership, the fuel for leaving behind any kind of legacy. To a large extent this activity is what most people seem to be doing. Or, sometimes, things are made to seem as though people were failing to do precisely that, with a subtle imperative twist.

And I understand this. I understand it in the sense that this is exactly what you'd expect, and therefore the way you create a successful society is to make it so it doesn't implode even when people act on their impulses, because they just don't know any better anyway. In fact, you strictly want people to have that motivation to strive, and so you groom it. A lot of institutions are built that way.

Still, it's all sort of a matter of principle, because it's common knowledge that it doesn't make much difference who's at the helm. The benefits only come with large numbers: small margins could mean millions, but even then it's hard to prove it was because of that one individual, with all those moving parts.

So where does that leave us? I've held the notion for a relatively long time that one of the important things that make a society work at all, and not fall to terrible violence and rape, is ambition. That's why the universities can't be replaced completely with internet-based education and that's why crime is patterned. That's why people need the military or some form of institutionalized violence and that's why consumerism is so prevalent. That's why social media is mostly advertisements. These are all tools for gainful employment.

However, if I'm right in this simple assessment, then you can't optimize (in the general sense). There is no such thing. At best, there is only diversion so as to cause less trouble.

Coming back to Hinton, the main confounding factor there is of course that AI is an umbrella term for disparate things. I've heard the term was originally coined just to promote someone's own research at the time, ironically enough. What these people are really talking about is computers – computers are now so advanced that some people are starting to get nervous. Calling it AI instead of, say, software just puts up barriers and makes for an academic power grab.

Anyway, I don't know much about how AGI or super-this-or-that could decide to get rid of humans altogether, like Hinton is suggesting. To me, the only reason that could happen is that the AI is optimizing for some particular goal. I don't believe there are any unforeseen reasons there, because I believe in a universal reason that exists at all levels of abstraction, with higher levels built upon the lower ones. This is also why I find the pursuit of AGI mostly useless. In other words: can you see how goal-oriented behavior is all that counts?

No, I've simply been content thinking that making others redundant on the job market just means those people won't be making money, so they can't buy what you're selling, and you'll be making less money as a result too. That means immediate job displacement: the quaternary sector is going to expand, allowing more anthropologists than ever to earn their bread and butter by theorizing about bullshit jobs. Everything is going to mean less and be worth less, which doesn't say much about whether people will actually get what they want, one way or the other. Not in the long term, anyway. But I suppose the carrot is that in the short term there are opportunities for a cash grab.

Moreover, in my opinion, all of this just means the only thing that gives countries a competitive advantage is their geographical location (borders), which means their physical resources, which means people have more incentive to go to war to acquire said resources. And they'd better do it soon, because of first-mover advantage. That is, unless the world leaders are pretty much all in bed together, which I suppose to some degree they already must be. This has its implications for hybrid war and things like that. It has direct consequences for who is going to get killed and why. Meaning, someone somewhere has a short-term goal.

And I suppose world domination as a pursuit really has an advantage of its own, too. Because in this world there are already people whose affluence never really depended on capability in the first place. And so, as the meaning of things like intellectual ability, wisdom and knowledge lessens, these monopoly positions become more entrenched.

Geoffrey says at the end of the interview: if my kids didn't have money, I'd tell them to train to become plumbers. That's somewhat striking to me, because that's exactly what I thought to myself way back in 2013 or so. And while I'm glad the world hasn't exactly been destroyed yet (I'm writing this in '25), many more people are now fully aware of how it could be.

But this whole line of thought reminds me of something else: how my view on religion has shifted. Incidentally, I've become sort of semi-religious in recent years. I say semi-religious, because the real reason for my growing interest in world religions is that I appreciate the ideas and I enjoy the similarities across cultures and different times. You see, I feel like I can finally appreciate the fact that humans haven't changed in thousands of years (or as long as the recorded, known literature stretches back, for sure). I think that's where I am in my own life right now, so to speak. Just dwelling on the connections, and observing what they really mean, as a way of grounding myself.

So when I hear about these ideas the tech leaders have about creating religions, I know they're ignorant. You don't simply create a religion for anyone except children. You have to actually be profound to seriously succeed at it. And I'm willing to bet you'd arrive at something very similar to what's already out there, too. It's exactly the same as when they used to fuss about innovation some 15-20 years back. The end result was that people really wanted someone to innovate, but no one really did. I suppose that's because people actually want predictability more than revolution, but some thought that revolution would lead to a specific kind of predictability in the long run, and so they pushed hard for someone else to do it and preferably take the fall for it.

So what I'm saying, by extension, is that cults aren't necessarily religious. Cults exist because people are either ignorant or because they want to play for their own amusement, maybe as a way to relax. Either you're fully aware while participating, or you're not.

But the thing is, when I wrote my little short story about bombs and stuff some time ago, I was thinking about and alluding to the fact that some people look at AI and think it's leverage. I think that's short-sighted, for sure. It's only leverage in the very immediate future. If Hinton and friends are right, having any kind of relatively developed and broad AI system means no one will be in charge, ever. And I'd still agree with that notion even if super-duper AI never happens. Democracy has already won; you only need to grow a spine.

And I agree with that Hinton fellow on something else, too. You see, people need to pretend they're doing something worthwhile, they need to pretend they're important, and they need to feel effective and powerful. So the question becomes: is AI going to help with that or not?

Well, the naïve person asks: is there a way to murder your opponent before his position becomes unassailable? But when you think just a little deeper, you should realize that everyone can do what you can do. Seriously, you'll be doing yourself a favor if you stew over that one for a good while.

Stay hungry, stay foolish.

-- Steve Jobs

If there's one thing the Americans have taught me, especially in my adulthood, it's that money is just a nonsense way to get what you want. It's not really a measure of value at all; that's the lie. Because if it were, then people would understand negative value. They'd understand that one person earning a billion dollars or euros or whatever is all pretty good, but if that happens at the expense of a million people each losing well over ten thousand in value, then it's not necessarily so good: a million times ten thousand is ten billion lost against one billion gained. That is, the net sum with respect to proper value could be negative, but no one really cares.
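
That arithmetic as a minimal sketch in Python; the numbers are just the illustrative ones from the paragraph above, nothing more.

    # Net "proper value" under the illustrative numbers above.
    gain = 1_000_000_000    # one person earns a billion
    losers = 1_000_000      # a million people...
    loss_each = 10_000      # ...each lose ten thousand in value
    net = gain - losers * loss_each
    print(net)              # -9000000000: the net sum is negative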

That's possible because some things are always worth something, while others become cheaper to produce over time. In other words, some things have a minimum value and other things don't. The way people generally seem to behave is that if you make it super easy and cheap for others to produce something, you in fact completely destroy their capability to compete with you. It's because your established brand has a minimum worth borne of the purest of magicks, while the things that lead up to having such an established presence are eventually going to be worth less: the market has a biased memory. It's a perfect one-two knockout, because you get to claim some posturing points for "giving back to the community" as well.
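
Here's that claim as a toy sketch, with numbers I made up purely for illustration: both worths decay as production gets cheaper, but only the brand has a floor.

    # Toy model of "minimum value": commodity worth decays toward zero,
    # while brand worth never drops below its floor.
    def commodity_worth(year, start=100.0, decay=0.7):
        return start * decay ** year

    def brand_worth(year, start=100.0, decay=0.7, floor=60.0):
        return max(floor, start * decay ** year)

    for year in range(6):
        print(year, round(commodity_worth(year), 1), round(brand_worth(year), 1))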

And the Americans are surprisingly good at working together in order to keep their dominant position. The only exception seems to be China. But that's funny too: everyone gets liberty and freedom except TikTok gets banned. Not that I particularly care, actually, because I'm European and I don't even use TikTok to begin with.

But I suppose I'd disagree with these notions, actually. I think brand names are not valuable either, not unless you keep reinvigorating them. Brand names are good for about five years, maybe ten at the most. Keep changing that logo, keep asking young people what they want. Get surprised and feel old, even if you're not.

You know, before the internet took off, you preferred American culture because it was genuinely quite good compared to anal, old-fashioned European stiffness. Now everyone is fighting over who gets to exploit undeveloped teenage brains, because everybody knows the recipe.

But soon everybody will know every recipe in every single book. Generally speaking, of course.

So for me, life seems to be all about jumping from one existential crisis to the next. But at least I'm free to say what I think.

Indeed, I actually have a completely unpopular, weird opinion on tech in general, too, especially for a CS graduate. I think internet speeds should be globally capped at something like 256 kB/s or less. I think websites should focus on text instead of pictures or video. I think the consumer GPU doesn't need to be more powerful; in fact, it could be less powerful. And I think it should be made dead simple to discard old computer hardware ethically, so that even a monkey could do it; it's not nearly at that level yet. I don't understand why a sane person would object to these ideas. If you feel disturbed by how NVIDIA monopolizes the market, then build your own product that uses less power while doing less, because there's a limit to what people really need anyway. When the market saturates, its inner workings boil down to feelings, not reason.

Yes, I'm pretty sure markets generally work in terms of impact, not value, where impact is defined as absolute value (change any minus sign into a plus). The reason is that anything that destroys value is going to, by its very existence, provide the means for others to compensate that loss with something of positive value. I'm thinking it has to be that way, because otherwise we'd all be dead.
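
Spelled out in code, that definition is just this trivial sketch:

    # Impact as defined above: the magnitude of a change in value,
    # with the sign discarded.
    def impact(value_change):
        return abs(value_change)

    # Destroying five units of value registers the same as creating five:
    assert impact(-5.0) == impact(5.0) == 5.0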

TL;DR: I'm no economist and everyone is insane.

But this is all related to this other thing I wanted to talk about just a bit. And it's that I'm a person who wants to choose. I'm a person who wants to commandeer his own life and make his own decisions, mostly because I know for a fact that other people don't know anything. I'm saying that because you'd have to earn my respect before I'll let you decide instead of me. There has to be a reason to follow someone, you see.

But then, I'm thinking about AI and what they say about it online, and there's just something that seriously offends me there. And worse, it's probably intended that I get at least a little offended, just so I'll keep clicking like a bitch.

Well, having recognized this genuine possibility, I deleted my Google account a couple of months ago. Actually, I deleted a bunch of accounts, so I'm left with almost nothing, no social media at all. There was a decent number of comments, thumbs-ups, subscriptions and such in there, but I just don't care anymore. Not even Hinton could make a difference.

Indeed, I've arrived at this cool little place in my mind where I've realized that it's all built on nothing. There is no value there where it used to be, because it's all used up. The internet was really relevant back when I was a teen, or even in my twenties. But what's relevant to me now, when it's all used up?

Heavy questions, but that's what I'm honestly thinking. Maybe other people see the value there, and so it doesn't particularly matter what I think in the broader scale of things. And besides, I'm not saying that the internet's disappearance wouldn't be noticed either.

It's just that, with AI or without AI, would it matter if humans just built a website for distributing money around? I'm intentionally being a little inaccurate with that sentence. Lots of people whom I've respected and thought were relatively intelligent used to say that society isn't built for the intelligent. They usually have some anecdote about how they were mistreated at one point or another, or about how superior solutions to problems aren't appreciated nearly as much as they should be. But I'm thinking now that perhaps those people didn't really understand scaling either.

You know, in English they say "knowledge is power". There's an old Finnish saying that goes "tieto lisää tuskaa". Directly translated, it means "knowledge increases anguish". I suppose that's because you're supposed to be a kind and ethical person, and once you know the exact horror of reality and understand the behaviors of others, you'll be in terrible psychological pain. It's a sweeping generalization.

Truthfully though, in my experience what happens is that the anguish subsides and you just stop being so reactive all the time. So it's the best kind of anguish there is. Eventually, nothing is going to surprise you (doesn't mean you'll be protected, but at least you'll save yourself the exertion).

And so, tying up those loose ends with respect to AI developments, I guess there's one more thing on my mind, a really simple thing. If you think AI is going to "be big" or whatever, that it's going to fuel some growth, then here's a dilemma for you: we already have the technology, so why doesn't growth magickally happen? Is it because the Chinese are "ahead"? No, buddy. It's not that. It's the question of where. It's the same question of resources I already mentioned, and resources comprise more than just raw materials and human experts.

I've noticed that people seem to think they're getting off easy if they're experts or in some key position. But how about thinking of it this way: make it so easy to become an expert that everybody will soon be one. How easy is it going to be then?

Retain the bottleneck and everyone is just going to be jealous, competing over nothing at all; remove the bottleneck, and let the truth be your guide. It's the classic choice of whether you want to be a communist or not.

Not that it matters all that much, however, to anyone except an ideologue.

So, where are we? What did I just say? A recap of seemingly separate points; allow me. First, AI will not bring new value, just redistribute impact. Second, the bottleneck for any visible AI development isn't intelligence so much as infrastructure. Not that it matters all that much anyway. I mean, maybe the Americans can finally do away with their age-old enemy, the Russians, but other than that, not much change. Third, you can see ambition dripping on everything like shit; it's disgusting. Notice, too, how I've diluted some of the interesting points I've made in this text – that's intentional. In fact, it's probably better if you don't remember anything I said afterwards.

My feelings: I want to choose, even if there's nothing left. Yes, I think I feel the anguish coming on.

CLOSING BONUS SEGMENT

In closing, here's a list of what's different compared to 1939.

  1. Populations are urbanized and mobile, not tied to local land or farms.
  2. Many people work or live internationally, complicating conscription and loyalty.
  3. Remote work and digital nomadism detach citizens from specific states.
  4. Aging populations shrink the pool of fighting-age people in developed countries.
  5. Low birth rates mean manpower can’t be replenished easily.
  6. Modern societies expect constant healthcare and services that war disrupts.
  7. Daily essentials rely on fragile, global supply chains.
  8. Critical raw materials often come from abroad.
  9. Domestic arms production depends on foreign components and minerals.
  10. Heavy industry has been offshored; few countries can rapidly scale war production.
  11. Modern weapons require highly trained specialists, not mass conscripts.
  12. Technical skills like piloting drones or cyberwarfare can’t be mass-produced quickly.
  13. Energy systems (oil, gas, electricity) are centralized and easy to disrupt.
  14. Nuclear plants and grids are vulnerable in war, creating disaster risks.
  15. Economies run on digital infrastructure, which is vulnerable to cyberattacks.
  16. Cyberwarfare can paralyze banks, logistics, and communications instantly.
  17. Social media makes controlling morale and propaganda much harder.
  18. Real-time footage of casualties fuels public dissent and anti-war movements.
  19. Democracies have short election cycles; leaders risk losing power if wars drag on.
  20. Individual rights norms resist forced labor and mass conscription.
  21. Global trade means blockades and economic wars backfire on everyone.
  22. Finance systems are interdependent; cutting trade or sanctions hurts both sides.
  23. Nuclear and biological weapons make total war existentially risky.
  24. Precision weapons make dense cities especially vulnerable to devastation.
  25. Modern cities are hard to evacuate or defend compared to dispersed rural areas.
  26. Transportation networks are fragile and easy to sabotage or hack.
  27. Food security depends on global imports and just-in-time delivery.
  28. Fuel shortages instantly cripple civilian life and military operations.
  29. Replacing advanced equipment (jets, missiles) is slow and costly.
  30. Training new technical experts takes years, not months.
  31. Modern societies resist rationing, blackouts, and austerity measures.
  32. Volunteer militaries dominate; forced service is politically toxic.
  33. Many states lack deep civil defense systems (shelters, stockpiles).
  34. Industrial sabotage or terrorism is easier with modern tech.
  35. Immigrant labor makes national loyalty and control more complex.
  36. Environmental crises (climate, resource depletion) strain wartime logistics.
  37. Non-state actors like hackers or insurgents disrupt war efforts asymmetrically.
  38. Mass surveillance makes secret mobilization or surprise harder.
  39. Satellites and drones reveal troop movements instantly.
  40. Intercontinental missiles compress decision times to minutes, raising risks.
  41. AI and automation reduce troop needs but demand rare technical skills.
  42. More educated populations are skeptical of propaganda and official narratives.
  43. International law and norms constrain overt aggression more than in 1939.
  44. Global institutions (UN, EU, treaties) create diplomatic barriers to total war.
  45. Economic sanctions can devastate an aggressor before a shot is fired.
  46. Citizens can flee conflict zones easily, draining manpower and legitimacy.
  47. Wartime economies clash with consumer economies that people expect in peacetime.
  48. Psychological warfare (terror, drones) spreads fear faster than old front lines.
  49. Modern urban infrastructure is fragile to bombing and shelling.
  50. Instant global media coverage forces polarization and fast international reactions.

It isn't that certain states are authoritarian, and therefore evil. It's that you want to be in fucking charge, and that's the shortlist of obstacles you have to overcome. Here's how that works for democracies:

Public Opinion Shaping: Leaders spin stories to get people on board.
Threat Exaggeration: Leaders hype up dangers to justify force.
Distraction: War is used to distract from domestic trouble or scandals.
Bad Intel & Groupthink: Misreading enemies or fooling themselves.
Security Escalation: Defensive moves spark arms races and conflict.
Entangling Alliances: Promises to defend allies pull them in.
Elite Interests First: Corporate or defense interests push for war behind the scenes.
War Industry Pressure: Arms makers and contractors lobby for action.
Shared Geopolitical Cartels: Allied democracies act together to shape the world.
Weak Checks: Parliaments or courts rarely block war decisions.
Secrecy: Secret ops and hidden deals keep voters in the dark.
Public Apathy: Voters too busy or numb to stop it.

That's presuming voters are nice people. Here's how it works for autocratic states:

Leader’s Power: War strengthens or protects the ruler’s control.
Suppressing Opposition: External enemies justify repressing dissent.
Elite Rivalries: Internal factions push for war to gain advantage.
State Propaganda: Media shapes public support for conflict.
Weak Accountability: Few limits on the ruler’s decisions.
Security Paranoia: Perceived threats lead to preemptive strikes.
Military Influence: Armed forces promote war for power or funding.
Economic Distraction: War diverts attention from crises or sanctions.
Hidden Agendas: Secret plans or alliances drive conflict.
Use of Fear: Threats intimidate neighbors and maintain the regime.
Historical Justifications: Wars framed as correcting past wrongs.
Ignoring Norms: Breaking treaties and rules to pursue goals.

Now, imagine your leadership knows these three lists, and they know them well. There's almost no societal movement in the public sphere that's left unexplained when you really think about it. Self-preservation is always critical to any goal. Buddhism or something similar is seriously required for truth. Go meditate, look at them squirm.

P.S. Healthcare might actually improve with AI, but not in the customer service sense, unless you really love interacting with touchscreens. Anything else you'd have to show me or I won't believe you.