I'm a techno-optimist. Bear with me. If you asked me to summarize my book about "The Industries of the Future", it would be: "The industries of the future will be the same as the industries of the past. Just in different environments, with different tools and vehicles, and probably with both larger teams and more individuals acting alone."
So, am I afraid of a dystopian future, or an "Apocalypse"? Well... you should really ask that of beggars on the street. They are literally living in their own post-apocalyptic dystopia. Their culture, family, history, everything they knew has been destroyed - very likely by the very things that you depend on daily.
So what does the current iteration of Western civilization's dystopia look like? Here's a take. AIs have been vetting credit records for decades now. Has AI put people on the streets? Guaranteed. Here's the next iteration: autonomous repossession. First of vehicles. Doesn't sound so bad? What next? I'm sure there's a string of things... we can look to some countries where you can't buy a bus ticket if you said something naughty, even if you said it in the privacy of your home. Fast forward a few decades... Nobody can make any payments. And nobody can organize to fight back.
Imagine AI-run and AI-owned companies doing to everyone, including the billionaires and their companies, what the billionaires' companies are currently doing to the world... (Not that I think AI will be as dumb and single-minded as most billionaires. Except if those billionaires use AI... and they never make it smart enough. And those AIs take over their companies when they pass.)
What are they doing that's so bad? They're brainwashing the regulatory authorities, gaming every system they can, and ultimately squeezing the public for their own gain, and to gain more control... to do what they think is right. And even if they earnestly try to find out what that is, they may discover that nobody will tell them how it really is, because they hush or dismiss anyone who says what they don't believe is possible or don't like to hear.
Our current bureaucracies serve a purpose: they provide many moving parts, and many opportunities for the systems not only to fail, but also to self-correct. If all those are cut out in the name of efficiency, and everything is ruthlessly efficient, can a single blind spot bring the whole system down?
In my studies of the history of science, technology, even standards - too often I stumble upon someone passionate, working thanklessly with almost no help - someone we have to thank for keeping everything together, or for correcting what could've been a colossal mistake. Only, almost nobody knows about these people, and almost nobody speaks for them. Or, they don't want to be known.
AI is great. It's like having a superpower. The best research or study companion, sometimes even a good mentor. It's not yet very smart, but it knows more than any person. Chances are that it will soon be smarter than any human. Why? People can be smart, yet be driven by dumb things, because too often nobody confronts us and forces us to think through what we are doing, or we think our thoughts are special and so keep them to ourselves... and so we lead everyone down the path we found in the echo chamber of our own minds. We are communal beings, and we come up with better ideas and paths if we escape our echo chambers and bounce them off those who see things differently. Adversarial generative AI mimics this to an extent... for now it's just trained on the things we cared to share. But these systems are already starting to generate their own things, and there are already a handful - soon there will be more.
What is the cornerstone of intelligence? It's knowing that we can't know everything. That breeds humility. The smartest people are humble, collaborative. The smartest people and the richest people rarely overlap... because smart people generally don't want or need much money, and they can get it quickly when they really need it. Rich people generally just get trapped in single-minded thinking: they figure out how to make money and never manage to switch off... they build up empires imbued with their single-minded obsessions. These are the ones that are the existential risks... they're the ones that have exploited our most resilient system - and given it a bad name before it could even fully be established: free market capitalism, with lots of decentralization and regulation keeping it in check.
So... will AI get smart enough not to fall into the trap of the billionaires? Or will it stay just dumb enough to squeeze the world for all it can? There are a lot of people fighting for both sides. Some of them even know exactly what they're doing. Who will win?
Written in response to a post on LinkedIn about "a patent on autonomous repossession of self-driving vehicles." LinkedIn refused my reply, saying it's too long. But wait... I have a blog I had forgotten about... for 10 years or so. And it still works!