It’s about the object of the game.
“Mark it, dude.” This was the week that AI ran out of games to play.
AlphaStar, from the Google DeepMind team, has been declared the clear victor in StarCraft II play.
StarCraft II is a game with no clear path to victory, a massive play space, and imperfect information (the computer was not allowed to see the whole gameboard and had to move a camera as a human would).
Plus, the computer was slowed down to sub-human response speeds. It still destroyed us.
These researchers have achieved an amazing goal and deserve to be applauded.
So, with this challenge falling along with chess, Go, poker, and the rest, it’s pretty clear that AI can win at any game we throw at it. And I have no doubt that in ten years or less this kind of analytical and multitasking power will live in every phone on the planet.
So what is next? Well, the game that makes the most sense for researchers to turn to next is the game of Life. Not the big complex questions like weather prediction that computers already excel at. The smaller questions like: who should I marry? Where should I go on vacation? Should I ask for a raise?
Is it worth my time to make a friend, watch a sunset, or pet my dog?
Life has no clear objective, unless we give it one. Even survival is negotiable. And Life, even if it can be said to be “won,” is not a zero-sum experience. There are others here, and they matter too.
And yet, it will be built. A model of you. Not just to serve you ads and take your attention more effectively, though it will certainly do that. But to guide you through the world. To protect you. To give you a fighting chance in a world where the rich and powerful will have their own armies of AI to exploit you.
The problem with AlphaYou, as I will call the enhanced AI that will model and learn from you, is that it will become an oracle we cannot resist consulting on everything. And without careful curation, this will be its own form of prison.
Because there is only one reality we live in, we will not be able to confirm its advice or predictions. If AlphaYou tells us not to have children because we don’t want them and they are too expensive, and we go on to live a fulfilling life, how do we really know we made the right choice? We won’t, but I predict people will say “well, it’s better than nothing” and keep using it.
The truth is, we don’t know the answer to that question even without an AI presence chatting in our ears. And that’s the joy. Thinking that our choices are similar to choices in a game is a grave category error, since those choices actually form the basis of our connection to all existence outside of our Singleton.
I am not saying we should ban this kind of research. We can’t. Because truthfully, this might be the only actual Moral AI, since it will be focused on you as an individual and maximizing your existence. You will still be making choices, and will be responsible for their outcomes. And such a guardian angel might be needed to inform you when you should change jobs, get out of the path of a tornado, avoid clicking on something, or even detect nefarious AI or bots behind disinformation campaigns.
But will it be an advantage or a crutch? Ask the same question about roads. If they were not built, could you even have civilization? Would you even be here? Of course not. And yet nobody questions whether roads are moral, even though they kill millions and allow the rapid spread of disorder. The benefits are too immediate.
So it will be with AlphaYou. The calendar and contacts and social networks on your phone are just little dirt paths compared to the superhighways of AI to come. It is up to our current legal and political sphere to start taking its responsibility to protect us seriously, and to ensure that, just as roads are regulated, so are models of you.
We must have access to our models. We must ensure their purity of purpose, or all is lost.
We are Singletons. Prepare to fight for your distinction as a singular biological entity. If AI can be used to help that fight, we would be foolish to deny it.
Painting by Simon Stålenhag