ABU DHABI — The Russian company that gave the world the iconic AK-47 assault rifle has unveiled a suicide drone that may similarly revolutionize war by making sophisticated drone warfare technology widely and cheaply available.
The KUB is four feet wide, can fly for 30 minutes at a speed of 80 mph, and carries six pounds of explosives, the company's news release says. Roughly the size of a coffee table, it can be guided to explode on a target up to 40 miles away, making it the equivalent of a “small, slow and presumably inexpensive cruise missile,” according to a report by the National Interest website.
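The 40-mile figure follows directly from the claimed specs. A back-of-the-envelope sketch, using only the numbers from the release and assuming a one-way, straight-line flight at top speed:

```python
# Figures as claimed in the news release.
speed_mph = 80         # claimed top speed
endurance_hours = 0.5  # claimed 30-minute flight time

# One-way maximum range under those assumptions.
max_range_miles = speed_mph * endurance_hours
print(max_range_miles)  # → 40.0
```

Real-world range would be lower once wind, maneuvering, and a search pattern over the target area eat into that half hour, but the order of magnitude stands.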
Drones like this don’t actually need human pilots; they need only simple algorithms and GPS coordinates, or mobile-device signatures, to find a target.
With 100 of these drones, fully autonomous and carrying incendiary payloads, I am fairly sure a single individual could surpass the destruction of 9/11.
Not to mention, of course, that devices like these will inspire people to refit commercial drones with explosives and worse, bringing about further chaos.
So, what does this have to do with AI? My point is this: just as empowering AI should be considered as dangerous as building a weapon of mass destruction, and opposed by all countries and people everywhere, this kind of device should be immediately opposed by all civilized nations.
How can chemical and biological weapons be banned, but not devices that will spread random chaos and terror? Devices that seem purpose-built to let anyone anonymously target crowds of innocent people?
And perhaps our world leaders will think twice when they realize that once these devices exist in the wild, they will never be able to safely go outside again.
The idea that a company is building these for profit to distribute to militaries around the world is a sickening indictment of capitalism. Because there is a free market for terror and dominance, and the free market for peace does not intersect with it.
If these are military tools, then limit their use to militaries under oversight, bound by the Geneva Conventions, and enforce this by making each device traceable even after detonation. Because just like AI, individual people need to be held responsible for the misuse of technology.
I will write more about the idea of Hypocrisy, but for now, I say this: it is not Hypocrisy to support a strong military that uses drones while also believing weapons like this are immoral. Because scale matters. Guns may be amoral, but the saturation of firearms in a society might be considered immoral because of the threat it poses to all. Scale matters, availability matters, cost matters. And a million-dollar military drone, despite all the horror and mistakes made with them, is a different ethical creature than a $500 one.
Trust no robot, and certainly do not trust Kalashnikov.