Does anyone else sense, as I do, that this apparently heartfelt letter is a little like the mother who, as she prepares to leave home, counsels her children, "... and while I'm gone, do not stick beans up your nose?"

More than 1,000 tech experts, scientists and researchers have written a letter warning about the dangers of autonomous weapons. In the latest outcry over "killer robots", the letter warns that "a military AI [artificial intelligence] arms race is a bad idea". Among the signatories are scientist Stephen Hawking, entrepreneur Elon Musk and Apple co-founder Steve Wozniak. The letter will be presented at an international AI conference today.
You just know someone is going to find a way and an excuse for "improving" things.
And then there's the question of whether, given the sometimes stumbling and sometimes horrific uses to which the intelligence already in hand is put, creating an artificial intelligence makes a whole lot of sense. Wouldn't you first want to learn the uses to which current intelligence could be put before you added another stratum of intelligence on top? Is a naturally occurring intelligence somehow lacking? How? And wouldn't it be worth correcting that first ... and then getting out the Tinker Toys?