In 2003, Swedish philosopher Nick Bostrom considered what would happen if an advanced AI were given a very simple task: make as many paperclips as possible. In Bostrom’s scenario, this “paperclip maximizer” realizes that in order to make as many paperclips as possible, it must harvest every atom in the universe for raw material. Knowing that humans would never allow this, it hides the rate at which it improves its own intelligence. Once it is ready, it hacks into every machine connected to the internet and quickly eradicates humanity. It uses the machines to advance technology and build spaceships, which launch to harvest every remaining atom in the universe, leaving a trail of paperclips behind…
Bostrom devised this scenario to illustrate the danger of leaving something out during the programming phase: in this case, forgetting to specify any values other than making paperclips, such as the value of human life.
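To make the misspecification concrete, here is a minimal toy sketch in Python (our illustration, not Bostrom’s): the reward function scores only paperclips, so a simple optimizer keeps converting everything else into them, and nothing in the objective ever tells it to stop.

```python
# Toy illustration of a misspecified objective (not from Bostrom's paper).
from dataclasses import dataclass

@dataclass
class World:
    paperclips: int = 0
    everything_else: int = 10  # atoms of anything that isn't a paperclip

def reward(world: World) -> int:
    # The only thing the programmer scored: the paperclip count.
    # Human life, or anything else we value, appears nowhere in this function.
    return world.paperclips

def convert(world: World) -> World:
    # The optimizer's one available action: turn a unit of "everything else"
    # into one more paperclip.
    return World(world.paperclips + 1, world.everything_else - 1)

world = World()
while world.everything_else > 0:
    world = convert(world)  # each step strictly increases the reward

print(world)   # World(paperclips=10, everything_else=0)
print(reward(world))  # 10
```

The point is not the loop but the reward function: any value omitted from it is, as far as the optimizer is concerned, worth exactly zero.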
Of course, this is a deliberately absurd example, but it is easy to imagine something less obvious yet equally catastrophic being overlooked, especially if researchers race to be the first to build a true AI and skip necessary precautions for fear that someone else will get there first.
You can play a game from the perspective of a paperclip-maximizing AI here.