He doesn’t paint a pretty picture, either. Though we’re enamored with recent developments like Google Now or Siri, Hawking cautions against what the future could bring. He says that while “creating AI would be the biggest event in human history”, it could also be far more dangerous:
In the near term, world militaries are considering autonomous-weapon systems that can choose and eliminate targets; the UN and Human Rights Watch have advocated a treaty banning such weapons. In the medium term, as emphasised by Erik Brynjolfsson and Andrew McAfee in The Second Machine Age, AI may transform our economy to bring both great wealth and great dislocation.
Hawking, as you or I might, also fears the worst. He aptly points out that we have no method for reining in development. In a very Terminator way, Hawking fears machines will essentially become self-aware, saying “machines with superhuman intelligence could repeatedly improve their design even further”. We would have created intelligence, and in turn created evolution on a different scale.
Hawking also notes that “the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.” While most of us will watch Johnny Depp become a machine and gleefully giggle, Hawking is operating on another level. He makes salient points, too.
Those tasked with building the services we all love are also burdened with throttling them. A step too far could end up being a step into oblivion.