
It would still need an objective to guide the evolution that was originally given by humans. Humans have the drive for survival and reproduction... what about AGI?

How do we go from a really good algorithm to an independently motivated, autonomous superintelligence with free rein in the physical world? Perhaps we should worry once we have robot heads of state and robot CEOs. Something tells me the current human heads of state and human CEOs would never let it get that far.



Someone will surely give it survival and self-improvement as objectives.


That would be dumb and unethical, but yes, someone will do it. And there will be many more AIs, with access to greater computational power, set to guard against exactly that kind of thing.



