He had me at "egg".
"[AGI] interprets the Turing Test as an engineering prediction, arguing that the machine “learning” algorithms of today will naturally evolve as they increase in power to think subjectively like humans, including emotion, social skills, consciousness and so on. The claims that increasing computer power will eventually result in fundamental change are hard to justify on technical grounds, and some say this is like arguing that if we make aeroplanes fly fast enough, eventually one will lay an egg."
From the book Moral Codes: Designing Alternatives to AI, by Alan Blackwell
https://moralcodes.pubpub.org/