This is because, as Yudkowsky writes:
"If somebody builds a too-powerful AI, under present conditions, I
expect that every single member of the human species and all biological
life on Earth dies shortly thereafter."
which strikes me as a bit incomprehensible.
Marco