Discussion about this post

Dan Henry:

Very interesting read, but the truth is that it is highly unlikely humans and ASI will coexist, so working toward that goal, which is the GOAL, is suicide.

Also, when the software puts out wrong information, it isn't a "hallucination"; it is an error, a mistake. Trying to soften this fact reveals the hubris and ultimate idiocy behind this entire endeavor.

Joel J Miller:

"A deliberate rejection of premature certainty" is wise. Thanks for this.

34 more comments...