Missed Shot at Artificial General Intelligence

  • If superhuman intelligence requires persistently perfect training data, then perhaps we should admit to ourselves that LLMs are fundamentally incapable of attaining "AGI".