How does one detect hallucinations?

  • A fun uphill battle, since LLMs aren't trained on entirely factual data from the beginning, and I mean the very start. They are not fact-regurgitating programs. People will make a lot of money claiming they have this solved, but all they really have are band-aids. Personally, I don't want a factual LLM; I want one that sparks my own creativity, and they already do that. Hallucinations are a feature, not a bug.