"Do not hallucinate": Testers find prompts meant to keep Apple AI on the rails
2 points by chha 8 months ago | 1 comment

zahlman 8 months ago:
Doesn't the concept of AI "hallucination" arise from observing previous LLMs? Would the training data for current LLMs include anything that would let them build a model of what such hallucinations entail?