Showing posts from December, 2024

Hallucinations

Image created by Dall-e: https://chatgpt.com/g/g-2fkFE8rbu-dall-e/c/6772fc7f-2754-800d-ab66-c2271a4fbde9

I always struggle a bit when I'm asked about the "hallucination problem" in LLMs. Because, in some sense, hallucination is all LLMs do. They are dream machines. We direct their dreams with prompts. The prompts start the dream, and based on the LLM's hazy recollection of its training documents, most of the time the result goes someplace useful. It's only when the dreams go into deemed factually incorrect territory that we label it a "hallucination". It looks like a bug, but it's just the LLM doing what it always does.

— Andrej Karpathy

I hate the term "hallucination" when applied to Large Language Models (LLMs). Hallucination implies that there is some sort of entity that is mis-perceiving reality, like a teenager in a bad 60s LSD movie. In one sense LLMs can't hallucinate: there is no entity to mis-perceive anything. When an LLM pro...

2024

2024 was a difficult year for me. Bev Thompson died on May 17. Bev was my life partner, soulmate, wife, guide, and guardian angel. I'm going to try to write more in 2025.