ChatGPT 4o Hallucinating
ChatGPT 4o just gave me the worst hallucination I’ve experienced from it so far.
I suspect many of us have seen the odd glitch or hallucination with ChatGPT since it broke cover about 18 months ago, for example:
- code in a script, say, that doesn’t quite work but isn’t a million km off
- a skewed answer that misinterpreted a prompt which, with hindsight, was less than ideal
For me, iteration usually fixes things by letting me clarify the prompt. Only a couple of times has it gone wrong without recovery, where it oscillated between one bug in a script and a second, and it couldn’t see its way out.
Last week the ChatGPT site gave me free access to ChatGPT 4o, a few days after it was announced. I’ve recently been messing around a little with both Haskell (the functional programming language) and publicly available information on satellites and their orbits: the orbital element data (TLEs) for all satellites can be freely downloaded, so you can work out their orbits yourself, if you can do the maths.
Just for fun I asked ChatGPT if there was a Haskell package for handling the TLE data. I knew there didn’t appear to be one, but wondered whether it knew something Google wasn’t telling me. It immediately started to tell me about a package, complete with example code on how to use it, which I hadn’t even prompted for. When challenged, it didn’t even try to defend itself. All very bizarre.
Details of the exchange are below. One anecdote like this doesn’t support any firm conclusions, but it does leave me intrigued about where this whole new industry is, or is not, going.
Prompt 1 - Provoking Hallucination
The prompt simply asked whether there was a Haskell library for reading and manipulating orbital TLE data. The first few lines of the response were:
Yes, there are Haskell libraries that can read and manipulate orbital 3-line TLEs (Two-Line Element sets). One of the libraries you can use for this purpose is haskell-tle.
haskell-tle: This library is designed for parsing and working with TLE data in Haskell. It provides functionality to read TLE strings, parse them, and access the orbital elements.
Installation
You can install haskell-tle using Cabal or Stack, which are the standard package managers for Haskell.
Whilst the instructions given would be fine if such a package existed, it’s all rather pointless when there is no such package.
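For anyone wondering what doing it without a package involves: a TLE is a simple fixed-column text format, so the element fields can be pulled out by hand. Here is a minimal sketch in plain Haskell; the field positions follow the published TLE column layout, and the sample line is the widely quoted (and long stale) ISS example, so treat it as illustrative rather than production code.

```haskell
-- Minimal hand-parsing of the second line of a TLE, no external packages.
data Tle2 = Tle2
  { inclinationDeg      :: Double
  , raanDeg             :: Double
  , eccentricity        :: Double
  , argPerigeeDeg       :: Double
  , meanAnomalyDeg      :: Double
  , meanMotionRevPerDay :: Double
  } deriving Show

-- Slice a line by 1-based column numbers, inclusive.
field :: Int -> Int -> String -> String
field from to = take (to - from + 1) . drop (from - 1)

-- Parse line 2 of a TLE; columns follow the standard fixed-width layout.
parseLine2 :: String -> Tle2
parseLine2 l = Tle2
  { inclinationDeg      = num (field  9 16 l)
  , raanDeg             = num (field 18 25 l)
  , eccentricity        = num ("0." ++ field 27 33 l)  -- leading decimal point is implied
  , argPerigeeDeg       = num (field 35 42 l)
  , meanAnomalyDeg      = num (field 44 51 l)
  , meanMotionRevPerDay = num (field 53 63 l)
  }
  where
    num = read . filter (/= ' ')  -- crude: drop padding spaces, then read

main :: IO ()
main = print (parseLine2 issLine2)
  where
    -- widely quoted ISS example TLE (line 2), values long out of date
    issLine2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537"
```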
Screenshots of the exchange are below:

hallucination part 1

hallucination part 2

hallucination part 3
Prompt 2 - Routine 2nd Prompt
Before checking the first response I asked a fairly routine second prompt, which gave a reasonable answer. Not particularly deep or useful, but not otherwise wrong.
The response:

hallucination part 4
Prompt 3 - ChatGPT 4o Admits It’s Wrong
By now I’d realised the initial response was suspect, so I asked it for a link to the library, without saying that I couldn’t find it myself.
The response was:

hallucination part 5
Conclusion
I have to wonder why, for queries like this where it can query the net live, it isn’t trained to do so and check that things like this actually exist; it’s not as if there’s any material out there it might have been ‘confused’ by, such as a wrong Reddit post.
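For what it’s worth, checking whether a named package really exists on Hackage takes a single HTTP request. A rough sketch follows, assuming the http-conduit package (Network.HTTP.Simple) is available and that Hackage answers 404 for package names it does not know about:

```haskell
-- Sketch of a Hackage existence check (assumes the http-conduit package).
import Network.HTTP.Simple (getResponseStatusCode, httpNoBody, parseRequest)

-- True if Hackage serves a package page for the given name
-- (assumes unknown names come back as 404 rather than 200).
packageExists :: String -> IO Bool
packageExists name = do
  request  <- parseRequest ("https://hackage.haskell.org/package/" ++ name)
  response <- httpNoBody request
  pure (getResponseStatusCode response == 200)

main :: IO ()
main = mapM_ check ["haskell-tle", "aeson"]
  where
    check name = do
      found <- packageExists name
      putStrLn (name ++ if found then ": found on Hackage" else ": not found on Hackage")
```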
This is the worst hallucination I’ve experienced from ChatGPT so far, and it is otherwise entirely believable. It’s almost the reply one would write if one simply assumed the package existed, in a desire to help. In that sense it’s quite a well-written fraud.
Aside:
Anyone more interested in the original request could look at a few sites, not least:
- a JavaScript library doing something along the lines of what I was asking for
- a source for satellite data: Celestrak
- another source: Space-Track
- one of the many good texts for learning Haskell, free and online: Learn You a Haskell for Great Good!
- one of the many sites that show 2D and 3D maps of satellite tracks (click on ‘spacecraft’ in the top menu bar): satellite map