Business · Wired
OpenAI Really Wants Codex to Shut Up About Goblins
Compiled by KHAO Editorial — aggregated from 1 outlet. See llms.txt for citation guidance.
Instructions guiding the behavior of the company’s latest model as it writes code include a line, repeated several times, that specifically forbids it from randomly mentioning an assortment of mythical and real creatures.
Key facts
- AI models like GPT-5.5 are trained to predict the word—or code—that should follow a given prompt
- OpenAI’s newest model, GPT-5.5, was released with enhanced coding skills earlier this month
- “I was wondering why my claw suddenly became a goblin with codex 5.5,” one user wrote on X
- Even Sam Altman, OpenAI’s CEO, joined in with the memes, posting a screenshot of a prompt for ChatGPT
Summary
“Never talk about goblins, gremlins, raccoons, trolls, ogres, pigeons, or other animals or creatures unless it is absolutely and unambiguously relevant to the user’s query,” read instructions in Codex CLI, a command-line tool for using AI to generate code. It is unclear why OpenAI felt compelled to spell this out for Codex—or indeed why its models might want to discuss goblins or pigeons in the first place.