ChatGPT is the butt of many memes and jokes, and it’s also a frequent target of trickery. The latter is the case here: OpenAI’s chatbot was tricked into generating keys for Windows 95.

[Image: A screenshot from our retro review of Windows 95.]
ChatGPT has guardrails in place to stop it from performing certain tasks, and generating product keys is one of them. YouTuber Enderman got around the limitation by tricking ChatGPT: rather than asking for keys outright, he asked the chatbot to generate strings that matched the known format of Windows 95 keys. That structure has been documented for years, but ChatGPT failed to connect the dots.
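The format Enderman described to ChatGPT has long been documented by the retro-computing community. As a rough illustration, here is a minimal Python sketch of a generator that follows the commonly cited rules for OEM-style keys; the exact constraints (the day/year fields, the leading zero, the divisible-by-7 digit sum) come from community write-ups rather than any official Microsoft documentation, so treat them as assumptions.

```python
# A sketch of the widely documented Windows 95 OEM key layout:
# DDDYY-OEM-0XXXXXX-ZZZZZ. The field names and digit rules below follow
# community write-ups, not any official Microsoft specification.
import random

def make_win95_oem_key() -> str:
    day = random.randint(1, 366)  # DDD: day of the year, 001-366
    year = random.choice([95, 96, 97, 98, 99, 0, 1, 2, 3])  # YY: 95-03
    # Middle segment: seven digits starting with 0 whose sum must be a
    # multiple of 7 -- the arithmetic constraint ChatGPT kept fumbling.
    while True:
        body = [0] + [random.randint(0, 9) for _ in range(6)]
        if sum(body) % 7 == 0:
            break
    tail = random.randint(0, 99999)  # ZZZZZ: effectively arbitrary digits
    return f"{day:03d}{year:02d}-OEM-{''.join(map(str, body))}-{tail:05d}"

print(make_win95_oem_key())
```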

When asked outright, ChatGPT claimed that it could not generate Windows 95 keys, even though, as it turned out, it could be tricked into doing exactly that. Once the request was reworded, it produced “random” numbers that were really attempts at valid Windows 95 keys. Only a handful of those attempts actually worked, but the failures came down to a mathematical limitation of ChatGPT, not some security feature. It is interesting, however, to see that ChatGPT can be tricked into doing such a task.
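The limitation in question is simple arithmetic. In the commonly cited key format, the digits of the long middle segment must add up to a multiple of 7, and large language models are notoriously unreliable at that kind of sum. The hypothetical snippet below illustrates the odds ChatGPT was up against: even purely random digits satisfy the rule only about one time in seven, so a model that can’t actually perform the addition will produce mostly invalid keys.

```python
# Rough illustration of why "random" digits usually fail the check: the
# digit-sum rule (assumed here, per public write-ups) is satisfied by
# only about one in seven random seven-digit strings.
import random

trials = 100_000
hits = sum(
    1
    for _ in range(trials)
    if sum(random.randint(0, 9) for _ in range(7)) % 7 == 0
)
print(f"pass rate: {hits / trials:.3f}")  # roughly 0.143, about 1 in 7
```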

A similar process could be used to bypass other guardrails. Chatbots, after all, don’t understand anything on an intellectual level. Instead, they are trained on huge amounts of conversation and text, and they generate responses that are statistically likely to look correct.

After successfully using ChatGPT to generate Windows 95 keys, Enderman thanked the chatbot for its help.
Of course, the chatbot had generated keys, but again, it doesn’t understand things.