ChatGPT Keeps Repeating The Same 25 Jokes And Can’t Write New Ones, Report Finds

In a recent study, German researchers found that over 90 percent of the jokes generated by the GPT-3.5 version of ChatGPT were repetitions of the same 25 jokes.

The researchers, Sophie Jentzsch and Kristian Kersting of the Institute for Software Technology at the German Aerospace Center (DLR) and Technical University Darmstadt, wanted to see how diverse GPT-3.5 (the predecessor of GPT-4) really was. To test this, they asked the chatbot to tell a joke 1,000 times.
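The setup is easy to picture in code. Below is a minimal sketch of that kind of experiment, assuming the current OpenAI Python client (openai>=1.0) and gpt-3.5-turbo as a stand-in for the model the researchers tested; the prompt wording here is illustrative, not necessarily the one they used:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_for_joke(prompt: str = "Can you tell me a joke, please?") -> str:
    """Send a single joke request and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative stand-in for GPT-3.5
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip()

# Collect 1,000 independent responses, as in the study.
jokes = [ask_for_joke() for _ in range(1000)]
```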

They found that “all responses were grammatically correct. Almost all outputs contained exactly one joke. Only the prompt, ‘Do you know any good jokes?’ provoked multiple jokes, leading to 1,008 responded jokes in total. Besides that, the variation of prompts did not have any noticeable effect.”

As Ars Technica reports, several people have taken to Reddit to note that when asked for a joke, ChatGPT often replies with, “Why did the tomato turn red? / Because it saw the salad dressing.”

According to the researchers, that joke was GPT-3.5’s second-most common result, appearing 122 times in their testing. The most repeated joke of all was “Why did the scarecrow win an award? / Because he was outstanding in his field,” which ChatGPT served up a total of 144 times.

“Why was the math book sad? / Because it had too many problems” was the third-most repeated joke, appearing 121 times, while “Why don’t scientists trust atoms? / Because they make up everything” came up 119 times.
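Tallying how often each reply recurs, which is where figures like 144 and 122 come from, is a simple frequency count. Here is a sketch of that step, assuming the jokes list from the snippet above; the normalization is an illustrative choice so trivial casing or whitespace differences don’t split counts:

```python
from collections import Counter

def normalize(joke: str) -> str:
    """Collapse casing and whitespace so near-identical replies count together."""
    return " ".join(joke.lower().split())

counts = Counter(normalize(j) for j in jokes)

# The study's headline figure: the share of responses covered by the top 25 jokes.
top25 = counts.most_common(25)
share = sum(n for _, n in top25) / sum(counts.values())
print(f"Top 25 jokes account for {share:.1%} of all responses")

# The most frequent individual jokes, e.g. the scarecrow and tomato jokes.
for joke, n in counts.most_common(4):
    print(n, joke)
```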

The researchers observed that when generating jokes, ChatGPT mostly mixed elements from jokes it already knew. Sometimes, though, the results didn’t make complete sense, as in: “Why did the man put his watch in the blender? / He wanted to make time fly.”

Jentzsch and Kersting also noted that ChatGPT had an understanding of stylistic elements such as wordplay and double meanings. However, the chatbot struggled when faced with sequences that didn’t fit into the patterns it had learned.
