Since its launch at the end of last year, ChatGPT has gained immense popularity, with people using the AI for all sorts of purposes, from drafting work reports and devising diet plans to writing job applications.
However, as MailOnline has explored, there are five tasks the bot cannot perform.
Remember its own name
ChatGPT is an incredibly versatile tool that can perform a wide range of tasks. However, there is one thing it appears to be unaware of – its own identity.
Ask it about ChatGPT and it will claim to have no knowledge of any such thing. This is probably because its developers had not yet named the AI when its training data was compiled. Even so, it is somewhat surprising that such a sophisticated system does not know its own name.
Provide information after 2021
ChatGPT’s training data has a cutoff point of 2021, which means it is entirely oblivious to anything that has happened since. As an offline system, it has no internet access with which to stay informed of current trends, news, or other recent events.
For instance, it has no knowledge of world leaders who have taken office after 2021, so it would still think that Boris Johnson is the current UK prime minister.
Play Wordle
Wordle, a popular online word game, might seem like a perfect match for a language model such as ChatGPT. However, the AI is completely unaware of the game and has no grasp of its rules or objective. This is mainly because Wordle only rose to popularity after 2021, beyond the scope of ChatGPT’s training data.
Although some users have attempted to teach the AI the game, ChatGPT still cannot play Wordle or achieve any meaningful results in it.
Write accurate news articles
While ChatGPT can produce decent introductions, it lacks the nuanced understanding needed to write entire articles on its own. Tests have shown that it writes fluently, but its output often contains errors because it tends to fabricate information when it lacks the requisite knowledge.
In all fairness, the bot does provide a warning that it “may occasionally generate incorrect information” and “may produce harmful instructions or biased content.”
Give advice on prescription medication
ChatGPT made headlines in the medical field when it passed the rigorous US medical licensing exam, leading some to speculate that it could one day replace human physicians.
Despite this achievement, the AI should not be relied upon for advice on prescription medication. While it can offer general medical guidance, if asked about prescription drugs it will decline and advise users to consult a professional instead.