ChatGPT makes big mistakes?
Even the simplest questions, ones I could just google, get answered incorrectly.
According to ChatGPT, Günther Jauch is 65 years old (FALSE) and not only a presenter, but also a pop singer (FALSE).
Why can't this thing answer the simplest questions?
For me, everything comes out correct:
You
who’s günther jauch?
ChatGPT
Günther Jauch is a well-known German TV presenter and journalist. He was born on 13 July 1956 in Münster and has worked in various TV formats over the course of his career. He became famous mainly through hosting the quiz show “Wer wird Millionär?” (the German version of “Who Wants to Be a Millionaire?”), a format that became popular worldwide. Jauch has also taken part in other TV shows and political discussions and is considered one of the most respected presenters on German television.
You
Is günther jauch a hit singer?
ChatGPT
No, Günther Jauch isn’t known as a hit singer. His career has focused mainly on presenting television shows and on journalism. Although he is active in the entertainment industry, he is not known for a singing career or for music. His fame comes mainly from his work as a TV presenter, in particular from hosting the quiz show “Wer wird Millionär?” (“Who Wants to Be a Millionaire?”).
You
How old is günther jauch?
ChatGPT
Günther Jauch was born on 13 July 1956. As of 2024, he would be 68 years old.
Heh, okay. For me it’s different from day to day.
I always thought he was a hit singer. 😂 At least I didn’t want to believe that he isn’t.
Because it’s still an AI in its development phase. I also rarely use ChatGPT, as it has let me down beyond measure with some really stupid answers.
Because an AI does not reproduce facts with 100% accuracy; rather, it tells whatever story seems most likely to it, based on the material it was trained on.
It is a fabrication that comes close to the truth, but it is flawed.
That’s why you should never trust an AI.
So, I thought we had all learned by now how ChatGPT works and why it is not suitable for this type of question.
ChatGPT has read through a huge amount of text, on the basis of which it now tries to write answers that look as if a person had written them. Whether those answers are actually correct is secondary; what matters above all is that the grammar and syntax are right.
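To make that concrete, here is a deliberately tiny toy sketch. It is a bigram model, nothing like ChatGPT’s real architecture, and the “training corpus” is made up, but it shows the core idea: a model trained only on which word follows which produces fluent text with no notion of whether it is true.

```python
import random
from collections import defaultdict

# Tiny "training corpus" containing one wrong claim, just like messy web text.
corpus = (
    "günther jauch is a presenter . "
    "günther jauch is a singer . "
    "günther jauch presents a quiz show . "
).split()

# Learn which word tends to follow which (a bigram model).
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

# Generate text by repeatedly sampling a plausible next word.
random.seed(1)
word, output = "günther", ["günther"]
for _ in range(6):
    word = random.choice(following[word])
    output.append(word)

print(" ".join(output))
# Prints something grammatical such as "günther jauch is a singer ." --
# fluent-looking output, but the model never checked whether it is true.
```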
In principle, this is the same reason why ChatGPT cannot do math: if you ask what 1+1 is, the right result comes out. That one is easy to guess, and it also appeared often in the training material. With huge formulas, it is no longer so easy to guess the right result.
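For comparison, ordinary code computes instead of guessing; the two numbers below are arbitrary examples I picked for illustration.

```python
# Python evaluates math exactly; an LLM only predicts plausible digit tokens.
a, b = 48_237, 91_554

print(1 + 1)    # 2 -- an LLM gets this right too, but mostly because
                # "1+1=2" shows up countless times in its training text
print(a * b)    # 4416290298 -- computed, not guessed; an LLM has to
                # produce these digits token by token and often gets
                # some of the middle digits wrong
```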
But for other people the right result comes out, so how can that be? Apart from the fact that your prompt might be phrased clumsily, a certain amount of randomness is added so that the same answer does not always come out, because GPT is basically deterministic, and always giving identical answers would not fit the character of human language. A rough sketch of that randomness follows below.
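In this sketch the candidate tokens and their scores are made up (real models score a vocabulary of tens of thousands of tokens), but the mechanism is the standard one: scores are turned into probabilities with a temperature-scaled softmax, and one token is sampled.

```python
import math
import random

def sample(logits, temperature, rng):
    """Softmax with temperature, then draw one index from the distribution."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                               # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    weights = [e / total for e in exps]
    return rng.choices(range(len(weights)), weights=weights)[0]

# Made-up candidate tokens and scores for "Günther Jauch is ... years old".
tokens = ["68", "65", "70"]
logits = [2.0, 1.2, 0.1]    # "68" is scored as the most likely continuation

for t in (0.1, 1.0, 2.0):
    picks = [tokens[sample(logits, t, random.Random(i))] for i in range(10)]
    print(f"temperature={t}: {picks}")
# At temperature 0.1 the model behaves almost deterministically ("68"
# nearly every time); at higher temperatures the wrong "65" slips in.
```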
For example, the age could in rare cases be drawn from an outdated source that correctly claimed he was 65 at the time, which is no longer true now that he is 68. ChatGPT cannot think; it is just trying to talk the way we do. It does not know who Günther Jauch is or what a hit singer is. It has to rely on the knowledge it has read in, and if that knowledge contains ambiguous information, it can be repeated incorrectly. That is what hallucinations are: an LLM producing untruths without knowing it.
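The age itself is trivial arithmetic that a few lines of code get right every time, which is exactly what a model repeating an outdated training snippet does not do. The reference date here is just an arbitrary date after his 2024 birthday.

```python
from datetime import date

birthday = date(1956, 7, 13)     # Günther Jauch's date of birth
today = date(2024, 9, 1)         # any date after his birthday in 2024

# Standard age computation: subtract one year if the birthday
# has not yet occurred in the current year.
age = today.year - birthday.year - (
    (today.month, today.day) < (birthday.month, birthday.day)
)
print(age)   # 68 -- "65" was only correct between July 2021 and July 2022
```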
In addition, you probably asked the question in German, and German makes up under 6% of the content in the Common Crawl dataset (i.e. one of the datasets ChatGPT was trained on, among others), which means ChatGPT is noticeably worse in German than in English anyway.
So you should either provide the LLM with material yourself (which works best), or have the material fetched by using a chatbot with internet access such as Bing Chat / Microsoft Copilot or Google Bard, or you just pay for ChatGPT, or you simply use a search engine: most major search engines are clever enough to display this kind of information in info boxes. That is faster and more reliable. A rough sketch of the “provide material yourself” approach follows below.
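As a minimal sketch of providing material yourself: paste verified facts into the prompt so the model summarizes instead of guessing. The `ask_llm` function here is a made-up placeholder, not a real API.

```python
# Ground the model by putting the verified facts directly into the prompt.
facts = """\
Günther Jauch, born 13 July 1956 in Münster, is a German TV presenter
and journalist, best known for hosting "Wer wird Millionär?"
(the German "Who Wants to Be a Millionaire?"). He is not a singer.
"""

question = "How old is Günther Jauch, and is he a hit singer?"

prompt = (
    "Answer ONLY using the facts below. If the facts do not contain "
    "the answer, say that you do not know.\n\n"
    f"Facts:\n{facts}\n"
    f"Question: {question}"
)

def ask_llm(prompt: str) -> str:
    # Placeholder: send `prompt` to the chatbot / API of your choice.
    raise NotImplementedError

print(prompt)   # with the facts pasted in, the model has far less room to hallucinate
```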
Where it doesn’t have an answer, it just makes one up!
https://www.youtube.com/watch?v=qc3VDE3Swak