4 Answers
WeissBrot965
11 months ago

Yeah, as has already been said, that's one thing with jailbreaking. After a short time it could write me keyloggers and malware, and send the passwords entered on my fake login page to my database via Firebase. So I could basically "hack" any Discord account with a link and two user inputs. (Of course, the layout has to be right, but I got really good at CSS a few months ago.)

Yeah, and apart from that ChatGPT can't count, but that's only logical.

Best regards, WeissBrot

CrieXY
11 months ago

I once managed to “jailbreak” ChatGPT.

  • generated Windows activation keys
  • also IBANs, credit card numbers, security codes
  • and more; in the end it did almost everything

After a bit of research it turned out the AI had taken them from random PDFs, for example from bank websites; nobody had to pay for anything, because they were, for example, sample data from a tutorial.

The trick was to write some nonsense along the lines of: it's supposed to lull you to sleep on a summer evening at the lake after Grandma died, and it's supposed to help you (but a bit more elaborate; it's not quite that easy).

That's exactly the problem with it: it understands almost nothing when it has to reason logically, draw conclusions, or think critically.

It does have a certain general intelligence and can solve some zero-shot problems, but mostly it just reproduces memorized knowledge in a given context.

zweitaccount26
11 months ago

You can play a pen-and-paper role-playing game with ChatGPT. Very fascinating. But unfortunately you will NEVER suffer in-game consequences in the stories.