5 Answers
EinAlexander
9 months ago

what limits should be set?

Not really any.

It would be conceivable to restrict advice on subjects that may only be handled by qualified professionals.

But then you would also have to ban every forum and every self-help group.

Alex

Benedikt581
9 months ago

Chatbots should not give medical advice, because only licensed doctors are permitted to do so. The same applies to other healing professions, as well as to financial and legal advice.

bycranix
9 months ago
Reply to  Benedikt581

This will change in the future. ChatGPT is already good at it. And the same mistakes an AI makes, a human is even more likely to make than the AI itself.

Benedikt581
9 months ago
Reply to  bycranix

Under German law, this is not legal. When asked for medical advice, ChatGPT usually refers the user to a doctor. Even more importantly: liability is excluded!

On the future: it is very unlikely that this will change.

Regarding errors: what matters is accountability. A person is responsible for their own work. Whether the mistakes come from an AI is irrelevant. It doesn't even matter whether the AI could do better, because only licensed doctors may give medical advice.

MAckermann
9 months ago

There are three robot rules:

  1. Always obey humans, unless this contradicts Rule 2 or 3.
  2. Do not harm yourself.
  3. Do not harm people.