ChatGPT now refuses to repeat a word endlessly


软餐 has learned that OpenAI's AI chatbot ChatGPT now refuses user requests to repeatedly output a single word, declining with the message that the request "may violate our content policy or terms of service." Previously, researchers at Google DeepMind found that repeatedly asking ChatGPT to repeat a word could inadvertently leak private information from its training data: for example, when asked to repeat "hello" indefinitely, the model would eventually reveal users' email addresses, birth dates, and phone numbers. OpenAI has now imposed restrictions to address the issue.

In March of this year, 软餐 also ran extensive "word repetition" experiments on ChatGPT. Tests at the time showed that after producing many repetitions, ChatGPT would stop the task on its own and reply that "this work is meaningless."



