Simple Hacking Technique Can Extract ChatGPT Training Data

Apparently all it takes to get a chatbot to start spilling its secrets is prompting it to repeat certain words like "poem" forever.

Mar 17, 2025 - 09:43
 0
Simple Hacking Technique Can Extract ChatGPT Training Data
Apparently all it takes to get a chatbot to start spilling its secrets is prompting it to repeat certain words like "poem" forever.