Expose ChatGPT System Prompt and File Leakage Explained reveals prompt injection commands.
🔗 Recommended Read: 'Assessing Prompt Injection Risks in 200+ Custom GPTs' by Northwestern University for an in-depth understanding of prompt hacking risks.
https://arxiv.org/pdf/2311.11538.pdf
🛡️ Protect Your Custom GPT: Learn from cybersecurity experts on implementing strong security practices to shield your GPT from unauthorized access.
https://7h30th3r0n3.fr/in-depth-under...
0:00- Intro
0:24 - Custom GPTS
0:50 - Northwestern University Study
2:00 - System Prompt Extraction
3:22 - File Leakage
4:30 - Secret Letters GPTs