A collection of thoughts, tutorials, and write-ups.
Explore the security risks of Prompt Injection, a vulnerability in which attacker-crafted input manipulates an LLM's behavior, leading to unintended and potentially harmful responses. This part covers only the reconnaissance phase.
Writeup for FINCII2024 STEGANOGRAPHY DEEP SEA
Writeup for FINCII2024 OSINT MALVERTISING
Writeup for FINCII2024 BOOT2ROOT
Writeup for Hack The Box LoveTok