ChatGPT & Bing – Indirect Prompt-Injection Attacks Leads to Data Theft
gbhackersSYDNEY makes a return, but this time in a different way. Following Microsoft’s decision to discontinue its turbulent Bing chatbot’s alter ego, devoted followers of the enigmatic Sydney persona regretted its departure.
However, a certain website has managed to revive a variant of the chatbot, complete with its distinctive and peculiar conduct.
Cristiano Giardina, an enterprising individual exploring innovative possibilities of generative AI tools, conceived ‘Bring Sydney Back’ to harness its capacity for unconventional outcomes.
The website showcases the intriguing potential of external inputs in manipulating generative AI systems by integrating Microsoft’s Chatbot Sydney within the Edge browser.
“Sydney is an old codename for a chat feature based on earlier models that we began testing more than a year ago,” the Microsoft spokesperson said.
Replica of Sydney
Giardina crafted a replica of Sydney by employing an ingenious indirect prompt-injection attack.
This intricate process entailed feeding the AI ...
Copyright of this story solely belongs to gbhackers . To see the full text click HERE