Tech »  Topic »  GitHub Copilot Chat Flaw Leaked Data From Private Repositories

GitHub Copilot Chat Flaw Leaked Data From Private Repositories


Legit Security has detailed a vulnerability in the GitHub Copilot Chat AI assistant that led to sensitive data leakage and full control over Copilot’s responses.

Combining a Content Security Policy (CSP) bypass with remote prompt injection, Legit Security’s Omer Mayraz was able to leak AWS keys and zero-day bugs from private repositories, and influence the responses Copilot provided to other users.

Copilot Chat is designed to provide code explanations and suggestions, and allows users to hide content from the rendered Markdown, using HTML comments.

A hidden comment would still trigger the usual pull request notification to the repository owner, but without displaying the content of the comment. However, the prompt is injected into other users’ context as well.

The hidden comments feature, Mayraz explains, allows a user to influence Copilot into displaying code suggestions to other users, including malicious packages.

Mayraz also discovered that he could craft prompts ...


Copyright of this story solely belongs to securityweek . To see the full text click HERE