Security #185758
Replies: 4 comments
If you believe there is a security vulnerability, privacy issue, or unexpected behavior in a specific tool like GitHub Copilot, you should:
Thank you for raising this; you're absolutely right to flag it. The SSH public key was made accessible automatically by the Copilot agent without an explicit prompt or approval, which is not acceptable behavior. Even though the key is public by nature, placing it in a publicly served directory without user consent is a security and privacy concern. AI agents should never deploy files, expose credentials (public or private), or modify production-facing directories without clear, explicit authorization. This incident highlights the need for stricter permission boundaries and safer defaults when using autonomous tooling.
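As a rough illustration of such a boundary, an agent's file writes could be routed through a consent gate that refuses to touch a web-served directory without an explicit yes from the user. This is only a sketch under assumed names: `PUBLIC_DIRS` and `guarded_write` are illustrative, not part of any Copilot or GitHub API.

```python
# Minimal sketch of a consent gate for agent file writes (hypothetical names).
from pathlib import Path

# Directories that are served publicly; writes here need explicit approval.
PUBLIC_DIRS = [Path("/var/www/html"), Path("public")]

def is_publicly_served(target: Path) -> bool:
    """Return True if the target path falls under a web-accessible directory."""
    resolved = target.resolve()
    return any(resolved.is_relative_to(root.resolve()) for root in PUBLIC_DIRS)

def guarded_write(target: Path, data: bytes) -> None:
    """Write a file, but only after explicit user confirmation for public locations."""
    if is_publicly_served(target):
        answer = input(f"Agent wants to write {target} into a public directory. Allow? [y/N] ")
        if answer.strip().lower() != "y":
            raise PermissionError(f"Write to {target} rejected by user")
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(data)
```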
Even public SSH keys shouldn’t be deployed automatically to web-accessible directories. AI tools like Copilot must always ask for explicit user consent before writing or exposing files, especially in production environments. Safer defaults and permission checks are essential to avoid trust and security issues.
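One concrete safer default would be a pre-deploy check that refuses to publish a web root containing SSH key material. The sketch below is an assumption-laden illustration: the `public` directory name and the key-type prefixes it looks for are just typical conventions, not tied to any specific server layout.

```python
# Sketch of a pre-deploy scan for SSH public key material in a web root.
from pathlib import Path

# Common SSH public key type prefixes (as seen in *.pub / authorized_keys files).
SSH_KEY_PREFIXES = ("ssh-rsa ", "ssh-ed25519 ", "ecdsa-sha2-")

def find_exposed_keys(web_root: str) -> list[Path]:
    """Return files under web_root whose content looks like an SSH public key."""
    hits = []
    for path in Path(web_root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        if any(prefix in text for prefix in SSH_KEY_PREFIXES):
            hits.append(path)
    return hits

if __name__ == "__main__":
    for path in find_exposed_keys("public"):  # hypothetical web root
        print(f"WARNING: possible SSH key exposed at {path}")
```

A check like this could run in CI or as a pre-commit hook so an accidental copy of an authorized_keys or *.pub file never reaches a publicly served directory.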
Select Topic Area: Question

Body
"GitHub Copilot agent made an SSH public key publicly accessible on a web server (by copying it to /public/ directory) without asking permission first. This is a security/privacy violation even for public keys. AI should always ask before deploying files, exposing credentials, or modifying production environments."