GitHub Changes Copilot's Data Policy: Developers Raise Concerns

GitHub announced changes to how Copilot uses customer data, including code snippets, accepted suggestions, and feedback. This opt-out policy has sparked controversy among developers who want more control over their data.

💡

Why it matters

This news is significant as it highlights the growing tension between AI companies' data needs and developers' privacy concerns. The Copilot policy change could set a precedent for how other AI tools handle user data.

Key Points

  • GitHub will use Copilot user data, including code snippets and feedback, to train its models starting April 2026
  • The data usage is opt-out by default, which has angered many developers who want an opt-in policy
  • Developers raised GDPR concerns about the legality of the opt-out policy in the EU
  • Alternatives like Claude Code, Cursor, and local LLM tools offer different data policies for developers

Details

GitHub's new Copilot data policy allows the company to use interaction data from Copilot Free, Pro, and Pro+ users to train its language models. This includes code snippets, accepted or modified suggestions, code context, comments, file names, and user feedback. The controversial part is that this data usage is opt-out by default: users' data will be used unless they explicitly disable it in their settings. Many developers are frustrated by this, arguing that a paid service should require opting in rather than opting out. There are also concerns about the policy's legality under GDPR in the EU. As a result, some developers are considering switching to alternative AI coding assistants such as Claude Code, Cursor, or local LLM tools that offer different data policies.

AI Curator - Daily AI News Curation
