GitHub Copilot Will Train on Your Code by Default Starting April 24

GitHub has announced a significant change to its Copilot data usage policy: starting April 24, 2026, interaction data from Copilot Free, Pro, and Pro+ users will be used to train and improve AI models by default. Developers who want to keep their data private must manually opt out before the deadline.
What Is Changing
Under the updated policy, GitHub will collect and use a wide range of interaction data for AI model training, including:
- Accepted or modified code outputs
- Code inputs and snippets sent to Copilot
- Code context around cursor position
- Comments and documentation
- File names and repository structure
- Feature interaction patterns
- User feedback such as thumbs-up and thumbs-down ratings
GitHub's Chief Product Officer Mario Rodriguez stated that "real-world data equals smarter models," noting that interaction data from Microsoft had already shown "meaningful improvements, including increased acceptance rates in multiple languages."
Who Is Affected
The policy change applies exclusively to individual plans:
- Copilot Free — affected
- Copilot Pro — affected
- Copilot Pro+ — affected
- Copilot Business — not affected
- Copilot Enterprise — not affected
This means enterprise customers retain their existing data protections, while individual developers and freelancers bear the burden of the new default.
How to Opt Out
Developers can disable data collection by navigating to their GitHub account settings:
- Go to Settings
- Select Copilot
- Scroll to the Privacy section
- Disable "Allow GitHub to use my data for AI model training"
Users who previously opted out of data sharing will have their preferences preserved automatically.
Developer Backlash
The announcement has sparked significant backlash across the developer community. On X (formerly Twitter), developers expressed frustration over the opt-out rather than opt-in approach.
"Opt-out isn't consent, it's good luck," wrote one commenter, highlighting concerns about the default-on nature of the policy. Others pointed out the asymmetry between enterprise and individual users: "Enterprise users excluded. Individual devs get to be the training data."
Privacy advocates have also raised concerns about the breadth of data being collected, noting that file names, repository structure, and navigation patterns go well beyond simple code completion data.
The Broader Context
This move comes as AI coding assistants compete aggressively for market share. Cursor, Windsurf, Cline, and Tabnine each have their own data policies, and comparisons are already circulating among developers evaluating which tool best respects their privacy.
The decision also echoes broader industry tensions around AI training data. As models require increasingly large and diverse datasets to improve, the line between product improvement and user exploitation continues to blur.
What Developers Should Do
If you use GitHub Copilot on an individual plan, review your privacy settings before April 24. The opt-out process takes under a minute but requires active action: doing nothing means your interaction data becomes part of future model training.
For teams and organizations, Copilot Business and Enterprise plans remain unaffected, but this policy shift may influence procurement decisions for companies evaluating AI coding tools.
Source: GitHub Blog