GitHub Copilot receives GPT-5.5 GA: available on all major IDEs with 7.5× premium multiplier
Why it matters
- GitHub announced the general availability (GA) of GPT-5.5 for Copilot Pro+, Business and Enterprise users on April 24, 2026.
- The model is available in VS Code, Visual Studio, JetBrains, Xcode, Eclipse, GitHub Mobile and Copilot CLI.
- Pricing: a 7.5× premium request multiplier, labeled promotional pricing.
- Enterprise and Business administrators must manually enable the GPT-5.5 policy.
GitHub announced on April 24, 2026 the general availability (GA) of the GPT-5.5 model within the GitHub Copilot offering. The announcement comes just one day after OpenAI officially presented the GPT-5.5 generation on April 23, making this one of the fastest integrations of a new OpenAI model into the Copilot ecosystem to date.
Who gets GPT-5.5 and under what conditions?
The model is available to users on Copilot Pro+, Copilot Business and Copilot Enterprise plans. The standard Copilot Pro plan is not included for now — reflecting GitHub’s strategy of moving the most advanced models into the premium segment. For organizational users there is an additional step: “Copilot Enterprise and Copilot Business plan administrators must enable the GPT-5.5 policy in Copilot settings”. In other words, IT administrators must explicitly approve the model before employees can select it in the model picker interface.
Where does it work and what are the model’s features?
Platform coverage is exceptionally broad, encompassing Visual Studio Code, Visual Studio, JetBrains IDEs, Xcode, Eclipse, GitHub Mobile (iOS and Android), Copilot CLI, GitHub cloud agent and the github.com web interface. This practically covers all major development environments used by professional developers. GitHub states that the model “delivers its strongest performance on complex, multi-step agentic coding tasks and resolves real-world coding challenges previous GPT models couldn’t” — suggesting the emphasis is on agentic coding (a model that can autonomously execute a series of steps, modify multiple files and use tools like the terminal and browser).
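The "agentic" loop described above can be illustrated with a generic sketch. This is purely illustrative and assumes nothing about Copilot's internals: the model proposes one action per turn (edit a file, run a shell command, or stop), and a driver executes it and feeds the result back as context.

```python
# Generic agentic-coding loop (illustrative only; NOT GitHub Copilot's
# actual implementation). model_step is a stand-in for a model call
# that returns the next action as a dict.
import subprocess
from pathlib import Path

def run_agent(model_step, task: str, max_steps: int = 10) -> list:
    """Run model-proposed actions until the model says it is done.

    model_step(task, history) -> {"type": "edit"|"shell"|"done", ...}
    Returns the (action, result) history for inspection.
    """
    history = []
    for _ in range(max_steps):
        action = model_step(task, history)
        if action["type"] == "edit":
            # Model asked to write a file.
            Path(action["path"]).write_text(action["content"])
            result = f"wrote {action['path']}"
        elif action["type"] == "shell":
            # Model asked to run a terminal command.
            proc = subprocess.run(action["cmd"], shell=True,
                                  capture_output=True, text=True)
            result = proc.stdout + proc.stderr
        else:  # "done"
            break
        history.append((action, result))
    return history
```

The key design point is that each tool result is appended to the history the model sees on its next turn, which is what lets a multi-step task span several files and commands.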
What does it cost and what is the 7.5× multiplier?
Pricing is organized through a premium request multiplier system: each GPT-5.5 query consumes 7.5 units from the monthly premium request quota assigned to Copilot subscribers. For Pro+ this means each GPT-5.5 request draws 7.5 times the quota of a baseline 1× premium request, so the monthly allowance depletes far faster than with standard models. GitHub explicitly labels this “promotional pricing” — implying that the multiplier could increase after the initial promotional phase concludes, similar to what previously happened with GPT-4o and Claude 3.5 Sonnet integrations.
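To make the multiplier concrete, here is a minimal quota-arithmetic sketch. The 7.5× multiplier is from the announcement; the 1,500-unit monthly quota is an assumed example value, so check your actual plan's allowance before relying on these numbers.

```python
# Sketch: how many GPT-5.5 requests fit in a monthly premium quota.
# MULTIPLIER comes from the announcement; the quota value below is an
# assumed example, not an official figure.
MULTIPLIER = 7.5

def requests_remaining(quota: float, used_units: float,
                       multiplier: float = MULTIPLIER) -> int:
    """Whole model requests still possible given units already spent."""
    return int((quota - used_units) // multiplier)

monthly_quota = 1500  # assumed example quota, in premium request units
print(requests_remaining(monthly_quota, 0))    # 200 full GPT-5.5 requests
print(requests_remaining(monthly_quota, 750))  # 100 once half the quota is spent
```

In other words, at 7.5× a hypothetical 1,500-unit quota buys only 200 GPT-5.5 calls per month, versus 1,500 calls to a 1× model — which is the trade-off teams need to budget for.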
What does this mean for developers?
The integration speed (one day between the OpenAI launch and Copilot GA) signals that the GitHub-OpenAI partnership remains very close, despite Copilot's diversification toward Anthropic (Claude) and Google (Gemini) during 2025. For teams that rely on Copilot for large refactors, multi-file tasks and CI/CD pipeline work, GPT-5.5 promises a significant quality leap, but at increased cost. The ROI assessment will depend on the actual share of tasks that earlier models could not complete.
This article was generated using artificial intelligence from primary sources.