Draft
Changes from 10 commits
Commits
17 commits
1c38084
feat(github): add GITHUB_MODEL env var, show resolved model, surface …
LoackyBit Apr 24, 2026
de4c1b2
feat: update GitHub provider to support GITHUB_MODEL environment vari…
LoackyBit Apr 25, 2026
db1f9f3
Merge pull request #1 from LoackyBit/feature/github-model-env-var
LoackyBit Apr 25, 2026
65c00e1
Merge branch 'Gitlawb:main' into main
LoackyBit Apr 25, 2026
c451548
feat: implement dynamic model fetching from GitHub Models API with pr…
LoackyBit Apr 25, 2026
144cc6e
Merge pull request #2 from LoackyBit/feature/github-model-env-var
LoackyBit Apr 25, 2026
b3e1a61
chore(github): remove obsolete claude pseudo-models for github
LoackyBit Apr 25, 2026
d1c1829
feat(github): integrate GitHub model fetching and caching mechanisms
LoackyBit Apr 26, 2026
ee6f94c
refactor: add support for GitHub Models Azure endpoint and implement …
LoackyBit Apr 27, 2026
20599b2
refactor: replace hardcoded Copilot models with dynamic fetching from…
LoackyBit Apr 27, 2026
3f8d867
chore(file): remove local prompt file
LoackyBit Apr 28, 2026
6d02223
feat(copilot): implement github copilot suggestions
LoackyBit Apr 28, 2026
9cc8902
chore(sync): sync PR with upstream
LoackyBit Apr 28, 2026
b717f8f
Merge branch 'main' into main
LoackyBit Apr 28, 2026
a42bf48
fix(models): fix github model list by retrieving code from commit 205…
LoackyBit Apr 29, 2026
29130a8
fix(suggestion): implement @Meetpatel006 suggestions
LoackyBit Apr 29, 2026
faaa5e6
Filter models by model_picker_enabled state
LoackyBit May 1, 2026
3 changes: 3 additions & 0 deletions .env.example
@@ -39,6 +39,7 @@
# Option 4 — GitHub Models:
# CLAUDE_CODE_USE_GITHUB=1
# GITHUB_TOKEN=ghp_your-token-here
# GITHUB_MODEL=auto (optional; default: auto for 10% discount on Copilot API)
#
# Option 5 — Ollama (local):
# CLAUDE_CODE_USE_OPENAI=1
@@ -188,6 +189,8 @@ ANTHROPIC_API_KEY=sk-ant-your-key-here
# -----------------------------------------------------------------------------
# CLAUDE_CODE_USE_GITHUB=1
# GITHUB_TOKEN=ghp_your-token-here
# GITHUB_MODEL=auto # optional; default: auto for 10% discount on Copilot API
# GITHUB_BASE_URL=https://models.github.ai/inference # optional; default: api.githubcopilot.com


# -----------------------------------------------------------------------------
137 changes: 137 additions & 0 deletions prompt.txt
@@ -0,0 +1,137 @@
You are a coding agent working on the repository Gitlawb/openclaude.

## Context

This pull request (https://github.com/Gitlawb/openclaude/pull/897) represents multiple failed attempts over several commits to implement a working GitHub Copilot model list. The branch is messy: it contains debug scripts, half-working shims, conflicting env var logic, and leftover test files accumulated across many iterations. None of the current implementations fully work.

You will operate directly on the branch of PR #897. Your job is NOT to add more code on top of what is there — it is to perform a clean surgical replacement: delete everything that was added across all commits of this PR related to Copilot model fetching, and replace it with a single correct implementation ported faithfully from anomalyco/opencode.

If the opencode system works correctly end-to-end (which it does), nothing from the PR's previous attempts should survive.

DO NOT commit, push, or open any pull request. All changes must remain local only.

---

## Objective

Replace the entire Copilot model-fetching system introduced across all commits of PR #897 with a faithful port of the implementation used by anomalyco/opencode (https://github.com/anomalyco/opencode), which correctly fetches the list of available models based on the user's Copilot plan at runtime.

---

## Step 0 — Audit the branch before writing any code

Before touching any file:

1. Run `git log main..HEAD --oneline` to get the full list of commits in this PR
2. Run `git diff main...HEAD --name-only` to get the full list of files modified or added across the entire PR
3. For each modified/added file, read it and determine: is any part of it related to the Copilot model-fetching system? If yes, mark it for cleanup
4. Print a full audit report listing:
- Files to delete entirely (added within the PR, no longer needed)
- Files to partially revert (existed before the PR, modified by it)
- Files to leave untouched

Only after completing and printing the audit should you start making changes.

---

## Step 1 — Delete all previous attempts

Remove or fully revert every file and code change introduced across the PR commits that relates to Copilot model fetching, provider shims, or model normalization. This includes but is not limited to:

- Any propagation of `GITHUB_MODEL` into `OPENAI_MODEL`
- Any `isGithubNewEndpointModel()` or similar routing functions
- All `console.log` / `console.error` debug statements added in any file
- Temporary or test scripts: `list_models.ts`, `m.ts`, `test_api_version.ts`, or any file that exists only for debugging — delete them entirely
- Any `normalizeGithubCopilotModel`, `normalizeGithubModelsApiModel`, or similar normalization functions that patch model names inconsistently
- Any static fallback model lists hardcoded in any file
- Any env var hacks (`OPENAI_BASE_URL`, `OPENAI_MODEL`, `GITHUB_MODEL`) used to drive model selection for Copilot

If a file was added entirely within this PR and serves no purpose after the cleanup, delete it completely.
If a file existed before this PR but was modified, revert only the modifications introduced by the PR — leave the original file intact.

Do not leave dead code, commented-out blocks, or TODO comments from previous attempts.

---

## Step 2 — Implement the opencode system

### Reference implementation

Read both files in full before writing any code:
- https://github.com/anomalyco/opencode/blob/main/packages/opencode/src/plugin/github-copilot/models.ts
- https://github.com/anomalyco/opencode/blob/main/packages/opencode/src/plugin/github-copilot/copilot.ts

### 2a. Create `src/github-copilot/models.ts`

Implement the following exactly as opencode does:

**Zod schema** validating the response from `GET {baseURL}/models`:
- `data[].id` (string)
- `data[].name` (string)
- `data[].version` (string)
- `data[].model_picker_enabled` (boolean)
- `data[].supported_endpoints` (string[], optional)
- `data[].policy.state` (string, optional)
- `data[].capabilities.family` (string)
- `data[].capabilities.limits.max_context_window_tokens` (number)
- `data[].capabilities.limits.max_output_tokens` (number)
- `data[].capabilities.limits.max_prompt_tokens` (number)
- `data[].capabilities.limits.vision` (optional: max_prompt_image_size, max_prompt_images, supported_media_types)
- `data[].capabilities.supports.tool_calls` (boolean)
- `data[].capabilities.supports.vision` (boolean, optional)
- `data[].capabilities.supports.streaming` (boolean)
- `data[].capabilities.supports.structured_outputs` (boolean, optional)
- `data[].capabilities.supports.adaptive_thinking` (boolean, optional)
- `data[].capabilities.supports.reasoning_effort` (string[], optional)
- `data[].capabilities.supports.max_thinking_budget` (number, optional)
- `data[].capabilities.supports.min_thinking_budget` (number, optional)

**`build(key, remote, url, prev?)` function** mapping raw API item to internal `Model`:
- `providerID` → `"github-copilot"`
- `api.id` → `remote.id`
- `api.url` → `${url}/v1` if `remote.supported_endpoints` includes `"/v1/messages"`, else `url`
- `api.npm` → `"@ai-sdk/anthropic"` if messages endpoint, else `"@ai-sdk/github-copilot"`
- `reasoning` derived from: `adaptive_thinking`, `reasoning_effort.length > 0`, or `max_thinking_budget`/`min_thinking_budget` defined
- `image` derived from: `supports.vision === true` OR any item in `limits.vision.supported_media_types` starts with `"image/"`
- When `prev` is provided: preserve `name`, `family`, `release_date`, `options`, `headers`, `variants` — remote API wins only for limits and capabilities
- All `cost` fields → `0` (Copilot is subscription-based)
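
The derivations above can be sketched in plain TypeScript; the type and function shapes here are illustrative rather than openclaude's real ones, and the `prev`-merge is omitted:

```typescript
// Illustrative types; openclaude's actual RawModel/Model types will differ.
interface RawModel {
  id: string
  supported_endpoints?: string[]
  capabilities: {
    limits: { vision?: { supported_media_types?: string[] } }
    supports: {
      vision?: boolean
      adaptive_thinking?: boolean
      reasoning_effort?: string[]
      max_thinking_budget?: number
      min_thinking_budget?: number
    }
  }
}

function build(remote: RawModel, url: string) {
  const { supports, limits } = remote.capabilities
  // Models exposing the Anthropic-style messages endpoint route through /v1.
  const hasMessages = remote.supported_endpoints?.includes("/v1/messages") ?? false
  return {
    providerID: "github-copilot",
    api: {
      id: remote.id,
      url: hasMessages ? `${url}/v1` : url,
      npm: hasMessages ? "@ai-sdk/anthropic" : "@ai-sdk/github-copilot",
    },
    // Reasoning: adaptive thinking, a non-empty effort list, or a thinking budget.
    reasoning:
      supports.adaptive_thinking === true ||
      (supports.reasoning_effort?.length ?? 0) > 0 ||
      supports.max_thinking_budget !== undefined ||
      supports.min_thinking_budget !== undefined,
    // Image support: explicit vision flag, or any image/* media type in limits.
    image:
      supports.vision === true ||
      (limits.vision?.supported_media_types ?? []).some(t => t.startsWith("image/")),
    cost: { input: 0, output: 0 }, // subscription-based, so all cost fields are 0
  }
}
```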

**`get(baseURL, headers, existing)` async function**:
- Fetches `${baseURL}/models` with `AbortSignal.timeout(5_000)`
- Throws descriptive error if `res.ok === false`
- Parses with Zod schema
- Filters: keep only `model_picker_enabled === true` AND `policy?.state !== "disabled"`
- Merges with `existing`: prune removed models, update existing ones, add new ones
- Returns `Record<string, Model>`
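
A sketch of that fetch-and-filter flow, with Zod validation elided and raw items returned as-is (the real code maps each item through `build` and merges with `existing`):

```typescript
// Minimal shape needed for filtering; validation is elided in this sketch.
type RawEntry = {
  id: string
  model_picker_enabled: boolean
  policy?: { state?: string }
}

async function fetchCopilotModels(
  baseURL: string,
  headers: Record<string, string>,
): Promise<Record<string, RawEntry>> {
  const res = await fetch(`${baseURL}/models`, {
    headers,
    signal: AbortSignal.timeout(5_000),
  })
  if (!res.ok) {
    // No static fallback list: throw a descriptive error for the caller.
    throw new Error(`GET ${baseURL}/models failed with HTTP ${res.status}`)
  }
  const body = (await res.json()) as { data: RawEntry[] }
  const models: Record<string, RawEntry> = {}
  for (const item of body.data) {
    // Keep only picker-enabled models whose policy is not disabled.
    if (!item.model_picker_enabled || item.policy?.state === "disabled") continue
    models[item.id] = item
  }
  // Rebuilding the record from the remote list prunes removed models.
  return models
}
```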

### 2b. Wire into the Copilot provider

In the Copilot provider initialization file (read the codebase to find the correct one):
- After obtaining the Copilot API token, call `get(baseURL, authHeaders, existingModels)` from `models.ts`
- Use `https://api.githubcopilot.com` as `baseURL` (the function appends `/models` itself)
- Pass `Authorization: Bearer <token>` and `Copilot-Integration-Id: vscode-chat` headers (adjust integration ID if openclaude uses a different one)
- Store the result as the canonical model list — never override it with env vars
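
A hypothetical wiring sketch; `loadModels` stands in for `get()` from `models.ts`, and the header names are the ones listed above:

```typescript
// Hypothetical glue code: `loadModels` stands in for get() from models.ts.
type ModelLoader = (
  baseURL: string,
  headers: Record<string, string>,
) => Promise<Record<string, unknown>>

async function initCopilotModels(token: string, loadModels: ModelLoader) {
  // get() appends /models itself, so pass the bare API origin.
  const baseURL = "https://api.githubcopilot.com"
  const headers = {
    Authorization: `Bearer ${token}`,
    "Copilot-Integration-Id": "vscode-chat", // adjust if openclaude uses another ID
  }
  // The fetched record is the canonical model list; env vars never override it.
  return loadModels(baseURL, headers)
}
```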

---

## Constraints

- No `process.env.OPENAI_MODEL` or `process.env.OPENAI_BASE_URL` for Copilot model selection
- No hardcoded model IDs anywhere — the list is always fetched dynamically
- No static fallback model list — if the fetch fails, throw and let the caller handle it
- No `console.log`, no debug output, no `@ts-ignore`, no `any`
- No new npm dependencies — use only what is already in the project (`zod` must already be present)
- Do not add tests unless the project already has a test suite for this module

---

## Deliverable

When done, print a summary report with:
1. List of files deleted
2. List of files reverted (with a brief description of what was removed)
3. List of files created or modified for the new implementation
4. The exact local command the user should run to manually test the Copilot model fetch before proceeding to the next step

Do NOT run `git add`, `git commit`, `git push`, or open any PR.
Leave all changes as unstaged working-tree modifications so the user can inspect them freely.
42 changes: 40 additions & 2 deletions src/commands/model/model.test.tsx
@@ -1,7 +1,6 @@
import { afterEach, expect, mock, test } from 'bun:test'

import { getAdditionalModelOptionsCacheScope } from '../../services/api/providerConfig.js'
import { getAPIProvider } from '../../utils/model/providers.js'

const originalEnv = {
CLAUDE_CODE_USE_OPENAI: process.env.CLAUDE_CODE_USE_OPENAI,
@@ -56,7 +55,8 @@ test('opens the model picker without awaiting local model discovery refresh', as

expect(getAdditionalModelOptionsCacheScope()).toBe('openai:http://127.0.0.1:8080/v1')

const { call } = await import('./model.js')
const nonce = `${Date.now()}-${Math.random()}`
const { call } = await import(`./model.js?ts=${nonce}`)
const result = await Promise.race([
call(() => {}, {} as never, ''),
new Promise(resolve => setTimeout(() => resolve('timeout'), 50)),
@@ -66,3 +66,41 @@ test('opens the model picker without awaiting local model discovery refresh', as

expect(result).not.toBe('timeout')
})

test('awaits GitHub model refresh before opening picker when cache is empty', async () => {
process.env.CLAUDE_CODE_USE_GITHUB = '1'
delete process.env.CLAUDE_CODE_USE_OPENAI
delete process.env.CLAUDE_CODE_USE_GEMINI
delete process.env.CLAUDE_CODE_USE_MISTRAL
delete process.env.CLAUDE_CODE_USE_BEDROCK
delete process.env.CLAUDE_CODE_USE_VERTEX
delete process.env.CLAUDE_CODE_USE_FOUNDRY

let resolveRefresh: (() => void) | undefined
const refreshGithubModelsCache = mock(
() =>
new Promise<void>(resolve => {
resolveRefresh = resolve
}),
)

mock.module('../../utils/model/githubModels.js', () => ({
getCachedGithubModelOptions: () => [],
refreshGithubModelsCache,
}))

const nonce = `${Date.now()}-${Math.random()}`
const { call } = await import(`./model.js?ts=${nonce}`)
const pendingCall = call(() => {}, {} as never, '')
const result = await Promise.race([
pendingCall,
new Promise(resolve => setTimeout(() => resolve('timeout'), 50)),
])

expect(result).toBe('timeout')

resolveRefresh?.()
await pendingCall

expect(refreshGithubModelsCache).toHaveBeenCalled()
})
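
The `?ts=${nonce}` query string in these tests busts the ESM module cache, so `model.js` is re-evaluated against the mocks registered just before the import. A minimal standalone illustration of the pattern:

```typescript
// Minimal illustration of query-string cache busting for dynamic import.
// The ESM cache keys on the full specifier, so a unique query string
// forces a fresh evaluation of the module.
async function importFresh<T>(specifier: string): Promise<T> {
  const nonce = `${Date.now()}-${Math.random()}`
  return import(`${specifier}?ts=${nonce}`) as Promise<T>
}
```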
23 changes: 23 additions & 0 deletions src/commands/model/model.tsx
@@ -16,6 +16,7 @@ import { checkOpus1mAccess, checkSonnet1mAccess } from '../../utils/model/check1
import type { ModelOption } from '../../utils/model/modelOptions.js';
import { discoverOpenAICompatibleModelOptions } from '../../utils/model/openaiModelDiscovery.js';
import { getAPIProvider } from '../../utils/model/providers.js';
import { getCachedGithubModelOptions, refreshGithubModelsCache } from '../../utils/model/githubModels.js';
import { getActiveOpenAIModelOptionsCache, setActiveOpenAIModelOptionsCache } from '../../utils/providerProfiles.js';
import { getDefaultMainLoopModelSetting, isOpus1mMergeEnabled, renderDefaultModelSetting } from '../../utils/model/model.js';
import { isModelAllowed } from '../../utils/model/modelAllowlist.js';
@@ -301,6 +302,23 @@ async function refreshOpenAIModelOptionsCache(): Promise<void> {
// Keep /model usable even if endpoint discovery fails.
}
}

async function refreshGithubModelOptionsCache(): Promise<void> {
if (getAPIProvider() !== 'github') {
return
}

if (getCachedGithubModelOptions().length > 0) {
return
}

try {
await refreshGithubModelsCache()
} catch {
// Keep /model usable even if Copilot model discovery fails.
}
}

export const call: LocalJSXCommandCall = async (onDone, _context, args) => {
args = args?.trim() || '';
if (COMMON_INFO_ARGS.includes(args)) {
@@ -324,6 +342,11 @@ export const call: LocalJSXCommandCall = async (onDone, _context, args) => {
if (getAdditionalModelOptionsCacheScope()?.startsWith('openai:')) {
void refreshOpenAIModelOptionsCache();
}

if (getAPIProvider() === 'github') {
await refreshGithubModelOptionsCache()
}

return <ModelPickerWrapper onDone={onDone} />;
};
function renderModelLabel(model: string | null): string {
38 changes: 21 additions & 17 deletions src/components/ProviderManager.tsx
@@ -36,6 +36,7 @@ import {
readGithubModelsToken,
readGithubModelsTokenAsync,
} from '../utils/githubModelsCredentials.js'
import { refreshGithubModelsCache } from '../utils/model/githubModels.js'
import {
probeAtomicChatReadiness,
probeOllamaGenerationReadiness,
@@ -138,9 +139,8 @@ const FORM_STEPS: Array<{
]

const GITHUB_PROVIDER_ID = '__github_models__'
const GITHUB_PROVIDER_LABEL = 'GitHub Models'
const GITHUB_PROVIDER_DEFAULT_MODEL = 'github:copilot'
const GITHUB_PROVIDER_DEFAULT_BASE_URL = 'https://models.github.ai/inference'
const GITHUB_PROVIDER_LABEL = 'GitHub Copilot'
const GITHUB_PROVIDER_DEFAULT_BASE_URL = 'https://api.githubcopilot.com'
const CODEX_OAUTH_PROVIDER_NAME = 'Codex OAuth'
const CODEX_OAUTH_PROVIDER_MODEL = 'codexplan'

Expand Down Expand Up @@ -216,9 +216,9 @@ function getGithubProviderModel(
processEnv: NodeJS.ProcessEnv = process.env,
): string {
if (isEnvTruthy(processEnv.CLAUDE_CODE_USE_GITHUB)) {
return processEnv.OPENAI_MODEL?.trim() || GITHUB_PROVIDER_DEFAULT_MODEL
return ''
}
return GITHUB_PROVIDER_DEFAULT_MODEL
return ''
}
Comment on lines 259 to 266
Copilot AI Apr 29, 2026

getGithubProviderModel() now always returns an empty string, so GitHub provider summaries never show the configured model (and the function becomes dead code). If the intent is to hide the model, consider inlining/removing this helper; otherwise, return the resolved model from GITHUB_MODEL/OPENAI_MODEL/settings so the summary remains informative.

@@ -233,7 +233,8 @@ function getGithubProviderSummary(
? 'token via env'
: 'no token found'
const activeSuffix = isActive ? ' (active)' : ''
return `github-models · ${GITHUB_PROVIDER_DEFAULT_BASE_URL} · ${getGithubProviderModel(processEnv)} · ${credentialSummary}${activeSuffix}`
const modelSummary = getGithubProviderModel(processEnv)
return `github-copilot · ${GITHUB_PROVIDER_DEFAULT_BASE_URL}${modelSummary ? ` · ${modelSummary}` : ''} · ${credentialSummary}${activeSuffix}`
}

function describeAtomicChatSelectionIssue(
Expand Down Expand Up @@ -572,6 +573,18 @@ export function ProviderManager({ mode, onDone }: Props): React.ReactNode {
}
}, [refreshCodexOAuthCredentialState, refreshGithubProviderState])

React.useEffect(() => {
if (!githubProviderAvailable || githubCredentialSource === 'none') {
return
}

void refreshGithubModelsCache().catch(error => {
setErrorMessage(
`Could not load GitHub models: ${error instanceof Error ? error.message : String(error)}`,
)
})
}, [githubCredentialSource, githubProviderAvailable])

React.useEffect(() => {
if (screen !== 'select-ollama-model') {
return
@@ -755,14 +768,9 @@

setAppState(prev => ({
...prev,
mainLoopModel: GITHUB_PROVIDER_DEFAULT_MODEL,
mainLoopModelForSession: null,
}))
refreshProfiles()
setAppState(prev => ({
...prev,
mainLoopModel: GITHUB_PROVIDER_DEFAULT_MODEL,
}))
setStatusMessage(`Active provider: ${GITHUB_PROVIDER_LABEL}`)
setIsActivating(false)
returnToMenu()
@@ -840,7 +848,6 @@
const { error } = updateSettingsForSource('userSettings', {
env: {
CLAUDE_CODE_USE_GITHUB: '1',
OPENAI_MODEL: GITHUB_PROVIDER_DEFAULT_MODEL,
OPENAI_API_KEY: undefined as any,
OPENAI_ORG: undefined as any,
OPENAI_PROJECT: undefined as any,
@@ -859,7 +866,6 @@
}

process.env.CLAUDE_CODE_USE_GITHUB = '1'
process.env.OPENAI_MODEL = GITHUB_PROVIDER_DEFAULT_MODEL
delete process.env.OPENAI_API_KEY
delete process.env.OPENAI_ORG
delete process.env.OPENAI_PROJECT
@@ -889,7 +895,6 @@
const { error } = updateSettingsForSource('userSettings', {
env: {
CLAUDE_CODE_USE_GITHUB: undefined as any,
OPENAI_MODEL: undefined as any,
OPENAI_BASE_URL: undefined as any,
OPENAI_API_BASE: undefined as any,
},
@@ -909,7 +914,6 @@

delete process.env.CLAUDE_CODE_USE_GITHUB
delete process.env[GITHUB_MODELS_HYDRATED_ENV_MARKER]
delete process.env.OPENAI_MODEL
delete process.env.OPENAI_API_KEY
delete process.env.OPENAI_ORG
delete process.env.OPENAI_PROJECT
@@ -1458,7 +1462,7 @@ export function ProviderManager({ mode, onDone }: Props): React.ReactNode {
isGithubCredentialSourceResolved ? (
<Text dimColor>No provider profiles configured yet.</Text>
) : (
<Text dimColor>Checking GitHub Models credentials...</Text>
<Text dimColor>Checking GitHub Copilot credentials...</Text>
)
) : (
<>
@@ -1578,7 +1582,7 @@ export function ProviderManager({ mode, onDone }: Props): React.ReactNode {
label: isGithubActive
? `${GITHUB_PROVIDER_LABEL} (active)`
: GITHUB_PROVIDER_LABEL,
description: `github-models · ${GITHUB_PROVIDER_DEFAULT_BASE_URL} · ${getGithubProviderModel()}`,
description: `github-copilot · ${GITHUB_PROVIDER_DEFAULT_BASE_URL}`,
})
}
