fix(provider): allow remote Ollama without OPENAI_API_KEY (#369) #952

Open

0xfandom wants to merge 1 commit into Gitlawb:main from 0xfandom:fix/369-remote-ollama-no-api-key

Conversation

@0xfandom
Contributor

Summary

  • Remote Ollama base URLs (a host outside the loopback / RFC 1918 ranges, or a domain like ollama.corp.example.com) were rejected at startup with "OPENAI_API_KEY is required when CLAUDE_CODE_USE_OPENAI=1 and OPENAI_BASE_URL is not local".
  • Ollama doesn't need an API key, so users had to invent a phantom value just to clear validation.
  • Extended getProviderValidationError to bypass the API-key requirement when the base URL looks like Ollama (port 11434 on any host, or an "ollama" substring in the hostname or pathname). isLikelyOllamaEndpoint already encoded these heuristics for tool-call gating in providerConfig.ts; it is now exported and reused so the rules stay in one place (see the sketch after this list).
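
For reference, a minimal sketch of the heuristic and the extended gate as described above. isLikelyOllamaEndpoint, getProviderValidationError, and the error string come from this PR; isLocalProviderUrl and all other details are hypothetical stand-ins, not the actual implementation:

```ts
// providerConfig.ts: the existing heuristic, now exported (sketch only).
export function isLikelyOllamaEndpoint(baseUrl: string): boolean {
  try {
    const url = new URL(baseUrl);
    return (
      url.port === "11434" || // Ollama's default port, on any host
      url.hostname.toLowerCase().includes("ollama") ||
      url.pathname.toLowerCase().includes("ollama")
    );
  } catch {
    return false; // unparsable URL: no bypass, normal validation applies
  }
}

// Hypothetical local-URL check (simplified: omits 172.16.0.0/12 and IPv6
// loopback); the project presumably has its own version of this.
function isLocalProviderUrl(baseUrl: string): boolean {
  try {
    const host = new URL(baseUrl).hostname;
    return (
      host === "localhost" ||
      host.startsWith("127.") ||
      host.startsWith("10.") ||
      host.startsWith("192.168.")
    );
  } catch {
    return false;
  }
}

// providerValidation.ts: the key requirement now also steps aside for
// Ollama-shaped remote URLs.
export function getProviderValidationError(
  baseUrl: string,
  apiKey: string | undefined,
): string | undefined {
  if (!apiKey && !isLocalProviderUrl(baseUrl) && !isLikelyOllamaEndpoint(baseUrl)) {
    return "OPENAI_API_KEY is required when CLAUDE_CODE_USE_OPENAI=1 and OPENAI_BASE_URL is not local";
  }
  return undefined;
}
```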

Impact

  • user-facing impact: users can now point OPENAI_BASE_URL at a remote Ollama server (http://ollama.corp.example.com/v1, http://203.0.113.5:11434/v1, etc.) without setting OPENAI_API_KEY, matching the existing local-Ollama UX.
  • developer/maintainer impact: isLikelyOllamaEndpoint is now exported. No call-site changes elsewhere.

Testing

  • bun run build
  • bun run smoke
  • focused tests: bun test src/utils/providerValidation.test.ts (10 pass, 4 of them new; a sketch of the new cases follows this list)
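
A sketch of what the new cases could look like, assuming bun:test and the positional signature used in the sketch above (the real test file and signature may differ):

```ts
import { describe, expect, it } from "bun:test";
// Import path and signature are assumptions based on the PR description.
import { getProviderValidationError } from "./providerValidation";

describe("remote Ollama base URLs", () => {
  it("clears validation without OPENAI_API_KEY for Ollama-shaped hosts", () => {
    for (const url of [
      "http://my-ollama-server.example.com:11434/v1",
      "http://203.0.113.5:11434/v1",
      "http://ollama.corp.example.com/v1",
    ]) {
      expect(getProviderValidationError(url, undefined)).toBeFalsy();
    }
  });

  it("still requires the key for non-Ollama remotes", () => {
    expect(
      getProviderValidationError("https://api.openai.com/v1", undefined),
    ).toBeTruthy();
  });
});
```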

Notes

  • provider/model path tested: CLAUDE_CODE_USE_OPENAI=1 with non-local Ollama-shaped base URLs (my-ollama-server.example.com:11434, 203.0.113.5:11434, ollama.corp.example.com) — all clear validation; non-Ollama remote (api.openai.com) still requires the key.
  • screenshots attached: n/a (validation message change only)
  • follow-up work / known limitations: the bypass shares heuristic semantics with isLikelyOllamaEndpoint as already used for tool-call gating; if the matching surface needs to be tighter (e.g. require both the port AND a host hint), the change would land in providerConfig.ts and benefit both call sites (see the sketch below).
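
If the matching surface does need tightening later, the stricter shape might look like this (hypothetical; this PR deliberately keeps the either/or rule):

```ts
// Hypothetical stricter variant: bypass only when BOTH signals agree.
// Not part of this PR; shown to illustrate the follow-up mentioned above.
export function isDefinitelyOllamaEndpoint(baseUrl: string): boolean {
  try {
    const url = new URL(baseUrl);
    const portHint = url.port === "11434";
    const nameHint =
      url.hostname.toLowerCase().includes("ollama") ||
      url.pathname.toLowerCase().includes("ollama");
    return portHint && nameHint; // require both, not either
  } catch {
    return false;
  }
}
```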

Fixes #369

gnanam1990 previously approved these changes Apr 30, 2026
Collaborator

@gnanam1990 left a comment

Thanks for the PR! Scope is tight, fix addresses the root cause, and reusing the existing isLikelyOllamaEndpoint heuristic is the right call — keeps the rules in one place. Tests cover hostname, port, and the negative (non-Ollama remote still requires the key). CI green. LGTM.

@0xfandom
Contributor Author

0xfandom commented May 1, 2026

Any update on this, @kevincodex1?

@0xfandom force-pushed the fix/369-remote-ollama-no-api-key branch 2 times, most recently from cbeb8e5 to 8cfe33b on May 4, 2026 at 08:35
@0xfandom
Contributor Author

0xfandom commented May 4, 2026

Rebased onto current main at 8cfe33b. #979 (Hicap) modified src/services/api/providerConfig.ts, which we also touch (the isLikelyOllamaEndpoint export); the rebase applied cleanly. Hicap's new descriptor wiring lands further down the file, and our export function isLikelyOllamaEndpoint change at line 373 is untouched.

Same fix as before: 47 insertions, 4 deletions across providerConfig.ts + providerValidation.ts + providerValidation.test.ts. bun test src/utils/providerValidation.test.ts → 23/23.

@gnanam1990 — re-approval ping if it still looks good.

@0xfandom 0xfandom requested a review from gnanam1990 May 4, 2026 18:14
Remote Ollama servers (host outside the loopback / RFC1918 range, or
on a domain like ollama.corp.example.com) hit the OPENAI provider
validation gate that requires OPENAI_API_KEY whenever the base URL is
not local. Ollama doesn't need an API key, so the user has to invent a
phantom value to get past startup.

Extend the bypass to recognise likely-Ollama base URLs in addition to
local URLs:
- port 11434 (Ollama default) on any host
- 'ollama' substring in hostname or pathname

isLikelyOllamaEndpoint already encoded these heuristics for tool-call
gating in providerConfig.ts; export it and reuse so the rules stay in
one place.

Fixes Gitlawb#369
@0xfandom force-pushed the fix/369-remote-ollama-no-api-key branch from 8cfe33b to 81e5825 on May 6, 2026 at 06:43
@0xfandom
Contributor Author

0xfandom commented May 6, 2026

Rebased onto current main (post-0.9.2 release). The drift cleared with no conflicts; only src/services/api/providerConfig.ts and src/utils/providerValidation.{ts,test.ts} changed.

bun test src/utils/providerValidation.test.ts
→ 23 pass / 0 fail
bun run build → green

No logic changes since the prior approval.

@0xfandom
Contributor Author

0xfandom commented May 7, 2026

Any update on this, @kevincodex1 @gnanam1990?

Collaborator

@Vasanthdev2004 left a comment

Full review of the current head (81e5825).

Verdict: Approve-ready

What I checked:

  • Reviewed the provider validation change and confirmed the no-key bypass is limited to local provider URLs or URLs matching the existing isLikelyOllamaEndpoint() heuristic.
  • Confirmed the OpenAI remote-provider path still requires OPENAI_API_KEY for normal non-Ollama endpoints.
  • Checked that the heuristic is centralized by exporting the existing provider-config helper instead of duplicating rules.
  • Reviewed the added regression tests for remote Ollama host/port cases and the non-Ollama remote negative case.

Validation I ran locally:

  • bun test src/utils/providerValidation.test.ts

I do not see a remaining blocker on the current head.


Development

Successfully merging this pull request may close: Running with remote Ollama support (#369)

3 participants