Add native OpenAI runtime opt-in #26947

Draft

kitlangton wants to merge 5 commits into llm-native-prepare-tests from llm-native-runtime-openai

Conversation

Contributor

@kitlangton kitlangton commented May 11, 2026

Summary

  • add OPENCODE_LLM_RUNTIME=native as an experimental session runtime selector
  • route no-tools OpenAI requests through native @opencode-ai/llm streaming while keeping the AI SDK as the default (see the sketch after this list)
  • carry opencode provider auth/base URL/headers into native requests and cover the path with a local mocked OpenAI Responses stream test (a rough sketch of that fixture follows the Test Plan)
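
Roughly how those pieces fit together, as a minimal sketch; the types, helper names, and module boundaries here are assumptions for illustration, not the actual diff:

```ts
// minimal sketch, not the real implementation: types and helpers are illustrative
type ProviderInfo = {
  apiKey: string
  baseURL: string
  headers?: Record<string, string>
}

type ChatRequest = {
  providerID: string
  modelID: string
  tools: Record<string, unknown>
  messages: { role: string; content: string }[]
}

// stand-in for the native @opencode-ai/llm stream; provider auth, base URL,
// and extra headers are carried over from the opencode provider config as-is
async function* nativeOpenAIStream(req: ChatRequest, provider: ProviderInfo) {
  void req
  void provider
  yield { type: "text-delta", text: "…" }
}

// stand-in for the existing AI SDK path, which remains the default
async function* aiSdkStream(req: ChatRequest) {
  void req
  yield { type: "text-delta", text: "…" }
}

function useNativeRuntime(req: ChatRequest): boolean {
  // explicit opt-in, scoped to OpenAI for this first slice; tool-enabled
  // requests are rejected inside the native path (see Notes below)
  return process.env["OPENCODE_LLM_RUNTIME"] === "native" && req.providerID === "openai"
}

export function streamChat(req: ChatRequest, provider: ProviderInfo) {
  return useNativeRuntime(req)
    ? nativeOpenAIStream(req, provider) // experimental native runtime
    : aiSdkStream(req)                  // AI SDK stays the default
}
```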

Notes

  • native runtime is intentionally limited to OpenAI no-tools for this first slice
  • tool-enabled requests fail clearly instead of falling back silently (see the sketch below)
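
A guard along these lines would give that clear failure; the function name and error text are illustrative only:

```ts
// illustrative guard: when the native runtime is selected but the request
// carries tools, raise an explicit error rather than quietly handing the
// request back to the AI SDK
function assertNativeSupported(tools: Record<string, unknown>) {
  if (Object.keys(tools).length > 0) {
    throw new Error(
      "OPENCODE_LLM_RUNTIME=native does not support tool calls yet; " +
        "unset the variable or drop the tools from this request",
    )
  }
}
```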

Test Plan

  • bun run test -- test/session/llm-native.test.ts test/session/llm.test.ts --test-name-pattern native
  • bun typecheck
  • bunx oxlint packages/opencode/src/session/llm.ts packages/opencode/src/session/llm-native.ts packages/opencode/test/session/llm.test.ts packages/opencode/test/session/llm-native.test.ts
  • git diff --check
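
For context on the first command, the mocked OpenAI Responses stream can be served entirely in-process. The sketch below shows the general shape with bun:test; the event names and helpers are assumptions, not the actual fixture:

```ts
import { test, expect } from "bun:test"

// build a local, in-memory SSE response that mimics an OpenAI Responses stream
function mockResponsesStream(chunks: string[]): Response {
  const encoder = new TextEncoder()
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const text of chunks) {
        const event = { type: "response.output_text.delta", delta: text }
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`))
      }
      controller.enqueue(encoder.encode("data: [DONE]\n\n"))
      controller.close()
    },
  })
  return new Response(body, { headers: { "content-type": "text/event-stream" } })
}

test("mocked Responses stream yields the expected deltas", async () => {
  const raw = await mockResponsesStream(["Hello", ", world"]).text()
  const deltas = raw
    .split("\n\n")
    .filter((line) => line.startsWith("data: {"))
    .map((line) => JSON.parse(line.slice("data: ".length)).delta as string)
  expect(deltas.join("")).toBe("Hello, world")
})
```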

Stack

  1. Consume native LLM events in session processing #26639
  2. Add native LLM request adapter #26941
  3. Compile native LLM requests in session tests #26946
  4. Add native OpenAI runtime opt-in #26947 👈 current
  5. Inject native LLM client into session service #27065
