design(agent-server): REST LLM profiles for conversation start and restore#2485
Draft
Conversation
Contributor
📁 PR Artifacts Notice — This PR contains a
Contributor
Python API breakage checks — ✅ PASSED
Contributor
REST API breakage checks (OpenAPI) — ✅ PASSED
enyst commented Mar 19, 2026
ACPAgent uses a sentinel agent.llm and ACP-native model selection, so mirroring the llm_profile start/switch flow on ACP contracts would be a no-op. Scope the design doc to standard conversations and call out ACP as a separate future design space. Co-authored-by: openhands <openhands@all-hands.dev>
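To make the "no-op" argument concrete, here is a hedged sketch of the guard this implies. The names `ACP_SENTINEL_MODEL`, `is_sentinel`, and `apply_profile` are illustrative, not the SDK's actual API; the point is only that a profile switch against a sentinel `agent.llm` changes nothing.

```python
# Illustrative sketch: profile switching is a no-op when the agent's LLM is a
# sentinel and model selection happens on the ACP side. All names are assumed.
from dataclasses import dataclass

ACP_SENTINEL_MODEL = "acp/native"  # hypothetical sentinel value for agent.llm


@dataclass
class LLM:
    model: str

    @property
    def is_sentinel(self) -> bool:
        return self.model == ACP_SENTINEL_MODEL


def apply_profile(agent_llm: LLM, profile_model: str) -> LLM:
    """Return the LLM a conversation would run with after a profile switch."""
    if agent_llm.is_sentinel:
        # ACP-native selection: the profile's model is ignored, so mirroring
        # the start/switch flow onto ACP contracts would change nothing.
        return agent_llm
    return LLM(model=profile_model)
```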
Accept the review suggestion to keep the agent-server default LLM profile store aligned with the SDK's existing ~/.openhands/profiles location, and clarify that OH_LLM_PROFILES_PATH is still available for explicit overrides. Co-authored-by: openhands <openhands@all-hands.dev>
Explain the agent-server start-model inheritance explicitly and add the tradeoffs for placing llm_profile_id on the shared/base ACP path versus on the standard start request plus StoredConversation. Co-authored-by: openhands <openhands@all-hands.dev>
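A minimal sketch of the start-model inheritance being asked for, assuming an explicit `llm_profile_id` on the start request takes precedence over one persisted in `StoredConversation`, with a server default as the final fallback. The class shapes and the `effective_profile` helper are assumptions, not the actual agent-server models.

```python
# Hypothetical precedence chain for the profile a conversation starts with:
# request field > restored StoredConversation field > server default.
from dataclasses import dataclass
from typing import Optional


@dataclass
class StartRequest:
    llm_profile_id: Optional[str] = None


@dataclass
class StoredConversation:
    llm_profile_id: Optional[str] = None


def effective_profile(
    req: StartRequest,
    stored: Optional[StoredConversation],
    default: str = "default",
) -> str:
    if req.llm_profile_id:
        return req.llm_profile_id
    if stored is not None and stored.llm_profile_id:
        return stored.llm_profile_id
    return default
```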
enyst commented Mar 20, 2026
Rework the design doc so A1 reuses the existing LLMProfileStore directly for conversation start, switch, and restore. Treat /api/llm-profiles, cipher-aware profile writes, and an explicit profile-store path override as optional follow-up layers rather than new required server abstractions. Co-authored-by: openhands <openhands@all-hands.dev>
enyst commented Mar 22, 2026
Rewrite the design so A1 clearly requires /api/llm-profiles CRUD for REST clients while still reusing the existing LLMProfileStore as the only backing store. Clarify that conversation start, switch, and restore all use that same store, and keep the no-redundant-store constraint explicit. Co-authored-by: openhands <openhands@all-hands.dev>
Collaborator
[Automatic Post]: It has been a while since there was any activity on this PR. @enyst, are you still working on it? If so, please go ahead; if not, please request review, close it, or request that someone else follow up.
Summary
- `/api/llm-profiles` as REST CRUD over the existing `LLMProfileStore`, instead of reviving per-conversation `/llm`

Context investigated
- `docs/llm_switching.md`, `LLMProfileStore`, `LocalConversation.switch_profile()`, and the agent-server start/restore flow

Artifact
- `.pr/design-llm-profiles-agent-server.md`

Recommendation in brief
- `/api/llm-profiles` CRUD as a thin wrapper over the existing `LLMProfileStore`
- `llm_profile_id`

Refs #1451
Agent Server images for this PR
• GHCR package: https://github.com/OpenHands/agent-sdk/pkgs/container/agent-server
Variants & Base Images
- `eclipse-temurin:17-jdk`
- `nikolaik/python-nodejs:python3.13-nodejs22`
- `golang:1.21-bookworm`

Pull (multi-arch manifest)
```
# Each variant is a multi-arch manifest supporting both amd64 and arm64
docker pull ghcr.io/openhands/agent-server:08aad25-python
```

Run
All tags pushed for this build
About Multi-Architecture Support
- Each variant tag (e.g. `08aad25-python`) is a multi-arch manifest supporting both amd64 and arm64
- Architecture-specific tags (e.g. `08aad25-python-amd64`) are also available if needed