
feat: workspace.get_llm() and get_secrets() for SaaS credential inheritance#2409

Draft
xingyaoww wants to merge 6 commits into main from feat/cloud-workspace-get-llm-secrets

Conversation

xingyaoww (Collaborator) commented Mar 13, 2026

Summary

Adds get_llm() and get_secrets() methods to OpenHandsCloudWorkspace, enabling SDK-created conversations to inherit the user's SaaS credentials.

Design

  • get_llm(**kwargs): Calls GET /api/v1/users/me?expose_secrets=true with both Bearer token and X-Session-API-Key headers (dual auth). Extracts llm_model, llm_api_key, llm_base_url and returns a fully usable LLM instance. User-provided kwargs override SaaS settings.
  • get_secrets(names=None): Calls GET /sandboxes/{id}/settings/secrets (X-Session-API-Key auth) for names only, then builds LookupSecret references with env_headers. Raw secret values never transit through the SDK client — they are resolved lazily by the agent-server inside the sandbox.
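The override semantics described above can be sketched as follows. This is an illustrative reimplementation, not the workspace's actual code: the helper name `build_llm_kwargs` is hypothetical, and the mapping of the SaaS keys (`llm_model`, `llm_api_key`, `llm_base_url`) onto the LLM constructor's `model`/`api_key`/`base_url` parameters is an assumption based on the description.

```python
def build_llm_kwargs(saas_settings: dict, user_kwargs: dict) -> dict:
    """Merge SaaS-provided LLM settings with caller overrides (sketch).

    SaaS keys llm_model / llm_api_key / llm_base_url become defaults;
    any kwarg the user passes explicitly wins on conflict.
    """
    defaults = {
        "model": saas_settings.get("llm_model"),
        "api_key": saas_settings.get("llm_api_key"),
        "base_url": saas_settings.get("llm_base_url"),
    }
    # Drop unset SaaS defaults, then apply user overrides on top.
    merged = {k: v for k, v in defaults.items() if v is not None}
    merged.update(user_kwargs)
    return merged
```

With this shape, `workspace.get_llm(model="o3")` would keep the SaaS-provided API key but swap the model.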

Security

get_llm() sends both the Bearer token (user identity) and the sandbox session key (active sandbox proof). The server verifies the session key belongs to a sandbox owned by the same user. This prevents a leaked API key from being used to extract raw LLM credentials.
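The dual-auth request described above might be assembled like this. This is a sketch using the `requests` library rather than the SDK's internal helper; the function name is hypothetical, while the endpoint, query param, and header names come from the PR description.

```python
import requests

def build_get_llm_request(base_url: str, api_key: str,
                          session_api_key: str) -> requests.PreparedRequest:
    """Prepare the dual-auth GET /api/v1/users/me request (illustrative)."""
    req = requests.Request(
        "GET",
        f"{base_url}/api/v1/users/me",
        params={"expose_secrets": "true"},
        headers={
            "Authorization": f"Bearer {api_key}",   # user identity
            "X-Session-API-Key": session_api_key,   # proof of an active sandbox
        },
    )
    return req.prepare()
```

The server-side check then pairs the two: the Bearer token names the user, and the session key must belong to a sandbox that user owns.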

Usage

with OpenHandsCloudWorkspace(...) as workspace:
    llm = workspace.get_llm()  # calls /users/me?expose_secrets=true
    agent = Agent(llm=llm, tools=get_default_tools())

    conversation = Conversation(agent=agent, workspace=workspace)
    conversation.update_secrets(workspace.get_secrets())  # LookupSecret refs
    conversation.send_message("Analyze this repo")

Companion PR

Tests

  • 8 workspace tests (get_llm, get_secrets: happy path, overrides, no-key, filtering, errors)

Resolves OpenHands/OpenHands#13268


Agent Server images for this PR

GHCR package: https://github.com/OpenHands/agent-sdk/pkgs/container/agent-server

Variants & Base Images

Variant Architectures Base Image Docs / Tags
java amd64, arm64 eclipse-temurin:17-jdk Link
python amd64, arm64 nikolaik/python-nodejs:python3.13-nodejs22 Link
golang amd64, arm64 golang:1.21-bookworm Link

Pull (multi-arch manifest)

# Each variant is a multi-arch manifest supporting both amd64 and arm64
docker pull ghcr.io/openhands/agent-server:78ff036-python

Run

docker run -it --rm \
  -p 8000:8000 \
  --name agent-server-78ff036-python \
  ghcr.io/openhands/agent-server:78ff036-python

All tags pushed for this build

ghcr.io/openhands/agent-server:78ff036-golang-amd64
ghcr.io/openhands/agent-server:78ff036-golang_tag_1.21-bookworm-amd64
ghcr.io/openhands/agent-server:78ff036-golang-arm64
ghcr.io/openhands/agent-server:78ff036-golang_tag_1.21-bookworm-arm64
ghcr.io/openhands/agent-server:78ff036-java-amd64
ghcr.io/openhands/agent-server:78ff036-eclipse-temurin_tag_17-jdk-amd64
ghcr.io/openhands/agent-server:78ff036-java-arm64
ghcr.io/openhands/agent-server:78ff036-eclipse-temurin_tag_17-jdk-arm64
ghcr.io/openhands/agent-server:78ff036-python-amd64
ghcr.io/openhands/agent-server:78ff036-nikolaik_s_python-nodejs_tag_python3.13-nodejs22-amd64
ghcr.io/openhands/agent-server:78ff036-python-arm64
ghcr.io/openhands/agent-server:78ff036-nikolaik_s_python-nodejs_tag_python3.13-nodejs22-arm64
ghcr.io/openhands/agent-server:78ff036-golang
ghcr.io/openhands/agent-server:78ff036-java
ghcr.io/openhands/agent-server:78ff036-python

About Multi-Architecture Support

  • Each variant tag (e.g., 78ff036-python) is a multi-arch manifest supporting both amd64 and arm64
  • Docker automatically pulls the correct architecture for your platform
  • Individual architecture tags (e.g., 78ff036-python-amd64) are also available if needed

Add methods to OpenHandsCloudWorkspace that call the new SaaS API
endpoints to retrieve the user's LLM configuration and custom secrets:

- get_llm(**llm_kwargs): Fetches LLM settings from the user's SaaS
  account and returns a configured LLM instance. User kwargs override
  SaaS defaults.
- get_secrets(names=None): Fetches custom secrets and returns a
  dict[str, str] compatible with conversation.update_secrets().

These methods enable SDK users to inherit their SaaS credentials while
retaining full control over agent customization.

Depends on OpenHands/OpenHands#13306 for the server-side API endpoints.

Related: OpenHands/OpenHands#13268

Co-authored-by: openhands <openhands@all-hands.dev>
github-actions bot commented Mar 13, 2026

API breakage checks (Griffe)

Result: Passed

Action log

github-actions bot commented Mar 13, 2026

Agent server REST API breakage checks (OpenAPI)

Result: Passed

Action log

…t-backed configs

SDK LLM changes:
- LLM.api_key now accepts str | SecretStr | SecretSource | None
- Validator passes through SecretSource instances; deserialises dicts
- Serializer delegates to SecretSource.model_dump() for round-tripping
- _get_litellm_api_key_value() resolves SecretSource.get_value() lazily
- _init_model_info_and_caps() skips network for SecretSource api_key

OpenHandsCloudWorkspace changes:
- get_llm() calls sandbox-scoped /settings/llm (SESSION_API_KEY auth)
  and returns LLM with api_key=LookupSecret — raw key never reaches client
- get_secrets() calls /settings/secrets for names, returns dict of
  LookupSecret instances pointing to per-secret endpoints
- Added _send_settings_request() for SESSION_API_KEY-authenticated calls

Co-authored-by: openhands <openhands@all-hands.dev>
xingyaoww changed the title from "DRAFT: feat: add get_llm() and get_secrets() to OpenHandsCloudWorkspace" to "feat: LLM api_key accepts SecretSource; workspace returns LookupSecret-backed configs" on Mar 13, 2026
@github-actions

Coverage

Coverage Report

File                                                         Stmts   Miss  Cover
openhands-sdk/openhands/sdk/llm/llm.py                         491     79    83%
openhands-workspace/openhands/workspace/cloud/workspace.py     224    175    21%
TOTAL                                                        20024   5860    70%

Missing lines:
- llm.py: 420, 447, 500, 725, 831, 833–834, 862, 908, 919–921, 925–929, 937–939, 949–951, 954–955, 959, 961–962, 964, 1022, 1171–1172, 1369–1370, 1379, 1392, 1394–1399, 1401–1418, 1421–1425, 1427–1428, 1434–1443, 1494, 1496
- workspace.py: 91–93, 99, 102–103, 111, 115, 117–122, 130–131, 133, 136, 139–141, 145, 148–149, 152, 155–156, 160, 163–165, 168, 174, 176–178, 187–190, 200, 203, 208, 210–211, 213–215, 217, 219–220, 223–226, 228–229, 231–232, 234, 236–238, 241–242, 246–257, 261–262, 264–265, 273–274, 276–278, 280, 291, 304–305, 307–310, 314, 317–318, 321–323, 325–333, 335, 339–340, 342–345, 347–348, 354–356, 358–365, 374, 382, 409, 411–412, 414–415, 420–426, 429, 431, 466, 468–469, 471, 474, 476–481, 487, 493–494, 496–498, 500–508, 510, 513, 516, 519

LookupSecret now supports env_headers — a mapping of header name to
environment variable name.  Headers are resolved from os.environ at
get_value() call time.  This ensures the SESSION_API_KEY is never
embedded in the serialized LookupSecret; only the env var *name*
travels over the wire.  Resolution only succeeds inside the sandbox
where the env var is set.

- LookupSecret: add env_headers field, merge into headers in get_value()
- LLM: add assert to narrow dict type for pyright
- Workspace: use env_headers instead of raw headers in get_llm/get_secrets
- Tests: 3 new env_headers enforcement tests (28 total SDK tests)
- Fix pyright errors in examples and tests

Co-authored-by: openhands <openhands@all-hands.dev>
Per feedback, LookupSecret is only needed for secrets — not LLM config.
Reverted all LLM.api_key SecretSource changes (api_key stays str|SecretStr|None).
workspace.get_llm() now returns a real LLM with the raw api_key.
Deleted test_llm_secret_source_api_key.py (no longer applicable).

- LLM class: reverted to main (no SecretSource support)
- workspace.get_llm(): builds LLM from plain JSON response
- workspace.get_secrets(): still uses LookupSecret+env_headers
- 8 workspace tests pass

Co-authored-by: openhands <openhands@all-hands.dev>
xingyaoww changed the title from "feat: LLM api_key accepts SecretSource; workspace returns LookupSecret-backed configs" to "feat: workspace.get_llm() and get_secrets() for SaaS credential inheritance" on Mar 13, 2026
…ings/llm

The OpenHands app server already has GET /api/v1/users/me which returns
full user settings including llm_model, llm_api_key, llm_base_url.
A new expose_secrets query param returns the raw api_key.

get_llm() now calls /users/me?expose_secrets=true via _send_api_request
(Bearer token auth) instead of /settings/llm via _send_settings_request
(X-Session-API-Key auth), eliminating the need for a dedicated LLM
settings endpoint.

Co-authored-by: openhands <openhands@all-hands.dev>
The server now requires a valid session key when expose_secrets=true.
get_llm() includes the sandbox session key header alongside the
Bearer token, proving there is an active sandbox owned by the caller.

Co-authored-by: openhands <openhands@all-hands.dev>


Development

Successfully merging this pull request may close these issues.

feat: SDK-created conversations should inherit SaaS settings (credentials, repo context)
