
refactor(core): decouple from @opencode-ai/sdk with host-agnostic types and LLMClient #76

Merged

BYK merged 1 commit into main from refactor/host-abstraction, Apr 18, 2026

Conversation


@BYK BYK commented Apr 18, 2026

Summary

Fully decouples @loreai/core from the OpenCode SDK. The core package now has zero dependency on @opencode-ai/sdk or @opencode-ai/plugin — it defines its own types and interfaces.

New abstractions

Lore types (packages/core/src/types.ts)

| Type | Purpose |
| --- | --- |
| `LoreMessage` | `LoreUserMessage \| LoreAssistantMessage`, discriminated on `.role` |
| `LorePart` | `LoreTextPart \| LoreReasoningPart \| LoreToolPart \| LoreGenericPart`, with `isTextPart()`, `isReasoningPart()`, `isToolPart()` type guards |
| `LoreMessageWithParts` | `{ info: LoreMessage; parts: LorePart[] }`, the unit that hooks operate on |
| `LLMClient` | Single-method interface: `.prompt(system, user, opts?)` returning `Promise<string \| null>` |
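A minimal sketch of what these declarations might look like. Only the union shapes, the guard names, and the `LLMClient.prompt` signature come from this PR; the individual fields (`id`, `text`, `tool`, the `opts` shape) are illustrative assumptions, not the real `packages/core/src/types.ts`:

```typescript
// Host-agnostic message types, discriminated on .role
export interface LoreUserMessage { role: "user"; id: string }
export interface LoreAssistantMessage { role: "assistant"; id: string }
export type LoreMessage = LoreUserMessage | LoreAssistantMessage;

// Part types, discriminated on .type. LoreGenericPart is a catch-all here;
// the real definition likely excludes the known discriminants.
export interface LoreTextPart { type: "text"; text: string }
export interface LoreReasoningPart { type: "reasoning"; text: string }
export interface LoreToolPart { type: "tool"; tool: string }
export interface LoreGenericPart { type: string }
export type LorePart = LoreTextPart | LoreReasoningPart | LoreToolPart | LoreGenericPart;

// Type guards for safe narrowing in consumers like gradient.ts
export const isTextPart = (p: LorePart): p is LoreTextPart => p.type === "text";
export const isReasoningPart = (p: LorePart): p is LoreReasoningPart => p.type === "reasoning";
export const isToolPart = (p: LorePart): p is LoreToolPart => p.type === "tool";

// The unit that hooks operate on
export interface LoreMessageWithParts { info: LoreMessage; parts: LorePart[] }

// The single-method seam any host can implement
export interface LLMClient {
  prompt(system: string, user: string, opts?: { agent?: string }): Promise<string | null>;
}
```

The discriminated unions let core code switch on `.role`/`.type` without any SDK import, and the guards keep that narrowing in one place.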

OpenCode adapter (packages/opencode/src/llm-adapter.ts)

createOpenCodeLLMClient() implements LLMClient by wrapping client.session.create() + client.session.prompt() with the existing agent-not-found retry logic.
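Roughly, the adapter shape looks like the sketch below. The session API field names (`title`, `sessionID`, `text`) and the retry condition are assumptions; the PR only states that the adapter wraps `session.create` + `session.prompt` and keeps the existing agent-not-found retry:

```typescript
// Assumed shape of the slice of the OpenCode SDK the adapter touches.
interface SessionAPI {
  create(opts: { title: string }): Promise<{ id: string }>;
  prompt(opts: { sessionID: string; agent?: string; system: string; text: string }): Promise<string | null>;
}

export interface LLMClient {
  prompt(system: string, user: string, opts?: { agent?: string }): Promise<string | null>;
}

export function createOpenCodeLLMClient(client: { session: SessionAPI }): LLMClient {
  let sessionID: string | null = null; // lazily created, reused across prompts
  return {
    async prompt(system, user, opts) {
      sessionID ??= (await client.session.create({ title: "lore worker" })).id;
      const run = () =>
        client.session.prompt({ sessionID: sessionID!, agent: opts?.agent, system, text: user });
      try {
        return await run();
      } catch (err) {
        // Placeholder for the real agent-not-found retry condition.
        if (String(err).toLowerCase().includes("agent")) return run();
        throw err;
      }
    },
  };
}
```

Core never sees the session: it only calls `llm.prompt(...)`, and the adapter owns creation, reuse, and retries.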

What moved

  • promptWorker() session lifecycle → packages/opencode/src/llm-adapter.ts
  • ensureWorkerSession() per module → deleted (adapter handles session lifecycle)
  • 10 OpenCode-specific promptWorker tests → deleted (logic lives in adapter now)
  • @opencode-ai/sdk devDep removed from core's package.json

Zero behavior change

  • Tests: 350 pass (8 fewer — removed OpenCode-specific mocks, added 2 tracking tests)
  • Both packages typecheck clean
  • Core barrel exports all new types + guards for downstream consumption

Why this matters

@loreai/core can now be consumed by any host that implements LLMClient:

  • OpenCode plugin → wraps SDK (today)
  • Pi extension → wraps complete() from @mariozechner/pi-ai (next PR)
  • Standalone → direct fetch() to provider APIs (future)
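For the standalone case, an `LLMClient` could be nothing more than a thin `fetch()` wrapper. This is illustrative only and not part of this PR: the endpoint, payload shape, and model name below are placeholders (an OpenAI-style chat-completions API), and the fetch function is injectable so the sketch is testable:

```typescript
export interface LLMClient {
  prompt(system: string, user: string, opts?: { agent?: string }): Promise<string | null>;
}

type FetchLike = (
  url: string,
  init: { method: string; headers: Record<string, string>; body: string },
) => Promise<{ json(): Promise<any> }>;

// Hypothetical standalone host: LLMClient backed by a direct HTTP call.
export function createFetchLLMClient(
  apiKey: string,
  doFetch: FetchLike = fetch as unknown as FetchLike,
  url = "https://api.openai.com/v1/chat/completions", // placeholder endpoint
): LLMClient {
  return {
    async prompt(system, user) {
      const res = await doFetch(url, {
        method: "POST",
        headers: { "content-type": "application/json", authorization: `Bearer ${apiKey}` },
        body: JSON.stringify({
          model: "gpt-4o-mini", // placeholder model
          messages: [
            { role: "system", content: system },
            { role: "user", content: user },
          ],
        }),
      });
      const data = await res.json();
      return data.choices?.[0]?.message?.content ?? null;
    },
  };
}
```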

…art types and LLMClient

Introduces host-agnostic types and interfaces so @loreai/core can run
under any host (OpenCode, Pi, standalone), not just OpenCode:

Types (packages/core/src/types.ts):
- LoreMessage = LoreUserMessage | LoreAssistantMessage (discriminated on .role)
- LorePart = LoreTextPart | LoreReasoningPart | LoreToolPart | LoreGenericPart
  with isTextPart(), isReasoningPart(), isToolPart() type guards
- LoreMessageWithParts = { info: LoreMessage; parts: LorePart[] }
- LLMClient interface with a single .prompt(system, user, opts?) method

What changed:
- temporal.ts + gradient.ts: import Lore types instead of @opencode-ai/sdk
- gradient.ts: use type guard functions for safe narrowing
- distillation.ts + curator.ts + search.ts: accept LLMClient instead of
  OpenCode Client; removed ensureWorkerSession/workerSessions/promptWorker
  (OpenCode-specific session lifecycle now lives in the adapter)
- worker.ts: trimmed to just workerSessionIDs tracking + LLMClient re-export
- @opencode-ai/sdk removed from core's devDependencies entirely
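The consumer-side change can be pictured as follows. `summarize` is a hypothetical stand-in for the distillation/curator/search entry points, which per the list above now take an `LLMClient` instead of an OpenCode `Client`:

```typescript
export interface LLMClient {
  prompt(system: string, user: string, opts?: { agent?: string }): Promise<string | null>;
}

// Before: core functions took an OpenCode Client and managed worker sessions
// via ensureWorkerSession()/promptWorker(). After: the host injects any
// LLMClient, and core stays free of @opencode-ai/sdk.
export async function summarize(llm: LLMClient, text: string): Promise<string> {
  const out = await llm.prompt(
    "You are a concise summarizer.", // illustrative system prompt
    `Summarize:\n\n${text}`,
  );
  return out ?? "";
}
```

In tests, this also means core logic can be exercised with a two-line stub instead of mocking the whole SDK.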

OpenCode adapter (packages/opencode/src/llm-adapter.ts):
- createOpenCodeLLMClient() implements LLMClient by wrapping the OpenCode
  SDK's session.create + session.prompt with agent-not-found retry
  (logic extracted from the old core worker.ts)

Tests: 350 pass (down from 358 — 10 OpenCode-specific promptWorker tests
moved to adapter, 2 worker tracking tests added). No behavior change.
@BYK BYK enabled auto-merge (squash) April 18, 2026 19:22
@BYK BYK merged commit 7861b1d into main Apr 18, 2026
1 check passed
@BYK BYK deleted the refactor/host-abstraction branch April 18, 2026 19:22
