diff --git a/AGENTS.md b/AGENTS.md
index 98a3bdb..a953055 100644
--- a/AGENTS.md
+++ b/AGENTS.md
@@ -28,7 +28,7 @@
 * **LTM injection pipeline: system transform → forSession → formatKnowledge → gradient deduction**: LTM injected via experimental.chat.system.transform hook. getLtmBudget() computes ceiling as (contextLimit - outputReserved - overhead) * ltmFraction (default 10%, configurable 2-30%). forSession() loads project-specific entries unconditionally + cross-project entries scored by term overlap, greedy-packs into budget. formatKnowledge() renders as markdown. setLtmTokens() records consumption so gradient deducts it. Key: LTM goes into output.system (system prompt) — invisible to tryFit(), counts against overhead budget.
-* **Monorepo structure: @loreai/core + opencode-lore packages with Bun workspaces**: Lore uses a Bun workspace monorepo with two packages: `packages/core/` (`@loreai/core`) contains all runtime-agnostic logic (db, ltm, gradient, search, temporal, distillation, curator, etc.) with a barrel `src/index.ts`. `packages/opencode/` (`opencode-lore`) contains the OpenCode plugin entry (`src/index.ts`), recall tool (`src/reflect.ts`), scripts, eval harness, and tests that depend on `@opencode-ai/plugin`. Root `package.json` is private with `workspaces: ["packages/*"]`. Tests run via `bun test` from root with preload at `packages/core/test/setup.ts` (configured in `bunfig.toml`). `tsconfig.base.json` at root, each package extends it. OpenCode package depends on `@loreai/core` as `workspace:*`.
+* **Monorepo structure: @loreai/core + opencode-lore packages with Bun workspaces**: Lore uses a Bun workspace monorepo with two packages: `packages/core/` (`@loreai/core`) contains all runtime-agnostic logic (db, ltm, gradient, search, temporal, distillation, curator, etc.) with a barrel `src/index.ts`. `packages/opencode/` (`opencode-lore`) contains the OpenCode plugin entry (`src/index.ts`), recall tool (`src/reflect.ts`), scripts, eval harness, and tests that depend on `@opencode-ai/plugin`. Root `package.json` is private with `workspaces: ["packages/*"]` but MUST have `main` and `exports` pointing to `./packages/opencode/src/index.ts` — this trampoline is required because OpenCode's `file:///` plugin loader resolves from the repo root. Without it, plugin loading silently fails (no entry point found). Tests run via `bun test` from root with preload at `packages/core/test/setup.ts`.
 * **OpenCode plugin SDK has no embedding API — vector search blocked**: The OpenCode plugin SDK (`@opencode-ai/plugin`, `@opencode-ai/sdk`) exposes only session/chat/tool operations. There is no `client.embed()`, embeddings endpoint, or raw model inference API. The only LLM access is `client.session.prompt()` which creates full chat roundtrips through the agentic loop. This means Lore cannot do vector/embedding search without either: (1) OpenCode adding an embedding API, or (2) direct `fetch()` to provider APIs bypassing the SDK (fragile — requires key extraction from `client.config.providers()`). The FTS5 + RRF search infrastructure is designed to be additive — vector search would layer on top as another RRF input list, not replace BM25.