Fix memory consolidation truncation: set max_tokens=16384

Consolidation was failing because max_tokens defaulted to 4096,
causing Haiku's response to be truncated mid-JSON (finish_reason=max_tokens).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: wylab
Date:   2026-02-14 18:45:35 +01:00
Parent: 9136cca1ff
Commit: 84268edf01


@@ -426,6 +426,7 @@ Respond with ONLY valid JSON, no markdown fences."""
     ],
     model="claude-haiku-4-5",
     thinking_budget=0,
+    max_tokens=16384,
 )
 text = (response.content or "").strip()
 if text.startswith("```"):
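
For illustration only, a minimal sketch of the failure mode and the fix, written against the public Anthropic Python SDK rather than this repository's internal client wrapper; the consolidate_memory helper, its prompt argument, and the error handling are assumptions, not code from this project:

import json

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment


def consolidate_memory(prompt: str) -> dict:
    # Ask Haiku for a JSON-only reply with a generous output budget so a
    # large consolidation result is not cut off at the token limit.
    response = client.messages.create(
        model="claude-haiku-4-5",
        max_tokens=16384,
        messages=[{"role": "user", "content": prompt}],
    )

    # A truncated reply is almost never valid JSON, so fail loudly instead of
    # handing json.loads a string that was cut mid-object.
    if response.stop_reason == "max_tokens":
        raise RuntimeError("consolidation output truncated; increase max_tokens")

    text = response.content[0].text.strip()
    # Defensively strip markdown fences, mirroring the cleanup in the diff above.
    if text.startswith("```"):
        text = text.strip("`").removeprefix("json").strip()
    return json.loads(text)

Checking the stop reason before parsing is what turns the silent finish_reason=max_tokens truncation described in the commit message into an explicit error rather than a confusing JSON parse failure.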