fix: chunk large note reads to prevent output-too-large errors (fixes #5)

Add offset and max_chars parameters to obsidian_read_note:
- max_chars (default 50000, max 500000): caps characters returned per call
- offset (default 0): start position for reading, enabling pagination

When content is truncated, a trailer message is appended telling the
caller the total size and the exact offset to pass on the next call.
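The chunk-and-trailer behavior can be sketched as follows. This is an illustrative sketch only: the function name `buildReadResult` and the exact trailer wording are assumptions, not the actual implementation.

```typescript
// Sketch of the chunking logic: slice the note at `offset`, cap the result
// at `maxChars`, and append a continuation trailer when content remains.
// Function name and trailer wording are illustrative assumptions.
function buildReadResult(
  content: string,
  offset: number = 0,
  maxChars: number = 50000,
): string {
  const slice = content.slice(offset, offset + maxChars);
  const end = offset + slice.length;
  if (end >= content.length) {
    return slice; // nothing was cut off, so no trailer is needed
  }
  return (
    slice +
    `\n\n[Truncated: characters ${offset}-${end} of ${content.length}. ` +
    `Pass offset=${end} to read the next chunk.]`
  );
}
```

The trailer rides inside the returned text itself, so no response-schema change is needed for the tool.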

This prevents the 26MB+ responses that caused Claude to reject output
when reading large PDFs stored in an Obsidian vault.
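On the caller side, the offset named in the trailer enables a simple read loop. The sketch below is hypothetical: `readNote` stands in for an `obsidian_read_note` tool call, and the trailer parsing assumes a "Pass offset=N" phrase following a "[Truncated" marker.

```typescript
// Hypothetical client-side pagination loop. `readNote` stands in for a call
// to the obsidian_read_note tool; the trailer format parsed here (a
// "Pass offset=N" phrase after "[Truncated") is an assumption.
type ReadNote = (args: {
  path: string;
  offset: number;
  max_chars: number;
}) => Promise<string>;

async function readWholeNote(readNote: ReadNote, path: string): Promise<string> {
  let offset = 0;
  let out = "";
  for (;;) {
    const chunk = await readNote({ path, offset, max_chars: 50000 });
    const trailerAt = chunk.lastIndexOf("\n\n[Truncated");
    if (trailerAt === -1) {
      return out + chunk; // no trailer: this was the final chunk
    }
    out += chunk.slice(0, trailerAt);
    const next = chunk.match(/Pass offset=(\d+)/);
    if (!next) throw new Error("truncated response without continuation offset");
    offset = Number(next[1]);
  }
}
```

Each call stays under the per-response cap, so no single tool result approaches the 26MB failure case.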

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Date:   2026-04-17 17:36:33 -05:00
Parent: 82d2409fe3
Commit: a0801a82fd

2 changed files with 38 additions and 7 deletions


@@ -104,11 +104,13 @@ export const createNoteSchema = z.object({
 });
 
 // Read note parameters
-export const readNoteSchema = z.union([
-  z.object({ file: noteNameSchema }),
-  z.object({ path: filePathSchema }),
-]).refine(
-  (data) => ('file' in data && data.file) || ('path' in data && data.path),
+export const readNoteSchema = z.object({
+  file: noteNameSchema.optional(),
+  path: filePathSchema.optional(),
+  offset: z.number().int().nonnegative().optional().default(0),
+  max_chars: z.number().int().positive().max(500000).optional().default(50000),
+}).refine(
+  (data) => data.file || data.path,
   { message: 'Either file or path must be provided' }
 );