
fix(model_metadata): use /v1/props endpoint for llama.cpp context detection#2403

Merged
teknium1 merged 1 commit into main from hermes/hermes-31d7db3b
Mar 22, 2026

Conversation

@teknium1
Contributor

Cherry-pick of #2230 by @RufusLin.

Recent llama.cpp builds moved /props to /v1/props. Both detection paths (server type + context length) now try /v1/props first, falling back to /props for older builds.

All 94 model_metadata tests pass.

fix(model_metadata): use /v1/props endpoint for llama.cpp context detection

Recent versions of llama.cpp moved the server properties endpoint from
/props to /v1/props (consistent with the /v1 API prefix convention).

The server-type detection path and the n_ctx reading path both used the
old /props URL, which returns 404 on current builds. This caused the
allocated context window size to fall back to a hardcoded default,
resulting in an incorrect (too small) value being displayed in the TUI
context bar.

Fix: try /v1/props first, fall back to /props for backward compatibility
with older llama.cpp builds. Both paths are now handled gracefully.
@teknium1 teknium1 merged commit ec22635 into main Mar 22, 2026
1 check passed
