
b7913

Feb 3, 2026
Meta/llama.cpp CLI vb7913

server: print actual model name in 'model not found' error (#19117)

When I experiment with AI, my environment gets messy fast, and it's not always easy to tell which model my software is trying to load. Including the model name in the error helps with troubleshooting.

Before:

Error: { code = 400, message = "model not found", type = "invalid_request_error" }

After:

Error: { code = 400, message = "model 'toto' not found", type = "invalid_request_error" }

Release builds are available for macOS/iOS, Linux, Windows, and openEuler.