b7716

Jan 13, 2026
Meta / llama.cpp CLI vb7716

server : add arg for disabling prompt caching (#18776)

  • server : add arg for disabling prompt caching

Disabling prompt caching is useful for clients that are restricted to sending only OpenAI-compatible requests and want deterministic responses (a usage sketch follows the commit list below).

  • address review comments

  • address review comments
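
A minimal client-side sketch of the scenario described in the first bullet: an OpenAI-compatible request sent to a local llama-server, with sampling pinned for reproducibility. This entry does not name the new argument, so the server launch command in the comment is only indicative; the base_url, api_key, and model values are placeholders as well.

```python
# Sketch only: the exact name of the new no-prompt-caching argument is not
# given in this entry; check `llama-server --help` (b7716 or later) for it.
#
#   llama-server -m model.gguf --port 8080   # plus the new argument from #18776

from openai import OpenAI

# llama-server exposes an OpenAI-compatible API under /v1.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-no-key-required")

resp = client.chat.completions.create(
    model="local-model",  # typically ignored by a single-model llama-server
    messages=[{"role": "user", "content": "Say hello."}],
    temperature=0.0,      # greedy sampling
    seed=42,              # fixed seed; with server-side prompt caching disabled,
                          # repeated identical requests should produce matching output
)
print(resp.choices[0].message.content)
```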

Prebuilt binaries: macOS/iOS, Linux, Windows, openEuler.