
b7713

Jan 12, 2026

examples : add --kv-unified to batched example (#18774)

This commit adds the --kv-unified flag to the batched example. The README.md specifies this flag as required, but until now it was not available as a command-line option for the batched example.
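For illustration, a run of the batched example with the new flag might look like the following (the model path and prompt are placeholders; -np sets the number of parallel sequences to decode):

./llama-batched -m model.gguf -p "Hello my name is" -np 4 --kv-unified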

The motivation is that passing this flag as the README instructs leads to an error about the flag not being recognized, while omitting it makes the example fail with the following error:

split_equal: sequential split is not supported when there are coupled
sequences in the input batch (you may need to use the -kvu flag)
decode: failed to find a memory slot for batch of size 4
main: llama_decode() failed
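For context, here is a minimal sketch of what the flag ultimately toggles on the C API side, assuming the kv_unified field of llama_context_params and the current model/context creation functions in llama.h; this is an illustration of the setting, not the commit's actual diff:

#include "llama.h"

int main(void) {
    // load the model (path is a placeholder)
    llama_model_params mparams = llama_model_default_params();
    llama_model * model = llama_model_load_from_file("model.gguf", mparams);
    if (model == NULL) {
        return 1;
    }

    llama_context_params cparams = llama_context_default_params();
    cparams.n_ctx = 2048;
    // use a single unified KV cache buffer shared across all sequences;
    // this is what --kv-unified / -kvu toggles (assumption: the field is
    // named kv_unified in the current llama.h)
    cparams.kv_unified = true;

    llama_context * ctx = llama_init_from_model(model, cparams);
    if (ctx == NULL) {
        llama_model_free(model);
        return 1;
    }

    // ... build a batch with multiple coupled sequences and call llama_decode ...

    llama_free(ctx);
    llama_model_free(model);
    return 0;
}

Without kv_unified enabled, the memory module falls back to a sequential split, which is what triggers the split_equal error above when the batch contains coupled sequences.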
