llama.cpp/examples/parallel

Simplified simulation of serving incoming requests in parallel
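
A minimal invocation sketch of the example, assuming the common llama.cpp command-line options of this period (`-np` for the number of parallel decoding slots, `-ns` for the total number of simulated requests, `-cb` to enable continuous batching); the model path is a placeholder, and the exact flag set supported by your build is listed by `./parallel --help`:

```sh
# Simulate 64 incoming requests served by 8 parallel slots with
# continuous batching enabled (flags assumed; verify with --help).
./parallel -m models/model.gguf -c 4096 -np 8 -ns 64 -cb
```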