llama.cpp/examples/parallel

Latest commit fa0e677820 by Georgi Gerganov: "llama : extend batch API to select which logits to output" (2023-09-19 00:24:13 +03:00)
Files:
  CMakeLists.txt   parallel : example for serving multiple users in parallel    2023-09-18 20:37:28 +03:00
  parallel.cpp     llama : extend batch API to select which logits to output    2023-09-19 00:24:13 +03:00