
Commit e58174c

llama : bump max seq limit from 64 to 256 (#15916)
ggml-ci
1 parent: b213fce

File tree

1 file changed: +1 -1 lines changed

src/llama-cparams.h
Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@
 
 #include <cstdint>
 
-#define LLAMA_MAX_SEQ 64
+#define LLAMA_MAX_SEQ 256
 
 struct llama_cparams {
     uint32_t n_ctx; // context size used during inference
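
The practical effect of the change is that a single context can now be created with up to 256 parallel sequences instead of 64. The sketch below shows how a caller might request a larger sequence count through the public llama.h API; it is an illustration only, not part of this commit. The model path is hypothetical, and the n_seq_max field and llama_init_from_model / llama_model_load_from_file names are assumed from recent llama.cpp headers.

// Minimal sketch, assuming the llama.h API of recent llama.cpp releases
// (llama_model_load_from_file, llama_init_from_model, n_seq_max).
#include "llama.h"

#include <cstdio>

int main() {
    llama_backend_init();

    llama_model_params mparams = llama_model_default_params();
    // "model.gguf" is a placeholder path for illustration
    llama_model * model = llama_model_load_from_file("model.gguf", mparams);
    if (model == nullptr) {
        fprintf(stderr, "failed to load model\n");
        return 1;
    }

    llama_context_params cparams = llama_context_default_params();
    cparams.n_ctx     = 8192;
    // With this commit, values up to 256 are within the hard cap
    // (LLAMA_MAX_SEQ); previously the cap was 64.
    cparams.n_seq_max = 128;

    llama_context * ctx = llama_init_from_model(model, cparams);
    if (ctx == nullptr) {
        fprintf(stderr, "failed to create context\n");
        llama_model_free(model);
        return 1;
    }

    // ... decode batches whose tokens are tagged with seq_id values
    //     in the range [0, n_seq_max) ...

    llama_free(ctx);
    llama_model_free(model);
    llama_backend_free();
    return 0;
}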
