
Commit eac3b95

Merge branch 'multi_lora_tpu_v1' into tpu_bgmv_optimisation
Signed-off-by: Akshat Tripathi <[email protected]>
2 parents: e916dcb + 9af1d0d

File tree: 1 file changed (+1 −1 lines)


vllm/config.py

Lines changed: 1 addition & 1 deletion
@@ -2383,7 +2383,7 @@ class LoRAConfig:
     lora_dtype: Optional[Union[torch.dtype, str]] = None
     lora_extra_vocab_size: int = 256
     lora_vocab_padding_size: ClassVar[int] = current_platform\
-        .lora_vocab_padding_size()
+        .lora_vocab_padding_size
     long_lora_scaling_factors: Optional[tuple[float]] = None
     bias_enabled: bool = False

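The patched line drops the call parentheses: current_platform.lora_vocab_padding_size is now read as an attribute rather than invoked as a method. Below is a minimal sketch of the shape of this change, using a hypothetical PlatformSketch stand-in rather than vLLM's actual Platform interface (names and the value 256 are assumptions for illustration only):

from typing import ClassVar


class PlatformSketch:
    # Assumed: the padding size is exposed as a plain class attribute,
    # so callers read it without calling it.
    lora_vocab_padding_size: ClassVar[int] = 256


current_platform = PlatformSketch()


class LoRAConfigSketch:
    # Mirrors the patched line: attribute access, no trailing "()".
    lora_vocab_padding_size: ClassVar[int] = current_platform.lora_vocab_padding_size


print(LoRAConfigSketch.lora_vocab_padding_size)  # -> 256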