
Commit 59c8dc3

BenjaminBossan authored and Conzel committed
FIX Bug when merging negatively weighted adapters (#2918)
See #2796 (comment). There was a bug in the handling of negative weights when merging multiple LoRA adapters into a single one using add_weighted_adapter (#2811). It should now be handled correctly. The accompanying tests did not reveal the error because they compared A and B separately, not after multiplying them (the "delta weight").
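
As a quick illustration of the point about delta weights, here is a minimal sketch, not the test from this PR: the toy model, adapter names, and merge weights are made up, and it assumes a PEFT version that includes this fix. The reliable check after a merge is to compare scaling * lora_B @ lora_A, since the sign of a negative merge weight may end up in either factor and only shows up after the multiplication.

# Minimal sketch, not the actual test from this PR; model and adapter names
# are illustrative. Assumes a PEFT version that includes this fix.
import torch
from torch import nn
from peft import LoraConfig, get_peft_model

base = nn.Sequential(nn.Linear(16, 16))
config_a = LoraConfig(target_modules=["0"], r=8, init_lora_weights=False)
config_b = LoraConfig(target_modules=["0"], r=8, init_lora_weights=False)
model = get_peft_model(base, config_a, adapter_name="adapter_a")
model.add_adapter("adapter_b", config_b)

# Merge the two adapters into a new one; note the negative weight.
model.add_weighted_adapter(
    adapters=["adapter_a", "adapter_b"],
    weights=[1.0, -0.5],
    adapter_name="merged",
    combination_type="cat",  # "cat" concatenates ranks, so there are no cross terms
)

layer = model.base_model.model[0]  # the LoRA-wrapped Linear

def delta_weight(name):
    # The effective weight update contributed by one adapter.
    return layer.scaling[name] * layer.lora_B[name].weight @ layer.lora_A[name].weight

# Compare delta weights, not lora_A / lora_B individually: the sign of the
# negative merge weight can be folded into either factor, so only the product
# reveals whether the merge is correct.
expected = 1.0 * delta_weight("adapter_a") - 0.5 * delta_weight("adapter_b")
print(torch.allclose(delta_weight("merged"), expected, atol=1e-5))
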
1 parent 7f1a5c1 commit 59c8dc3

File tree

1 file changed: +0 −1 lines changed


tests/test_custom_models.py (0 additions, 1 deletion)

@@ -62,7 +62,6 @@
 )
 from peft.tuners.lora.config import BdLoraConfig
 from peft.tuners import lora
-from peft.tuners.lora.config import BdLoraConfig
 from peft.tuners.tuners_utils import BaseTunerLayer
 from peft.utils import AuxiliaryTrainingWrapper, infer_device

0 commit comments
