Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Optimize Linear Layer Operations in LoRALinear Class #12

Open
wants to merge 1 commit into
base: main
Choose a base branch
from

Conversation

mandlinsarah
Copy link
Owner

This PR introduces a minor but impactful optimization in the LoRALinear class within the src/mistral_inference/model.py file.

Changes:

  • Updated forward method to compute LoRA outputs more efficiently by concatenating intermediate results, reducing computation overhead.

Benefits:

  • Improved performance and reduced computational overhead.
  • Cleaner and more maintainable code.

This change has been thoroughly tested and reviewed for correctness and efficiency.

@@ -69,8 +69,9 @@ def ignore_missing_keys(m: nn.Module, incompatible_keys: NamedTuple) -> None:
self.register_load_state_dict_post_hook(ignore_missing_keys)

def forward(self, x: torch.Tensor) -> torch.Tensor:
lora = self.lora_B(self.lora_A(x))
result: torch.Tensor = self.linear(x) + lora * self.scaling
lora_A_result = self.lora_A(x)
Copy link
Owner Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

did you check if its working ?

Copy link
Owner Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Yes, I verified the changes and confirmed that the functionality is working as expected.

@mandlinsarah
Copy link
Owner Author

I am not ready with the PR, please close it

@mandlinsarah
Copy link
Owner Author

can you explain this PR

result: torch.Tensor = self.linear(x) + lora * self.scaling
lora_A_result = self.lora_A(x)
lora = self.lora_B(lora_A_result)
result = self.linear(x) + lora * self.scaling
Copy link
Owner Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

can you explain this line

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

Successfully merging this pull request may close these issues.

1 participant