Added offloading support for FP8 attention (#1131)
* Added offloading support for FP8 attention

  Signed-off-by: Selvaraj Anandaraj <[email protected]>

* Update transformer_engine/pytorch/attention.py

  Co-authored-by: Kirthi Shankar Sivamani <[email protected]>
  Signed-off-by: Selvaraj Anandaraj <[email protected]>

* Fix

  Signed-off-by: Kirthi Shankar Sivamani <[email protected]>

---------

Signed-off-by: Selvaraj Anandaraj <[email protected]>
Signed-off-by: Selvaraj Anandaraj <[email protected]>
Signed-off-by: Kirthi Shankar Sivamani <[email protected]>
Co-authored-by: Selvaraj Anandaraj <[email protected]>
Co-authored-by: Kirthi Shankar Sivamani <[email protected]>