Add LayerNorm support #74


Open · wants to merge 3 commits into master
Conversation

gcunhase

Issue

Converting an ONNX model that contains a LayerNormalization layer (opset >= 17) fails:

File "onnx2pytorch/onnx2pytorch/convert/operations.py", line 287, in convert_operations
    raise NotImplementedError(
NotImplementedError: Conversion not implemented for op_type=LayerNormalization.

How to repro

Run:

# 1. Export simple ONNX model with LayerNorm
import torch
from torch import nn
block = nn.LayerNorm([128])
input_tensor = torch.randn(1, 32, 128, 128)
onnx_path = "model_ln.onnx"
torch.onnx.export(
    block,
    input_tensor,
    onnx_path,
    input_names=['input'],
    output_names=['output'],
    opset_version=17
)

# 2. Convert ONNX model back to PyTorch
import onnx
import onnx2pytorch
onnx_model = onnx.load(onnx_path)
pytorch_model = onnx2pytorch.ConvertModel(onnx_model)  # -> Error occurs here

# 3. Recover ONNX model and check they're the same
torch.onnx.export(
    pytorch_model,
    input_tensor,
    onnx_path.replace(".onnx", "_recovered.onnx"),
    input_names=['input'],
    output_names=['output'],
    opset_version=17
)
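For context, a conversion for this op could look roughly like the sketch below: build an `nn.LayerNorm` from the node's `scale`/`bias` initializers and `epsilon` attribute (names per the ONNX LayerNormalization spec), then verify it against the functional form. This is a hypothetical illustration, not the code in this PR; `layer_norm_from_onnx_params` is a made-up helper name.

```python
import torch
from torch import nn

def layer_norm_from_onnx_params(scale, bias, epsilon=1e-5):
    # In ONNX, LayerNormalization normalizes over all dims from `axis` onward,
    # and `scale`'s shape covers exactly those dims, so it can serve directly
    # as nn.LayerNorm's normalized_shape.
    ln = nn.LayerNorm(tuple(scale.shape), eps=epsilon)
    with torch.no_grad():
        ln.weight.copy_(scale)
        ln.bias.copy_(bias)
    return ln

# Shapes mirror the repro above: normalize over the last dim of size 128.
scale = torch.randn(128)
bias = torch.randn(128)
ln = layer_norm_from_onnx_params(scale, bias, epsilon=1e-5)

x = torch.randn(1, 32, 128, 128)
expected = torch.nn.functional.layer_norm(x, (128,), scale, bias, 1e-5)
assert torch.allclose(ln(x), expected)
```

A real converter would also have to handle the op's optional extra outputs (mean and inverse std-dev) and negative `axis` values; the sketch covers only the common single-output case.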

@gcunhase gcunhase changed the title Add LayerNorm support Draft: Add LayerNorm support Jul 17, 2025
@gcunhase gcunhase changed the title Draft: Add LayerNorm support Add LayerNorm support Jul 17, 2025
@gcunhase (Author)

Debugging an error with the dynamic calculation in the LN op; moving this to draft for now.

@gcunhase (Author)

Fixed, ready for review.
