ONNX Runtime: cross-platform, high performance scoring engine for ML models


ONNX Runtime is a cross-platform inference and training machine-learning accelerator.

ONNX Runtime inference can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with a wide range of hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable, alongside graph optimizations and transforms. Learn more →

ONNX Runtime training can accelerate model training for transformer models on multi-node NVIDIA GPUs with a one-line addition to existing PyTorch training scripts. Learn more →
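The "one-line addition" refers to wrapping the model with `ORTModule`. The illustrative fragment below assumes the `torch-ort` package is installed alongside PyTorch and is not runnable as-is without that environment; everything else in the training loop stays unchanged.

```python
import torch
from torch_ort import ORTModule  # assumes the torch-ort package is installed

model = torch.nn.Linear(10, 1)   # hypothetical placeholder for a transformer model
model = ORTModule(model)         # the one-line change; the rest of the script is untouched
```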

Get Started & Resources

Built-in Pipeline Status

[Build-status badges, omitted here, cover the inference and training pipelines for Windows, Linux, Mac, Android, iOS, Web, and other platforms.]

This project is tested with BrowserStack.

Third-party Pipeline Status

[A third-party build-status badge, omitted here, covers a Linux inference pipeline.]

Releases

The current release and past releases can be found here: https://github.com/microsoft/onnxruntime/releases.

For details on the upcoming release, including release dates, announcements, features, and guidance on submitting feature requests, please visit the release roadmap: https://onnxruntime.ai/roadmap.

Data/Telemetry

Windows distributions of this project may collect usage data and send it to Microsoft to help improve our products and services. See the privacy statement for more details.

Contributions and Feedback

We welcome contributions! Please see the contribution guidelines.

For feature requests or bug reports, please file a GitHub Issue.

For general discussion or questions, please use GitHub Discussions.

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ, or contact [email protected] with any additional questions or comments.

License

This project is licensed under the MIT License.
