ONNX Runtime is a cross-platform inference and training machine-learning accelerator.

ONNX Runtime inference can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms. Learn more →
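
A minimal sketch of running inference through the Python API. The model file name, input shape, and dtype are placeholders; substitute whatever your exported model expects.

```python
import numpy as np
import onnxruntime as ort

# Load an ONNX model (the file name here is a placeholder).
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Query the model's expected input name rather than hard-coding it.
input_name = session.get_inputs()[0].name

# Dummy input; shape and dtype must match what the model was exported with.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference; passing None as the output list returns all model outputs.
outputs = session.run(None, {input_name: x})
print(outputs[0].shape)
```

Swapping "CPUExecutionProvider" for an accelerator-backed provider (e.g. CUDA) is how the hardware-specific optimizations mentioned above are selected at session creation time.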

ONNX Runtime training can accelerate the model training time on multi-node NVIDIA GPUs for transformer models with a one-line addition to existing PyTorch training scripts, as sketched below. Learn more →
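
The "one-line addition" refers to wrapping the model in ORTModule. A minimal sketch, assuming the onnxruntime-training package is installed; the tiny model and training loop here are stand-ins for an existing script.

```python
import torch
from onnxruntime.training.ortmodule import ORTModule  # from the onnxruntime-training package

# Any existing torch.nn.Module; this small model is a stand-in.
model = torch.nn.Sequential(
    torch.nn.Linear(784, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 10),
)

# The one-line change: forward and backward now execute through ONNX Runtime.
model = ORTModule(model)

# The rest of the training loop is unchanged PyTorch.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()
inputs, labels = torch.randn(32, 784), torch.randint(0, 10, (32,))

optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)
loss.backward()
optimizer.step()
```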

Get Started & Resources

General Information: onnxruntime.ai

Usage documentation and tutorials: onnxruntime.ai/docs

YouTube video tutorials: youtube.com/@ONNXRuntime

Upcoming Release Roadmap

Companion sample repositories:

  • ONNX Runtime Inferencing: microsoft/onnxruntime-inference-examples
  • ONNX Runtime Training: microsoft/onnxruntime-training-examples
Builtin Pipeline Status

[Build status table: per-system Inference and Training pipeline badges]

Data/Telemetry

Windows distributions of this project may collect usage data and send it to Microsoft to help improve our products and services. See the privacy statement for more details.

Contributions and Feedback

We welcome contributions! Please see the contribution guidelines.

For feature requests or bug reports, please file a GitHub Issue.

For general discussion or questions, please use GitHub Discussions.

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

License

This project is licensed under the MIT License.