We host the application MegEngine so that you can run it in our online workstations, either through Wine or directly.


Quick description of MegEngine:

MegEngine is a fast, scalable, and easy-to-use deep learning framework with three key features.

It unifies training and inference: quantization, dynamic shapes, image pre-processing, and even differentiation can all be expressed in a single model. After training, simply package everything into your model and run inference on any platform with ease. Because the same core powers both stages, speed and precision discrepancies between training and deployment won't bother you anymore.

During training, GPU memory usage can drop to as little as one-third at the cost of only one additional line of code, which enables the DTR (Dynamic Tensor Rematerialization) algorithm. For inference, the unique pushdown memory planner achieves the lowest memory usage.

NOTE: MegEngine currently supports Python installation on Linux 64-bit, Windows 64-bit, macOS 10.14+ (CPU only), and Android 7+ (CPU only), with Python 3.5 through 3.8. On Windows 10 you can either install the Linux distribution through the Windows Subsystem for Linux (WSL) or install the Windows distribution directly. Many other platforms are supported for inference.
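To illustrate the "one additional line" claim for DTR above, here is a minimal sketch, assuming a MegEngine version that ships the `megengine.dtr` module (1.5+); the `"5GB"` threshold is an illustrative value, not a recommendation:

```python
import megengine as mge

# Optional tuning: tensors larger than this threshold become
# candidates for eviction under DTR (illustrative value).
mge.dtr.eviction_threshold = "5GB"

# The single additional line: enable Dynamic Tensor Rematerialization,
# trading some recomputation time for a much smaller GPU memory footprint.
mge.dtr.enable()

# ... then build and train the model as usual; activations are
# transparently evicted and recomputed during the backward pass.
```

This is a configuration sketch rather than a full training script; check the MegEngine documentation for the exact API in your installed version.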

Features:
  • Fast, high-precision inference on x86/Arm/CUDA/ROCm
  • Supports Linux/Windows/iOS/Android/TEE
  • Unified core for both training and inference
  • Low hardware requirements, aided by memory-saving algorithms
  • Efficient inference on all platforms
  • Save more memory and optimize speed by leveraging advanced usage


Programming Language: C++.
Categories:
Frameworks, Machine Learning, Deep Learning Frameworks

©2024. Winfy. All Rights Reserved.

By OD Group OU – Registry code: 1609791 – VAT number: EE102345621.