Introduction
A safe MMDeploy Rust wrapper. This project aims to provide a Rust wrapper for MMDeploy >= 1.0.0.
News
- (2024.12.24) Bump to MMDeploy v1.1.0.
- (2022.9.29) This repo has been added to the OpenMMLab ecosystem.
- (2022.9.27) This repo has been added to the MMDeploy CI.
Prerequisites
To build this repo successfully, you should first install some prerequisite packages.
The following guidance has been tested on Ubuntu on an x86 device.
Step 0. Install Rust if you don't have it.
Step 1. Install Clang, which is required by Bindgen.
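A sketch of Steps 0 and 1 on Ubuntu, assuming rustup is an acceptable way to install Rust and that the usual apt package names apply:

```shell
# Install the Rust toolchain via rustup (non-interactive) if it is missing.
if ! command -v rustc >/dev/null 2>&1; then
    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
fi
# Clang and libclang are needed by bindgen to parse the MMDeploy C headers.
sudo apt-get update && sudo apt-get install -y clang libclang-dev
```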
Step 2. Download and install the pre-built mmdeploy package. Currently, mmdeploy-sys is built upon the pre-built packages of mmdeploy, so this repo only supports the OnnxRuntime and TensorRT backends. Don't be disappointed: a build-from-source script is in progress, and once it is finished we will be able to deploy models in Rust with every backend mmdeploy supports.
If you want to deploy models with OnnxRuntime:
# Download and link to MMDeploy-onnxruntime pre-built package
# Download and link to TensorRT engine
# !!! Download the TensorRT-8.2.3.0 CUDA 11.x tar package from NVIDIA and extract it to the current directory. This link may be helpful: https://developer.nvidia.com/nvidia-tensorrt-8x-download.
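As a sketch, the extract-and-link step might look like this; the archive name and resulting directory are assumptions based on the TensorRT 8.2.3.0 / CUDA 11.x version named above, so substitute the exact file you downloaded:

```shell
# Hypothetical archive name; use the file you actually downloaded from NVIDIA.
TRT_ARCHIVE=TensorRT-8.2.3.0.Linux.x86_64-gnu.cuda-11.4.cudnn8.2.tar.gz
if [ -f "$TRT_ARCHIVE" ]; then
    tar -xzf "$TRT_ARCHIVE"    # extracts into ./TensorRT-8.2.3.0
fi
# Point the build and the dynamic linker at the extracted engine.
export TENSORRT_DIR=$PWD/TensorRT-8.2.3.0
export LD_LIBRARY_PATH=$TENSORRT_DIR/lib:$LD_LIBRARY_PATH
```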
If you build the MMDeploy SDK from source, you should set MMDEPLOY_DIR and LD_LIBRARY_PATH as follows:
You then need to configure the paths of TensorRT, ONNXRUNTIME, CUDA, and cuDNN as follows:
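The path configuration above might look like the following; every directory here is a hypothetical install location, so adjust each one to where you actually built or extracted the corresponding package:

```shell
# Hypothetical install locations -- adjust each path to your own setup.
export MMDEPLOY_DIR=$HOME/mmdeploy-sdk                     # MMDeploy SDK root
export ONNXRUNTIME_DIR=$HOME/onnxruntime-linux-x64-1.8.1   # ONNX Runtime (assumed version)
export TENSORRT_DIR=$HOME/TensorRT-8.2.3.0                 # TensorRT engine
export CUDNN_DIR=$HOME/cudnn                               # cuDNN
# Make all runtime libraries visible to the dynamic linker.
export LD_LIBRARY_PATH=$MMDEPLOY_DIR/lib:$ONNXRUNTIME_DIR/lib:$TENSORRT_DIR/lib:$CUDNN_DIR/lib64:/usr/local/cuda/lib64:$LD_LIBRARY_PATH
```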
Step 3. (Optional) Install OpenCV, which is required by the examples.
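On Ubuntu this is typically done via apt (assuming the examples link against the system OpenCV development libraries):

```shell
# System OpenCV headers and libraries, as used by the Rust opencv crate.
sudo apt-get install -y libopencv-dev
```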
Step 4. (Optional) Download converted ONNX models from mmdeploy-converted-models.
Quickstart
Please read the previous section to make sure the required packages have been installed before using this crate.
Update your Cargo.toml
[dependencies]
mmdeploy = "1.1.0"
APIs for MM Codebases
Good news: Now, you can use Rust language to build your fantastic applications powered by MMDeploy!
Take a look by running some examples! In these examples, cpu is the default inference device. If you choose to deploy models on GPU, replace every cpu in the test commands with cuda.
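For instance, a classifier example might be invoked like this; the example name, model path, and image path are all hypothetical, so check the repo's examples/ directory for the real ones:

```shell
# Hypothetical invocation, run from the repository root.
if command -v cargo >/dev/null 2>&1 && [ -f Cargo.toml ]; then
    cargo run --example classifier -- cpu ../mmdeploy_model/resnet ./demo.jpg
    # For GPU inference, swap the device argument:
    # cargo run --example classifier -- cuda ../mmdeploy_model/resnet ./demo.jpg
fi
```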
Convert Models
You can either:
- Directly use the converted models provided here
- Or follow the MMDeploy documentation to install MMDeploy and convert the appropriate models yourself
Classifier API
Deploy image classification models converted by MMDeploy.
The example deploys a ResNet model converted by the ONNXRUNTIME target on a CPU device.
Detector API
Deploy object detection models converted by MMDeploy.
The example deploys a FasterRCNN model converted by the ONNXRUNTIME target on a CPU device.
A rendered result named output_detection.png is saved in the current directory.
Segmentor API
Deploy object segmentation models converted by MMDeploy.
The example deploys a DeepLabv3 model converted by the ONNXRUNTIME target on a CPU device.
A rendered result named output_segmentation.png is saved in the current directory.
Pose detector API
Deploy pose detection models converted by MMDeploy.
The example deploys an HRNet model converted by the ONNXRUNTIME target on a CPU device.
A rendered result named output_pose.png is saved in the current directory.
Rotated detector API
Deploy rotated detection models converted by MMDeploy.
The example deploys a RetinaNet model converted by the ONNXRUNTIME target on a CPU device.
A rendered result named output_rotated_detection.png is saved in the current directory.
OCR API
Deploy text detection and text recognition models converted by MMDeploy.
The example deploys a DBNet model for detection and a CRNN model for recognition both converted by the ONNXRUNTIME target on a CPU device.
A rendered result named output_ocr.png is saved in the current directory.
Restorer API
Deploy restorer models converted by MMDeploy.
The example deploys an EDSR model for restoration converted by the ONNXRUNTIME target on a CPU device.
A rendered result named output_restorer.png is saved in the current directory.
Support List
- Classifier
- Detector
- Segmentor
- Pose Detector
- Rotated Detector
- Text Detector
- Text Recognizer
- Restorer
TODO List
- PR for contributing a rust-mmdeploy-CI into MMDeploy
- Test with TensorRT prebuilt package
- Bump to the latest MMDeploy