Kneronnxopt is the ONNX optimizer project for Kneron hardware platforms. Its purpose is to infer shapes for all tensors in a model and to speed up both inference and compilation. Currently, we support ONNX up to opset 18.

1. Preparation

Before using the tool, activate its conda environment; the required packages are already installed there. You can activate the environment with the following command:

conda activate onnx1.13

2. Usage

The tool is under /workspace/libs/kneronnxopt. You can use the following command to run the tool:

python /workspace/libs/kneronnxopt/kneronnxopt/ -o <output_onnx_model> <input_onnx_model>

It also accepts several optional arguments.

3. Notes

This tool is still under development. If you have any questions, please feel free to contact us.

This tool automatically updates the model opset to 18. This process cannot be easily reversed, so please use other tools if you do not want to upgrade your model's opset.

If you want to cut the model, please use onnx.utils.extract_model from ONNX. Please check