Wednesday, August 8, 2018

[ONNX] Use ONNX_TF and nGraph_ONNX to do inference/prediction with ONNX model


Here I try to use a pre-trained model from the ONNX model zoo, where the models have already been converted from various deep learning frameworks. So I download the ResNet50 model from the following URL and untar it:

wget https://s3.amazonaws.com/download.onnx/models/opset_8/resnet50.tar.gz
tar -xzvf resnet50.tar.gz 
P.S.: more pre-trained ONNX models are available at https://github.com/onnx/models
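
Before running inference, it is worth sanity-checking the downloaded file with the onnx package itself. A minimal sketch (the path 'resnet50/model.onnx' simply matches the directory created by the tar command above):

import onnx

# Load the protobuf and run ONNX's built-in structural checker
model = onnx.load('resnet50/model.onnx')
onnx.checker.check_model(model)

# Print some basic metadata about the graph
print("IR version:", model.ir_version)
print("Opset version:", model.opset_import[0].version)
print("Inputs:", [i.name for i in model.graph.input])
print("Outputs:", [o.name for o in model.graph.output])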

Then I can run inference/prediction with this ONNX model in two ways:


1. Use onnx_tf:
The backend is TensorFlow, so a lot of TensorFlow log output is printed while running.

import onnx
import numpy as np
from onnx_tf.backend import prepare

# Load the ONNX model and prepare the TensorFlow backend representation
model = onnx.load('resnet50/model.onnx')
tf_rep = prepare(model)

# Load or create an image (here just a dummy all-ones tensor in NCHW layout)
picture = np.ones([1, 3, 224, 224], dtype=np.float32)
output = tf_rep.run(picture)
print("The image is classified as ", np.argmax(output))

>>> The image is classified as 512
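
The all-ones tensor above is just a placeholder input, so the predicted class (512) is meaningless. For a real photo, ResNet50 expects the usual ImageNet-style preprocessing: resize/crop to 224x224, scale to [0, 1], normalize with the ImageNet mean/std, and reorder to NCHW with a batch dimension. A rough sketch using Pillow; the file name 'cat.jpg' and the mean/std constants are my own assumptions (the commonly used ImageNet statistics), not values taken from the ONNX file:

import numpy as np
from PIL import Image

def preprocess(path):
    # Resize to 256x256 (ignoring aspect ratio for simplicity), then center-crop 224x224
    img = Image.open(path).convert('RGB')
    img = img.resize((256, 256))
    left = (256 - 224) // 2
    img = img.crop((left, left, left + 224, left + 224))

    # HWC uint8 -> CHW float32 in [0, 1]
    x = np.asarray(img, dtype=np.float32) / 255.0
    x = x.transpose(2, 0, 1)

    # Normalize with the commonly used ImageNet statistics (assumption)
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32).reshape(3, 1, 1)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32).reshape(3, 1, 1)
    x = (x - mean) / std

    # Add the batch dimension -> (1, 3, 224, 224)
    return x[np.newaxis, :]

picture = preprocess('cat.jpg')
output = tf_rep.run(picture)
print("The image is classified as ", np.argmax(output))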


2. Use ngraph_onnx:
The backend is nGraph, and in my quick test it seems faster than the TensorFlow backend.

import onnx
onnx_protobuf = onnx.load('resnet50/model.onnx')

# Import the ONNX protobuf into nGraph; the importer returns a list of model dicts
from ngraph_onnx.onnx_importer.importer import import_onnx_model
ng_models = import_onnx_model(onnx_protobuf)
print(ng_models)

import ngraph as ng

# Take the first imported model and compile a callable computation on the CPU backend
ng_model = ng_models[0]
runtime = ng.runtime(backend_name='CPU')
resnet = runtime.computation(ng_model['output'], *ng_model['inputs'])

import numpy as np

# Run a dummy all-ones image through the compiled computation
picture = np.ones([1, 3, 224, 224], dtype=np.float32)
print("The image is classified as ", np.argmax(resnet(picture)))

>>> The image is classified as 512
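
Since both backends execute the same model.onnx, a quick way to gain confidence in either of them is to compare their raw outputs on the same input. A small sketch, assuming the tf_rep object from method 1 and the resnet computation from method 2 are both available in the same Python session:

import numpy as np

picture = np.ones([1, 3, 224, 224], dtype=np.float32)

# Run the same input through both backends and flatten to 1-D score vectors
tf_out = np.asarray(tf_rep.run(picture)).squeeze()   # onnx_tf / TensorFlow
ng_out = np.asarray(resnet(picture)).squeeze()       # ngraph_onnx / nGraph

print("Same top-1 class:", np.argmax(tf_out) == np.argmax(ng_out))
print("Max absolute difference:", np.abs(tf_out - ng_out).max())
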
P.S.: for installing and using ngraph_onnx, you can also refer to these URLs:
https://ngraph.nervanasys.com/index.html/howto/import.html

https://github.com/NervanaSystems/ngraph-onnx



