.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/plot_train_convert_predict.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_plot_train_convert_predict.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_plot_train_convert_predict.py:

.. _l-logreg-example:

Train, convert and predict with ONNX Runtime
============================================

This example demonstrates an end-to-end scenario, from the training of a
machine learning model to its use in its converted form.

.. contents::
    :local:

Train a logistic regression
+++++++++++++++++++++++++++

The first step consists in retrieving the iris dataset.

.. GENERATED FROM PYTHON SOURCE LINES 23-33

.. code-block:: default

    from sklearn.datasets import load_iris

    iris = load_iris()
    X, y = iris.data, iris.target

    from sklearn.model_selection import train_test_split

    X_train, X_test, y_train, y_test = train_test_split(X, y)

.. GENERATED FROM PYTHON SOURCE LINES 34-35

Then we fit a model.

.. GENERATED FROM PYTHON SOURCE LINES 35-41

.. code-block:: default

    from sklearn.linear_model import LogisticRegression

    clr = LogisticRegression()
    clr.fit(X_train, y_train)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    LogisticRegression()
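A side note: ``train_test_split`` shuffles the data, so without a fixed
``random_state`` the split, and therefore the exact confusion matrices and
timings reported below, change from run to run. A minimal sketch of a
reproducible split (``random_state=0`` is an arbitrary choice, not part of the
original example):

```python
import numpy
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

iris = load_iris()
X, y = iris.data, iris.target

# Fixing random_state makes the split deterministic: running the same
# call twice yields exactly the same train and test arrays.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
X_train2, X_test2, _, _ = train_test_split(X, y, random_state=0)

print(numpy.array_equal(X_test, X_test2))  # True
```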
.. GENERATED FROM PYTHON SOURCE LINES 42-44

We compute the predictions on the test set and show the confusion matrix.

.. GENERATED FROM PYTHON SOURCE LINES 44-49

.. code-block:: default

    from sklearn.metrics import confusion_matrix

    pred = clr.predict(X_test)
    print(confusion_matrix(y_test, pred))

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [[15  0  0]
     [ 0 10  1]
     [ 0  0 12]]

.. GENERATED FROM PYTHON SOURCE LINES 50-56

Conversion to ONNX format
+++++++++++++++++++++++++

We use the module `sklearn-onnx <https://onnx.ai/sklearn-onnx/>`_
to convert the model into ONNX format.

.. GENERATED FROM PYTHON SOURCE LINES 56-65

.. code-block:: default

    from skl2onnx import convert_sklearn
    from skl2onnx.common.data_types import FloatTensorType

    initial_type = [("float_input", FloatTensorType([None, 4]))]
    onx = convert_sklearn(clr, initial_types=initial_type)
    with open("logreg_iris.onnx", "wb") as f:
        f.write(onx.SerializeToString())

.. GENERATED FROM PYTHON SOURCE LINES 66-68

We load the model with ONNX Runtime and look at its input and output.

.. GENERATED FROM PYTHON SOURCE LINES 68-76

.. code-block:: default

    import onnxruntime as rt

    sess = rt.InferenceSession("logreg_iris.onnx", providers=rt.get_available_providers())

    print("input name='{}' and shape={}".format(sess.get_inputs()[0].name, sess.get_inputs()[0].shape))
    print("output name='{}' and shape={}".format(sess.get_outputs()[0].name, sess.get_outputs()[0].shape))

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    input name='float_input' and shape=[None, 4]
    output name='output_label' and shape=[None]

.. GENERATED FROM PYTHON SOURCE LINES 77-78

We compute the predictions.

.. GENERATED FROM PYTHON SOURCE LINES 78-87

.. code-block:: default

    input_name = sess.get_inputs()[0].name
    label_name = sess.get_outputs()[0].name

    import numpy

    pred_onx = sess.run([label_name], {input_name: X_test.astype(numpy.float32)})[0]
    print(confusion_matrix(pred, pred_onx))

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [[15  0  0]
     [ 0 10  0]
     [ 0  0 13]]
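Comparing ``pred`` and ``pred_onx`` through a confusion matrix is a compact
equality check: all the mass sits on the diagonal exactly when the two label
vectors agree everywhere. A small self-contained illustration, with made-up
label vectors rather than the iris predictions above:

```python
import numpy
from sklearn.metrics import confusion_matrix

# Two made-up label vectors, identical on purpose.
a = numpy.array([0, 1, 2, 2, 1, 0])
b = a.copy()

cm = confusion_matrix(a, b)
# Zero off-diagonal mass is equivalent to the vectors agreeing everywhere.
off_diag = cm.sum() - numpy.trace(cm)
print(off_diag)  # 0
```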
.. GENERATED FROM PYTHON SOURCE LINES 88-97

The predictions are perfectly identical.

Probabilities
+++++++++++++

Probabilities are needed to compute other relevant metrics such as the
ROC curve. Let's see how to get them first with scikit-learn.

.. GENERATED FROM PYTHON SOURCE LINES 97-101

.. code-block:: default

    prob_sklearn = clr.predict_proba(X_test)
    print(prob_sklearn[:3])

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [[3.34857930e-04 1.75161550e-01 8.24503592e-01]
     [2.10495002e-02 9.19659332e-01 5.92911677e-02]
     [9.74472714e-01 2.55271927e-02 9.31101356e-08]]

.. GENERATED FROM PYTHON SOURCE LINES 102-104

And then with ONNX Runtime, which returns the probabilities as a list of
dictionaries, one per sample. The values appear to be very close to
scikit-learn's.

.. GENERATED FROM PYTHON SOURCE LINES 104-112

.. code-block:: default

    prob_name = sess.get_outputs()[1].name
    prob_rt = sess.run([prob_name], {input_name: X_test.astype(numpy.float32)})[0]

    import pprint

    pprint.pprint(prob_rt[0:3])

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [{0: 0.00033485802123323083, 1: 0.1751614362001419, 2: 0.8245037198066711},
     {0: 0.02104950323700905, 1: 0.9196593165397644, 2: 0.05929117649793625},
     {0: 0.97447270154953, 1: 0.02552717924118042, 2: 9.311015247703835e-08}]

.. GENERATED FROM PYTHON SOURCE LINES 113-114

Let's benchmark.

.. GENERATED FROM PYTHON SOURCE LINES 114-132

.. code-block:: default

    from timeit import Timer


    def speed(inst, number=10, repeat=20):
        timer = Timer(inst, globals=globals())
        raw = numpy.array(timer.repeat(repeat, number=number))
        ave = raw.sum() / len(raw) / number
        mi, ma = raw.min() / number, raw.max() / number
        print("Average %1.3g min=%1.3g max=%1.3g" % (ave, mi, ma))
        return ave


    print("Execution time for clr.predict")
    speed("clr.predict(X_test)")

    print("Execution time for ONNX Runtime")
    speed("sess.run([label_name], {input_name: X_test.astype(numpy.float32)})[0]")

.. rst-class:: sphx-glr-script-out
.. code-block:: none

    Execution time for clr.predict
    Average 4.79e-05 min=4.42e-05 max=7.36e-05
    Execution time for ONNX Runtime
    Average 2.24e-05 min=2.16e-05 max=2.83e-05

    2.244237500065083e-05

.. GENERATED FROM PYTHON SOURCE LINES 133-136

Let's benchmark a scenario similar to what a webservice experiences:
the model has to do one prediction at a time as opposed to a batch of
predictions.

.. GENERATED FROM PYTHON SOURCE LINES 136-158

.. code-block:: default

    def loop(X_test, fct, n=None):
        nrow = X_test.shape[0]
        if n is None:
            n = nrow
        for i in range(0, n):
            im = i % nrow
            fct(X_test[im : im + 1])


    print("Execution time for clr.predict")
    speed("loop(X_test, clr.predict, 100)")


    def sess_predict(x):
        return sess.run([label_name], {input_name: x.astype(numpy.float32)})[0]


    print("Execution time for sess_predict")
    speed("loop(X_test, sess_predict, 100)")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Execution time for clr.predict
    Average 0.00442 min=0.0043 max=0.00603
    Execution time for sess_predict
    Average 0.00104 min=0.00103 max=0.00108

    0.0010441262300000176

.. GENERATED FROM PYTHON SOURCE LINES 159-160

Let's do the same for the probabilities.

.. GENERATED FROM PYTHON SOURCE LINES 160-172

.. code-block:: default

    print("Execution time for predict_proba")
    speed("loop(X_test, clr.predict_proba, 100)")


    def sess_predict_proba(x):
        return sess.run([prob_name], {input_name: x.astype(numpy.float32)})[0]


    print("Execution time for sess_predict_proba")
    speed("loop(X_test, sess_predict_proba, 100)")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Execution time for predict_proba
    Average 0.00643 min=0.0064 max=0.00664
    Execution time for sess_predict_proba
    Average 0.00111 min=0.00109 max=0.00113

    0.0011136213500003577

.. GENERATED FROM PYTHON SOURCE LINES 173-177

This second comparison is fairer since, in this experiment, ONNX Runtime
computes both the label and the probabilities in every case.
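Note that the probabilities ONNX Runtime returned above are a list of
dictionaries mapping class index to probability, produced by the ``ZipMap``
operator that sklearn-onnx appends to classifiers by default. To feed them
into metrics such as ``sklearn.metrics.roc_curve``, it is convenient to turn
them back into a 2D array. A minimal numpy sketch, using stand-in values
(rounded from the output above) in place of ``prob_rt``:

```python
import numpy

# Stand-in for the first rows of prob_rt returned by sess.run.
prob_rt = [
    {0: 0.000335, 1: 0.175161, 2: 0.824504},
    {0: 0.021050, 1: 0.919659, 2: 0.059291},
    {0: 0.974473, 1: 0.025527, 2: 0.000000},
]

# One row per sample, one column per class, columns ordered by class index.
prob_array = numpy.array([[d[k] for k in sorted(d)] for d in prob_rt])
print(prob_array.shape)  # (3, 3)
```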
.. GENERATED FROM PYTHON SOURCE LINES 179-183

Benchmark with RandomForest
+++++++++++++++++++++++++++

We first train and save a model in ONNX format.

.. GENERATED FROM PYTHON SOURCE LINES 183-193

.. code-block:: default

    from sklearn.ensemble import RandomForestClassifier

    rf = RandomForestClassifier()
    rf.fit(X_train, y_train)

    initial_type = [("float_input", FloatTensorType([1, 4]))]
    onx = convert_sklearn(rf, initial_types=initial_type)
    with open("rf_iris.onnx", "wb") as f:
        f.write(onx.SerializeToString())

.. GENERATED FROM PYTHON SOURCE LINES 194-195

We compare.

.. GENERATED FROM PYTHON SOURCE LINES 195-209

.. code-block:: default

    sess = rt.InferenceSession("rf_iris.onnx", providers=rt.get_available_providers())


    def sess_predict_proba_rf(x):
        return sess.run([prob_name], {input_name: x.astype(numpy.float32)})[0]


    print("Execution time for predict_proba")
    speed("loop(X_test, rf.predict_proba, 100)")

    print("Execution time for sess_predict_proba")
    speed("loop(X_test, sess_predict_proba_rf, 100)")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Execution time for predict_proba
    Average 0.699 min=0.697 max=0.702
    Execution time for sess_predict_proba
    Average 0.00134 min=0.00131 max=0.00154

    0.0013375981050003816

.. GENERATED FROM PYTHON SOURCE LINES 210-211

Let's see with different numbers of trees.

.. GENERATED FROM PYTHON SOURCE LINES 211-240
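As a rough model (an assumption, to be checked against the measurements),
scikit-learn's per-call cost should grow linearly with the number of trees,
while ONNX Runtime's stays dominated by a fixed overhead. Plugging in the
averages this example reports for 5 to 50 trees (each timing covers 100
single-row calls), a least-squares fit puts scikit-learn's growth at about
6.8 ms per additional tree, while the fitted ONNX Runtime slope is negligible:

```python
import numpy

# Averages reported by this example for n_trees = 5, 10, ..., 50
# (seconds per run of 100 single-row predict_proba calls).
n_trees = numpy.arange(5, 51, 5)
sklearn_avg = numpy.array([0.0507, 0.0849, 0.119, 0.153, 0.187,
                           0.221, 0.255, 0.289, 0.325, 0.357])
rt_avg = numpy.array([0.00105, 0.00107, 0.00108, 0.0011, 0.00109,
                      0.00112, 0.00111, 0.00113, 0.00114, 0.00117])

# Least-squares line fit: slope = extra seconds per additional tree.
slope, intercept = numpy.polyfit(n_trees, sklearn_avg, 1)
print(round(slope, 4))  # 0.0068
```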
.. code-block:: default

    measures = []

    for n_trees in range(5, 51, 5):
        print(n_trees)
        rf = RandomForestClassifier(n_estimators=n_trees)
        rf.fit(X_train, y_train)
        initial_type = [("float_input", FloatTensorType([1, 4]))]
        onx = convert_sklearn(rf, initial_types=initial_type)
        with open("rf_iris_%d.onnx" % n_trees, "wb") as f:
            f.write(onx.SerializeToString())
        sess = rt.InferenceSession("rf_iris_%d.onnx" % n_trees, providers=rt.get_available_providers())

        def sess_predict_proba_loop(x):
            return sess.run([prob_name], {input_name: x.astype(numpy.float32)})[0]

        tsk = speed("loop(X_test, rf.predict_proba, 100)", number=5, repeat=5)
        trt = speed("loop(X_test, sess_predict_proba_loop, 100)", number=5, repeat=5)

        measures.append({"n_trees": n_trees, "sklearn": tsk, "rt": trt})

    from pandas import DataFrame

    df = DataFrame(measures)
    ax = df.plot(x="n_trees", y="sklearn", label="scikit-learn", c="blue", logy=True)
    df.plot(x="n_trees", y="rt", label="onnxruntime", ax=ax, c="green", logy=True)
    ax.set_xlabel("Number of trees")
    ax.set_ylabel("Prediction time (s)")
    ax.set_title("Speed comparison between scikit-learn and ONNX Runtime\nFor a random forest on Iris dataset")
    ax.legend()

.. image-sg:: /auto_examples/images/sphx_glr_plot_train_convert_predict_001.png
    :alt: Speed comparison between scikit-learn and ONNX Runtime For a random forest on Iris dataset
    :srcset: /auto_examples/images/sphx_glr_plot_train_convert_predict_001.png
    :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out
.. code-block:: none

    5
    Average 0.0507 min=0.0504 max=0.0513
    Average 0.00105 min=0.00104 max=0.00107
    10
    Average 0.0849 min=0.0848 max=0.0849
    Average 0.00107 min=0.00107 max=0.00109
    15
    Average 0.119 min=0.119 max=0.119
    Average 0.00108 min=0.00107 max=0.0011
    20
    Average 0.153 min=0.153 max=0.153
    Average 0.0011 min=0.00109 max=0.00112
    25
    Average 0.187 min=0.187 max=0.187
    Average 0.00109 min=0.00108 max=0.00111
    30
    Average 0.221 min=0.221 max=0.221
    Average 0.00112 min=0.00111 max=0.00115
    35
    Average 0.255 min=0.255 max=0.256
    Average 0.00111 min=0.0011 max=0.00113
    40
    Average 0.289 min=0.289 max=0.289
    Average 0.00113 min=0.00112 max=0.00116
    45
    Average 0.325 min=0.323 max=0.33
    Average 0.00114 min=0.00113 max=0.00117
    50
    Average 0.357 min=0.357 max=0.357
    Average 0.00117 min=0.00115 max=0.0012

.. rst-class:: sphx-glr-timing

**Total running time of the script:** ( 3 minutes 15.008 seconds)

.. _sphx_glr_download_auto_examples_plot_train_convert_predict.py:

.. only:: html

    .. container:: sphx-glr-footer sphx-glr-footer-example

        .. container:: sphx-glr-download sphx-glr-download-python

            :download:`Download Python source code: plot_train_convert_predict.py <plot_train_convert_predict.py>`

        .. container:: sphx-glr-download sphx-glr-download-jupyter

            :download:`Download Jupyter notebook: plot_train_convert_predict.ipynb <plot_train_convert_predict.ipynb>`

.. only:: html

    .. rst-class:: sphx-glr-signature

        `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_