Runtimes for ONNX#

Python Runtime = ‘python’#

This module implements a Python runtime for ONNX. It is constantly a work in progress. It was started to facilitate the implementation of scikit-learn converters in sklearn-onnx. The main class is OnnxInference.

<<<

import numpy
from sklearn.linear_model import LinearRegression
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from mlprodict.onnxrt import OnnxInference
from mlprodict.onnx_conv import to_onnx

iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, _ = train_test_split(X, y)
clr = LinearRegression()
clr.fit(X_train, y_train)

# predictions with scikit-learn
exp = clr.predict(X_test[:5])
print(exp)

# predictions with the ONNX python runtime
model_def = to_onnx(clr, X_train.astype(numpy.float32),
                    target_opset=12)
oinf = OnnxInference(model_def)
y = oinf.run({'X': X_test[:5].astype(numpy.float32)})
print(y)

>>>

    [ 0.038 -0.03   1.124  1.988  0.184]
    {'variable': array([[ 0.038],
           [-0.03 ],
           [ 1.124],
           [ 1.988],
           [ 0.184]])}
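
Because this runtime executes every node in Python, intermediate results can be inspected while the graph runs. A minimal sketch, reusing oinf and X_test from the snippet above and assuming OnnxInference.run accepts the verbose and fLOG arguments:

<<<

# verbose mode is assumed to print every intermediate result
# computed by the python runtime while it executes the graph
oinf.run({'X': X_test[:5].astype(numpy.float32)}, verbose=1, fLOG=print)

>>>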

Some ONNX operators used by the converters were not available in older versions of ONNX. Each ONNX release defines an opset number: ONNX 1.4.0 is opset 9, ONNX 1.5.0 is opset 10… (a short snippet after the list below shows how to read the opsets declared by a converted model). The next table shows which operator is available in which opset. An empty cell means it is not available. Other cells contain concatenated flags whose meaning is the following:

  • ERROR means the automated process failed to return an appropriate status or the runtime produces predictions too far from the original predictions; the second part of the constant gives an approximate diagnostic and the last column gives the exception message,

  • OK: the converter works fine and the runtime produces predictions almost equal to the original predictions, the relative difference is below 1e-5,

  • e<%f: the converter works fine and the runtime produces predictions close to the original predictions, the relative difference is below the threshold,

  • i/j: the model was converted for a specific opset but the converted ONNX is compatible with a smaller opset, i is the smallest compatible opset for the main domain, j is the smallest compatible opset for the ai.onnx.ml domain.
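
As a quick illustration of opsets, the snippet below reads the opsets a converted model declares through its opset_import field. It is only a sketch: the estimator and target_opset mirror the first example above and are not part of the generated report.

<<<

import numpy
from sklearn.datasets import load_iris
from sklearn.linear_model import LinearRegression
from mlprodict.onnx_conv import to_onnx

X, y = load_iris(return_X_y=True)
clr = LinearRegression().fit(X, y)
model_def = to_onnx(clr, X.astype(numpy.float32), target_opset=12)

# every ONNX model declares the opset it requires for each domain
for op in model_def.opset_import:
    print(op.domain or 'ai.onnx', op.version)

>>>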

The models are tested on simple problems built from the Iris dataset. The dataset is split into train and test datasets. Function find_suitable_problem gives the list of problems every scikit-learn model is tested on. The main ones are the following:

  • b-cl: binary classification,

  • m-cl: multi-class classification,

  • reg: regression,

  • cluster: clustering,

  • outlier: outlier detection,

  • num-tr: no label, only numerical features

The full list is given by find_suitable_problem, as shown below. The next table tracks what is available, what is working, and gives some indication about the cause of the error when it does not work.
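
A minimal sketch of how find_suitable_problem can be queried for a single estimator, assuming it is importable from mlprodict.onnxrt.validate.validate_problems:

<<<

from sklearn.linear_model import LogisticRegression
# assumed import path for find_suitable_problem
from mlprodict.onnxrt.validate.validate_problems import find_suitable_problem

# list of problems (b-cl, m-cl, ...) this estimator is validated on
print(find_suitable_problem(LogisticRegression))

>>>

The report itself is generated with enumerate_validated_operator_opsets and summarized with summary_report: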

<<<

from logging import getLogger
from pyquickhelper.loghelper import noLOG
from pandas import DataFrame
from pyquickhelper.pandashelper import df2rst
from sklearn.exceptions import ConvergenceWarning
from sklearn.utils._testing import ignore_warnings
from mlprodict.onnxrt.validate import enumerate_validated_operator_opsets, summary_report


@ignore_warnings(category=(UserWarning, ConvergenceWarning, RuntimeWarning, FutureWarning))
def build_table():
    logger = getLogger('skl2onnx')
    logger.disabled = True
    rows = list(enumerate_validated_operator_opsets(
        0, debug=None, fLOG=noLOG,
        models=['LinearRegression', 'LogisticRegression'],
        benchmark=True))
    df = DataFrame(rows)
    piv = summary_report(df)

    if "ERROR-msg" in piv.columns:
        def shorten(text):
            text = str(text)
            if len(text) > 75:
                text = text[:75] + "..."
            return text

        piv["ERROR-msg"] = piv["ERROR-msg"].apply(shorten)

    print(df2rst(piv, number_format=2,
                 replacements={'nan': '', 'ERR: 4convert': ''}))


build_table()

>>>

name | problem | scenario | optim | method_name | output_index | conv_options | inst | n_features | runtime | skl_version | skl_nop | skl_ncoef | skl_nlin | onx_size | onx_nnodes | onx_ninits | onx_producer_name | onx_producer_version | onx_ai.onnx.ml | onx_size_optim | onx_nnodes_optim | onx_ninits_optim | onx_op_Reshape | onx_op_Cast | onx_op_ZipMap | opset15 | RT/SKL-N=1 | N=10 | N=100 | N=1000 | N=10000 | RT/SKL-N=1-min | RT/SKL-N=1-max | N=10-min | N=10-max | N=100-min | N=100-max | N=1000-min | N=1000-max | N=10000-min | N=10000-max
LinearRegression | b-reg | default | | predict | 0 | {} | null | 4 | python | 1.0.2 | 1 | 4 | 1 | 259 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 259 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | 0.41 | 0.4 | 0.41 | 0.44 | 0.52 | 0.4 | 0.43 | 0.38 | 0.43 | 0.38 | 0.44 | 0.39 | 0.48 | 0.42 | 0.63
LinearRegression | m-reg | default | | predict | 0 | {} | null | 4 | python | 1.0.2 | 1 | 2 | 1 | 301 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 301 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | 0.4 | 0.42 | 0.43 | 0.53 | 0.8 | 0.38 | 0.43 | 0.39 | 0.45 | 0.4 | 0.47 | 0.46 | 0.57 | 0.77 | 0.83
LinearRegression | ~b-reg-64 | default | | predict | 0 | {} | null | 4 | python | 1.0.2 | 1 | 4 | 1 | 365 | 3 | 3 | skl2onnx | 1.11.1 | -1 | 365 | 3 | 3 | 1 | -1 | -1 | OK 13/ | 1.1 | 1.1 | 1.1 | 1.1 | 0.98 | 1.1 | 1.1 | 1.1 | 1.1 | 1 | 1.2 | 0.97 | 1.2 | 0.83 | 1.2
LinearRegression | ~m-reg-64 | default | | predict | 0 | {} | null | 4 | python | 1.0.2 | 1 | 2 | 1 | 405 | 3 | 3 | skl2onnx | 1.11.1 | -1 | 405 | 3 | 3 | 1 | -1 | -1 | OK 13/ | 1.1 | 1.1 | 1.1 | 1 | 1 | 1 | 1.1 | 1 | 1.1 | 0.98 | 1.2 | 0.97 | 1.1 | 0.94 | 1.1
LogisticRegression | b-cl | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 652 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 652 | 4 | 0 | -1 | 1 | 1 | OK 9/1 | 0.75 | 0.8 | 0.83 | 1.1 | 1.6 | 0.7 | 0.79 | 0.75 | 0.83 | 0.77 | 0.89 | 1 | 1.2 | 1.6 | 1.7
LogisticRegression | b-cl | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 496 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 496 | 2 | 0 | -1 | -1 | -1 | OK 15/1 | 0.6 | 0.64 | 0.68 | 0.99 | 1.6 | 0.55 | 0.63 | 0.6 | 0.69 | 0.63 | 0.73 | 0.93 | 1 | 1.6 | 1.6
LogisticRegression | b-cl | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 652 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 652 | 4 | 0 | -1 | 1 | 1 | OK 9/1
LogisticRegression | b-cl | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 496 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 496 | 2 | 0 | -1 | -1 | -1 | OK 15/1
LogisticRegression | m-cl | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 3 | 1 | 681 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 681 | 4 | 0 | -1 | 1 | 1 | OK 9/1 | 0.81 | 0.88 | 0.73 | 0.63 | 0.71 | 0.69 | 0.93 | 0.65 | 0.99 | 0.61 | 0.96 | 0.6 | 0.68 | 0.7 | 0.72
LogisticRegression | m-cl | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 3 | 1 | 523 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 523 | 2 | 0 | -1 | -1 | -1 | OK 15/1 | 0.65 | 0.67 | 0.58 | 0.57 | 0.68 | 0.54 | 0.73 | 0.52 | 0.83 | 0.5 | 0.73 | 0.54 | 0.61 | 0.67 | 0.69
LogisticRegression | m-cl | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 3 | 1 | 681 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 681 | 4 | 0 | -1 | 1 | 1 | OK 9/1
LogisticRegression | m-cl | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 3 | 1 | 523 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 523 | 2 | 0 | -1 | -1 | -1 | OK 15/1
LogisticRegression | ~b-cl-64 | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 1161 | 13 | 5 | skl2onnx | 1.11.1 | 1 | 1161 | 13 | 5 | 1 | 3 | 1 | OK 13/1 | 2.5 | 2.6 | 2.8 | 3.3 | 5.1 | 2.2 | 2.7 | 2.3 | 2.9 | 2.3 | 3.1 | 2.9 | 3.5 | 5 | 5.1
LogisticRegression | ~b-cl-64 | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 1004 | 11 | 5 | skl2onnx | 1.11.1 | 1 | 1004 | 11 | 5 | 1 | 2 | -1 | OK 13/1 | 2.2 | 2.4 | 2.5 | 3.1 | 5.1 | 2 | 2.6 | 2.1 | 2.8 | 2.1 | 3 | 2.8 | 3.3 | 5 | 5.2
LogisticRegression | ~b-cl-64 | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 1161 | 13 | 5 | skl2onnx | 1.11.1 | 1 | 1161 | 13 | 5 | 1 | 3 | 1 | OK 13/1
LogisticRegression | ~b-cl-64 | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 1004 | 11 | 5 | skl2onnx | 1.11.1 | 1 | 1004 | 11 | 5 | 1 | 2 | -1 | OK 13/1
LogisticRegression | ~b-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | decision_function | 1 | {"LogisticRegression": {"raw_scores": true, "zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 1 | 1 | 399 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 399 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | 0.49 | 0.51 | 0.52 | 0.72 | 0.49 | 0.47 | 0.53 | 0.43 | 0.55 | 0.47 | 0.58 | 0.68 | 0.79 | 0.35 | 0.76
LogisticRegression | ~m-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | decision_function | 1 | {"LogisticRegression": {"raw_scores": true, "zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | python | 1.0.2 | 1 | 3 | 1 | 426 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 426 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | 0.5 | 0.51 | 0.53 | 0.64 | 0.83 | 0.49 | 0.52 | 0.48 | 0.54 | 0.47 | 0.61 | 0.59 | 0.69 | 0.8 | 0.86

Full results are available at l-onnx-bench-python.

python_compiled#

This runtime is almost the same as the previous one, but it creates and compiles a dedicated function which calls every node of the graph. Graph execution is faster, but it is not possible to look into every intermediate node anymore.

<<<

import numpy
from sklearn.ensemble import AdaBoostRegressor
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from mlprodict.onnxrt import OnnxInference
from mlprodict.onnx_conv import to_onnx

iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, _ = train_test_split(X, y)
clr = AdaBoostRegressor(n_estimators=5)
clr.fit(X_train, y_train)
model_def = to_onnx(clr, X_train.astype(numpy.float32),
                    target_opset=12)
oinf = OnnxInference(model_def, runtime="python_compiled")
print(oinf)

>>>

    OnnxInference(...)
        def compiled_run(dict_inputs, yield_ops=None):
            if yield_ops is not None:
                raise NotImplementedError('yields_ops should be None.')
            # init: axis_name (axis_name)
            # init: estimators_weights (estimators_weights)
            # init: half_scalar (half_scalar)
            # init: k_value (k_value)
            # init: last_index (last_index)
            # init: negate (negate)
            # init: shape_tensor (shape_tensor)
            # inputs
            X = dict_inputs['X']
            (est_label_3, ) = n0_treeensembleregressor_1(X)
            (est_label_0, ) = n1_treeensembleregressor_1(X)
            (est_label_2, ) = n2_treeensembleregressor_1(X)
            (est_label_4, ) = n3_treeensembleregressor_1(X)
            (est_label_1, ) = n4_treeensembleregressor_1(X)
            (concatenated_labels, ) = n5_concat(est_label_0, est_label_1, est_label_2, est_label_3, est_label_4)
            (negated_labels, ) = n6_mul(concatenated_labels, negate)
            (sorted_values, sorted_indices, ) = n7_topk_11(negated_labels, k_value)
            (array_feat_extractor_output, ) = n8_arrayfeatureextractor(estimators_weights, sorted_indices)
            (reshaped_weights, ) = n9_reshape_5(array_feat_extractor_output, shape_tensor)
            (weights_cdf, ) = n10_cumsum(reshaped_weights, axis_name)
            (median_value, ) = n11_arrayfeatureextractor(weights_cdf, last_index)
            (comp_value, ) = n12_mul(median_value, half_scalar)
            (median_or_above, ) = n13_less(weights_cdf, comp_value)
            (cast_result, ) = n14_cast(median_or_above)
            (median_idx, ) = n15_argmin_12(cast_result)
            (median_estimators, ) = n16_gatherelements(sorted_indices, median_idx)
            (variable, ) = n17_gatherelements(concatenated_labels, median_estimators)
            return {
                'variable': variable,
            }
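
Predictions are obtained with the same API as the default runtime; only the intermediate results are no longer accessible. A minimal sketch, assuming oinf and X_test from the snippet above are still defined:

<<<

# same call as with runtime='python', only the execution path differs
print(oinf.run({'X': X_test[:5].astype(numpy.float32)}))

>>>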

onnxruntime1#

onnxruntime loads the ONNX data in a single session and calls it only once to compute the predictions. We create a table similar to the one for Python Runtime = ‘python’.
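
Selecting this runtime only changes the runtime argument given to OnnxInference. A minimal sketch, assuming model_def and X_test come from the first example (runtime='onnxruntime2', described later, is selected the same way):

<<<

from mlprodict.onnxrt import OnnxInference

# delegate the whole graph to a single onnxruntime InferenceSession
oinf = OnnxInference(model_def, runtime="onnxruntime1")
print(oinf.run({'X': X_test[:5].astype(numpy.float32)}))

>>>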

<<<

from logging import getLogger
from pyquickhelper.loghelper import noLOG
from pandas import DataFrame
from pyquickhelper.pandashelper import df2rst
from sklearn.exceptions import ConvergenceWarning
from sklearn.utils._testing import ignore_warnings
from mlprodict.onnxrt.validate import enumerate_validated_operator_opsets, summary_report


@ignore_warnings(category=(UserWarning, ConvergenceWarning, RuntimeWarning, FutureWarning))
def build_table():
    logger = getLogger('skl2onnx')
    logger.disabled = True
    rows = list(enumerate_validated_operator_opsets(
        0, debug=None, fLOG=noLOG, runtime='onnxruntime1',
        models=['LinearRegression', 'LogisticRegression'],
        benchmark=True))
    df = DataFrame(rows)
    piv = summary_report(df)

    if "ERROR-msg" in piv.columns:
        def shorten(text):
            text = str(text)
            if len(text) > 75:
                text = text[:75] + "..."
            return text

        piv["ERROR-msg"] = piv["ERROR-msg"].apply(shorten)

    print(df2rst(piv, number_format=2,
                 replacements={'nan': '', 'ERR: 4convert': ''}))


build_table()

>>>

name | problem | scenario | optim | method_name | output_index | conv_options | inst | n_features | runtime | skl_version | skl_nop | skl_ncoef | skl_nlin | onx_size | onx_nnodes | onx_ninits | onx_producer_name | onx_producer_version | onx_ai.onnx.ml | onx_size_optim | onx_nnodes_optim | onx_ninits_optim | onx_op_Reshape | onx_op_Cast | onx_op_ZipMap | opset15 | ERROR-msg | RT/SKL-N=1 | N=10 | N=100 | N=1000 | N=10000 | RT/SKL-N=1-min | RT/SKL-N=1-max | N=10-min | N=10-max | N=100-min | N=100-max | N=1000-min | N=1000-max | N=10000-min | N=10000-max
LinearRegression | b-reg | default | | predict | 0 | {} | null | 4 | onnxruntime1 | 1.0.2 | 1 | 4 | 1 | 259 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 259 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.8 | 0.79 | 0.81 | 0.91 | 1.9 | 0.68 | 0.92 | 0.66 | 1.1 | 0.73 | 0.88 | 0.85 | 1 | 1.7 | 2.1
LinearRegression | m-reg | default | | predict | 0 | {} | null | 4 | onnxruntime1 | 1.0.2 | 1 | 2 | 1 | 301 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 301 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.79 | 0.78 | 0.78 | 0.81 | 0.95 | 0.74 | 0.89 | 0.72 | 1.1 | 0.71 | 0.87 | 0.74 | 0.92 | 0.91 | 0.99
LinearRegression | ~b-reg-64 | default | | predict | 0 | {} | null | 4 | onnxruntime1 | 1.0.2 | 1 | 4 | 1 | 365 | 3 | 3 | skl2onnx | 1.11.1 | -1 | 365 | 3 | 3 | 1 | -1 | -1 | OK 13/ | | 1.7 | 1.6 | 1.7 | 2.1 | 5.1 | 1.5 | 2 | 1.5 | 2.2 | 1.5 | 1.8 | 2 | 2.2 | 4.7 | 5.7
LinearRegression | ~m-reg-64 | default | | predict | 0 | {} | null | 4 | onnxruntime1 | 1.0.2 | 1 | 2 | 1 | 405 | 3 | 3 | skl2onnx | 1.11.1 | -1 | 405 | 3 | 3 | 1 | -1 | -1 | OK 13/ | | 1.6 | 1.6 | 1.6 | 1.7 | 1.7 | 1.5 | 1.8 | 1.5 | 2.3 | 1.5 | 1.7 | 1.5 | 1.8 | 1.7 | 1.7
LogisticRegression | b-cl | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 652 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 652 | 4 | 0 | -1 | 1 | 1 | OK 9/1 | | 1.2 | 1.2 | 1.5 | 5.1 | 11 | 1.1 | 1.4 | 1.1 | 1.2 | 1.3 | 1.7 | 4.9 | 5.4 | 11 | 11
LogisticRegression | b-cl | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 496 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 496 | 2 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.71 | 0.73 | 0.81 | 0.81 | 0.96 | 0.67 | 0.85 | 0.69 | 0.74 | 0.63 | 1.7 | 0.77 | 0.87 | 0.92 | 1
LogisticRegression | b-cl | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 652 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 652 | 4 | 0 | -1 | 1 | 1 | OK 9/1
LogisticRegression | b-cl | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 496 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 496 | 2 | 0 | -1 | -1 | -1 | OK 15/1
LogisticRegression | m-cl | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 3 | 1 | 681 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 681 | 4 | 0 | -1 | 1 | 1 | OK 9/1 | | 1.2 | 1.2 | 1.6 | 3.7 | 4.7 | 1.2 | 1.5 | 1.2 | 1.3 | 1.5 | 1.7 | 3.6 | 3.9 | 4.7 | 4.7
LogisticRegression | m-cl | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 3 | 1 | 523 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 523 | 2 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.76 | 0.73 | 0.78 | 0.51 | 0.39 | 0.72 | 0.87 | 0.69 | 0.76 | 0.64 | 1.5 | 0.48 | 0.53 | 0.38 | 0.39
LogisticRegression | m-cl | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 3 | 1 | 681 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 681 | 4 | 0 | -1 | 1 | 1 | OK 9/1
LogisticRegression | m-cl | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 3 | 1 | 523 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 523 | 2 | 0 | -1 | -1 | -1 | OK 15/1
LogisticRegression | ~b-cl-64 | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 1161 | 13 | 5 | skl2onnx | 1.11.1 | 1 | 1161 | 13 | 5 | 1 | 3 | 1 | ERR: 5ort_load | Unable to create InferenceSession due to '[ONNXRuntimeError] : 10 : INVALID…
LogisticRegression | ~b-cl-64 | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 1004 | 11 | 5 | skl2onnx | 1.11.1 | 1 | 1004 | 11 | 5 | 1 | 2 | -1 | OK 13/1 | | 2.9 | 2.9 | 2.9 | 3.1 | 3.2 | 2.6 | 3.3 | 2.7 | 4.4 | 2.7 | 3 | 3 | 3.3 | 2.9 | 3.5
LogisticRegression | ~b-cl-64 | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 1161 | 13 | 5 | skl2onnx | 1.11.1 | 1 | 1161 | 13 | 5 | 1 | 3 | 1 | ERR: 5ort_load | Unable to create InferenceSession due to '[ONNXRuntimeError] : 10 : INVALID…
LogisticRegression | ~b-cl-64 | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 1004 | 11 | 5 | skl2onnx | 1.11.1 | 1 | 1004 | 11 | 5 | 1 | 2 | -1 | OK 13/1
LogisticRegression | ~b-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | decision_function | 1 | {"LogisticRegression": {"raw_scores": true, "zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 1 | 1 | 399 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 399 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.74 | 0.75 | 0.76 | 0.95 | 0.66 | 0.71 | 0.8 | 0.71 | 0.95 | 0.71 | 0.82 | 0.87 | 1 | 0.52 | 0.86
LogisticRegression | ~m-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | decision_function | 1 | {"LogisticRegression": {"raw_scores": true, "zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime1 | 1.0.2 | 1 | 3 | 1 | 426 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 426 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.75 | 0.74 | 0.76 | 0.78 | 0.73 | 0.72 | 0.81 | 0.7 | 1 | 0.71 | 0.81 | 0.72 | 0.83 | 0.68 | 0.79

Full results are available at Availability of scikit-learn model for runtime onnxruntime1.

Profiling

onnxruntime has a tool to verify and test ONNX graphs: onnxruntime_perf_test. It measures the execution time for a graph. It can also be used to profile the code of onnxruntime. On Windows (but it also works on Linux):

  • Create an ONNX graph and its inputs as protobuf. Place them in a folder as explained in the page onnxruntime_perf_test.

  • Clone and compile onnxruntime in release mode with debug information: python tools/ci_build/build.py --build_dir build_dir --config RelWithDebInfo --build_wheel --use_openmp --use_mklml --numpy_version= --skip_onnx_tests

  • Open Visual Studio and modify the command line of onnxruntime_perf_test.exe to: -s -t 30 <model.onnx> <anything.txt>. Select it as the startup project.

  • Start the profiling.
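
onnxruntime can also record a per-node profile directly from Python, which complements onnxruntime_perf_test (it profiles the graph execution, not onnxruntime's C++ code). A minimal sketch; the model path 'model.onnx' and the input name 'X' are assumptions:

<<<

import numpy
import onnxruntime as rt

so = rt.SessionOptions()
so.enable_profiling = True  # dump a JSON trace of every executed node
sess = rt.InferenceSession("model.onnx", so,
                           providers=["CPUExecutionProvider"])

X = numpy.random.rand(1000, 4).astype(numpy.float32)  # assumed input shape
sess.run(None, {'X': X})

# returns the name of the JSON file holding the profiling events
print(sess.end_profiling())

>>>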

onnxruntime2: independent onnxruntime for every node#

This runtime does not load the ONNX data in a single session; instead, it calls onnxruntime for each node independently. It was developed mostly to facilitate the implementation of converters from scikit-learn objects to ONNX. We create a table similar to the one for Python Runtime = ‘python’.

<<<

from logging import getLogger
from pyquickhelper.loghelper import noLOG
from pandas import DataFrame
from pyquickhelper.pandashelper import df2rst
from sklearn.exceptions import ConvergenceWarning
from sklearn.utils._testing import ignore_warnings
from mlprodict.onnxrt.validate import enumerate_validated_operator_opsets, summary_report


@ignore_warnings(category=(UserWarning, ConvergenceWarning, RuntimeWarning, FutureWarning))
def build_table():
    logger = getLogger('skl2onnx')
    logger.disabled = True
    rows = list(enumerate_validated_operator_opsets(
        0, debug=None, fLOG=noLOG, runtime='onnxruntime2',
        models=['LinearRegression', 'LogisticRegression'],
        benchmark=True))
    df = DataFrame(rows)
    piv = summary_report(df)

    if "ERROR-msg" in piv.columns:
        def shorten(text):
            text = str(text)
            if len(text) > 75:
                text = text[:75] + "..."
            return text

        piv["ERROR-msg"] = piv["ERROR-msg"].apply(shorten)

    print(df2rst(piv, number_format=2,
                 replacements={'nan': '', 'ERR: 4convert': ''}))


build_table()

>>>

name | problem | scenario | optim | method_name | output_index | conv_options | inst | n_features | runtime | skl_version | skl_nop | skl_ncoef | skl_nlin | onx_size | onx_nnodes | onx_ninits | onx_producer_name | onx_producer_version | onx_ai.onnx.ml | onx_size_optim | onx_nnodes_optim | onx_ninits_optim | onx_op_Reshape | onx_op_Cast | onx_op_ZipMap | opset15 | ERROR-msg | RT/SKL-N=1 | N=10 | N=100 | N=1000 | N=10000 | RT/SKL-N=1-min | RT/SKL-N=1-max | N=10-min | N=10-max | N=100-min | N=100-max | N=1000-min | N=1000-max | N=10000-min | N=10000-max
LinearRegression | b-reg | default | | predict | 0 | {} | null | 4 | onnxruntime2 | 1.0.2 | 1 | 4 | 1 | 259 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 259 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.6 | 0.59 | 0.61 | 0.73 | 1.8 | 0.56 | 0.63 | 0.53 | 0.65 | 0.56 | 0.67 | 0.68 | 0.8 | 1.6 | 2.1
LinearRegression | m-reg | default | | predict | 0 | {} | null | 4 | onnxruntime2 | 1.0.2 | 1 | 2 | 1 | 301 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 301 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.59 | 0.58 | 0.59 | 0.67 | 3 | 0.56 | 0.62 | 0.54 | 0.61 | 0.54 | 0.66 | 0.59 | 0.73 | 0.65 | 5.6
LinearRegression | ~b-reg-64 | default | | predict | 0 | {} | null | 4 | onnxruntime2 | 1.0.2 | 1 | 4 | 1 | 365 | 3 | 3 | skl2onnx | 1.11.1 | -1 | 365 | 3 | 3 | 1 | -1 | -1 | OK 13/ | | 1.7 | 1.7 | 1.8 | 2.2 | 5.4 | 1.5 | 1.8 | 1.6 | 1.8 | 1.7 | 1.9 | 2.1 | 2.5 | 5.2 | 5.7
LinearRegression | ~m-reg-64 | default | | predict | 0 | {} | null | 4 | onnxruntime2 | 1.0.2 | 1 | 2 | 1 | 405 | 3 | 3 | skl2onnx | 1.11.1 | -1 | 405 | 3 | 3 | 1 | -1 | -1 | OK 13/ | | 1.7 | 1.7 | 1.7 | 1.8 | 1.9 | 1.6 | 1.7 | 1.6 | 1.7 | 1.6 | 1.9 | 1.7 | 2 | 1.9 | 1.9
LogisticRegression | b-cl | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 652 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 652 | 4 | 0 | -1 | 1 | 1 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit…
LogisticRegression | b-cl | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 496 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 496 | 2 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.65 | 0.66 | 0.66 | 0.74 | 0.92 | 0.62 | 0.66 | 0.63 | 0.69 | 0.6 | 0.71 | 0.68 | 0.79 | 0.89 | 0.95
LogisticRegression | b-cl | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 652 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 652 | 4 | 0 | -1 | 1 | 1 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit…
LogisticRegression | b-cl | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 496 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 496 | 2 | 0 | -1 | -1 | -1 | OK 15/1
LogisticRegression | m-cl | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 3 | 1 | 681 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 681 | 4 | 0 | -1 | 1 | 1 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit…
LogisticRegression | m-cl | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 3 | 1 | 523 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 523 | 2 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.69 | 0.66 | 0.63 | 0.47 | 0.39 | 0.67 | 0.71 | 0.61 | 0.7 | 0.57 | 0.67 | 0.45 | 0.49 | 0.39 | 0.4
LogisticRegression | m-cl | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 3 | 1 | 681 | 4 | 0 | skl2onnx | 1.11.1 | 1 | 681 | 4 | 0 | -1 | 1 | 1 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit…
LogisticRegression | m-cl | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 3 | 1 | 523 | 2 | 0 | skl2onnx | 1.11.1 | 1 | 523 | 2 | 0 | -1 | -1 | -1 | OK 15/1
LogisticRegression | ~b-cl-64 | liblinear | | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 1161 | 13 | 5 | skl2onnx | 1.11.1 | 1 | 1161 | 13 | 5 | 1 | 3 | 1 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit…
LogisticRegression | ~b-cl-64 | liblinear | {'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 1004 | 11 | 5 | skl2onnx | 1.11.1 | 1 | 1004 | 11 | 5 | 1 | 2 | -1 | OK 13/1 | | 3.5 | 3.5 | 3.6 | 3.7 | 1.3 | 3.4 | 3.5 | 3.4 | 3.6 | 3.3 | 3.7 | 3.4 | 3.9 | 1.1 | 1.5
LogisticRegression | ~b-cl-64 | liblinear | onnx | predict_proba | 1 | {} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 1161 | 13 | 5 | skl2onnx | 1.11.1 | 1 | 1161 | 13 | 5 | 1 | 3 | 1 | ERR: 5ort_load | Unable to load node 'ZipMap' (output type was guessed) inputs=[('probabilit…
LogisticRegression | ~b-cl-64 | liblinear | onnx/{'zipmap': False} | predict_proba | 1 | {"LogisticRegression": {"zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 1004 | 11 | 5 | skl2onnx | 1.11.1 | 1 | 1004 | 11 | 5 | 1 | 2 | -1 | OK 13/1
LogisticRegression | ~b-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | decision_function | 1 | {"LogisticRegression": {"raw_scores": true, "zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 1 | 1 | 399 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 399 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.58 | 0.58 | 0.6 | 0.79 | 1.6 | 0.56 | 0.6 | 0.55 | 0.63 | 0.56 | 0.66 | 0.7 | 0.91 | 1.5 | 1.7
LogisticRegression | ~m-cl-dec | liblinear-dec | {'raw_scores': True, 'zipmap': False} | decision_function | 1 | {"LogisticRegression": {"raw_scores": true, "zipmap": false}} | {"random_state": 42, "solver": "liblinear"} | 4 | onnxruntime2 | 1.0.2 | 1 | 3 | 1 | 426 | 1 | 0 | skl2onnx | 1.11.1 | 1 | 426 | 1 | 0 | -1 | -1 | -1 | OK 15/1 | | 0.59 | 0.57 | 0.58 | 0.67 | 0.81 | 0.58 | 0.61 | 0.54 | 0.62 | 0.53 | 0.63 | 0.62 | 0.73 | 0.78 | 0.85

Full results are available at Availability of scikit-learn model for runtime onnxruntime2.