

onnx - 1/1#

Xop, easy to create onnx graph#

2022-02-27

The onnx package has a very verbose API to create an ONNX graph. Could you imagine a user directly writing the syntax tree of a program instead of some Python code? Creating an ONNX graph is very similar to that task, except the ONNX language is simpler than Python.
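
To give an idea of that verbosity, here is a minimal sketch of what the raw onnx.helper API requires for a graph containing a single Add node; the names and the opset version are arbitrary, and this is exactly the kind of boilerplate the Xop API described in the post aims to simplify.

```python
from onnx import TensorProto, helper

# Every input, output and node must be declared explicitly.
X = helper.make_tensor_value_info('X', TensorProto.FLOAT, [None, 2])
Y = helper.make_tensor_value_info('Y', TensorProto.FLOAT, [None, 2])
Z = helper.make_tensor_value_info('Z', TensorProto.FLOAT, [None, 2])
node = helper.make_node('Add', ['X', 'Y'], ['Z'])
graph = helper.make_graph([node], 'add_graph', [X, Y], [Z])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid('', 15)])
```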

post

Decompose einsum into numpy operators#

2021-08-11

Notebook Einsum decomposition shows what function numpy.einsum does and how it can be decomposed into a series of basic operations, all available in ONNX. That's the purpose of function decompose_einsum_equation. With function export2numpy, it is possible to convert this ONNX graph back into a series of numpy operations.
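
As a rough illustration of the idea (not the notebook's implementation), a simple einsum equation can be rewritten with basic operations that all have ONNX counterparts (Unsqueeze, Mul, ReduceSum):

```python
import numpy

a = numpy.random.rand(3, 4)
b = numpy.random.rand(4, 5)

# The einsum equation 'ij,jk->ik' is a plain matrix product...
expected = numpy.einsum('ij,jk->ik', a, b)

# ...which decomposes into an expansion of dimensions (Unsqueeze),
# an element-wise multiplication (Mul) and a reduction (ReduceSum).
decomposed = (a[:, :, numpy.newaxis] * b[numpy.newaxis, :, :]).sum(axis=1)

assert numpy.allclose(expected, decomposed)
```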

post

onnxruntime shape [] != None#

2021-08-10

None is the undefined shape, [] is an empty shape. And when the declared shapes do not match the results, the outputs can be surprising. The following example shows what onnxruntime produces for the same graph when the input and output shapes are defined as None versus [].
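
A minimal sketch of that kind of comparison, building the same Identity graph twice with onnx.helper and running it with onnxruntime; the opset version is arbitrary and the exact behaviour may depend on the onnxruntime release:

```python
import numpy
import onnxruntime
from onnx import TensorProto, helper

def identity_model(shape):
    # shape=None leaves the dimensions undefined, shape=[] declares a scalar.
    X = helper.make_tensor_value_info('X', TensorProto.FLOAT, shape)
    Y = helper.make_tensor_value_info('Y', TensorProto.FLOAT, shape)
    node = helper.make_node('Identity', ['X'], ['Y'])
    graph = helper.make_graph([node], 'g', [X], [Y])
    return helper.make_model(graph, opset_imports=[helper.make_opsetid('', 15)])

x = numpy.array([[0., 1.]], dtype=numpy.float32)
for shape in (None, []):
    sess = onnxruntime.InferenceSession(
        identity_model(shape).SerializeToString(),
        providers=['CPUExecutionProvider'])
    try:
        print(shape, sess.run(None, {'X': x}))
    except Exception as e:
        print(shape, type(e).__name__)
```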

post

Operator CDist#

2019-09-16

Notebook Pairwise distances with ONNX (pdist) shows how much slower an ONNX implementation of function cdist is, from 3 to 10 times slower. One way to optimize the converted model is to create a dedicated operator, such as one for function cdist. Tutorial Converters with options explains how to tell function to_onnx to use the custom operator CDist.
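
To see why a dedicated kernel helps, here is one way (not necessarily the notebook's) squared pairwise distances can be rebuilt from operators ONNX already has, which a single CDist node replaces:

```python
import numpy
from scipy.spatial.distance import cdist

X = numpy.random.rand(100, 10)
Y = numpy.random.rand(50, 10)

expected = cdist(X, Y, metric='sqeuclidean')

# ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y, expressed with ReduceSumSquare,
# MatMul and Add equivalents.
decomposed = ((X ** 2).sum(axis=1)[:, numpy.newaxis]
              + (Y ** 2).sum(axis=1)[numpy.newaxis, :]
              - 2 * X @ Y.T)

assert numpy.allclose(expected, decomposed)
```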

post

Float, double with ONNX#

2019-08-23

Replicating what a library does, scikit-learn for example, is different from implementing a function defined in a paper. Every trick needs to be replicated. scikit-learn trees implement a prediction function which takes float features and compares them to double thresholds. Knowing that ONNX assumes comparisons only happen between numbers of the same type, you end up with discrepancies.
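
A toy illustration of the kind of discrepancy involved, with a made-up threshold rather than one taken from an actual tree:

```python
import numpy

# scikit-learn keeps the threshold as a double, the feature arrives as a float.
threshold = numpy.float64(0.1)        # 0.1000000000000000055...
feature = numpy.float32(0.1)          # 0.1000000014901... once rounded to float32

# Mixed-type comparison: the feature is above the threshold.
print(feature <= threshold)                   # False

# Same-type comparison, as ONNX requires: the two values are equal.
print(feature <= numpy.float32(threshold))    # True
```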

post

ONNX updates#

2019-08-02

The python runtime is now almost complete for all the supported numerical operators implemented in sklearn-onnx. A couple of notebooks introduce ways to investigate issues, to benchmark ONNX models with onnxruntime or the python runtime, and to check for differences between the two on the same model. They also extend ONNX with operators not in the specification to test some assumptions and check whether it is more efficient. Notebook Precision loss due to float32 conversion with ONNX introduces a way to guess the margins introduced by the conversion from double to single. There also exists a function to convert a numpy function into ONNX (see Create custom ONNX graphs with AST). Its coverage is probably low but it will improve.
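
As a rough sketch of what estimating those margins means (not the notebook's method), the rounding error introduced by a double to float32 cast can be measured directly:

```python
import numpy

x64 = numpy.random.rand(10000)
x32 = x64.astype(numpy.float32)

# Worst relative error introduced by the conversion,
# close to float32 machine epsilon.
relative_error = numpy.abs(x32.astype(numpy.float64) - x64) / numpy.abs(x64)
print(relative_error.max())   # about 1e-7
```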

post

ONNX, runtime#

2019-06-25

Somebody asked me one day if it would be difficult to write a runtime for ONNX in Rust. I just replied that it should not take that long, but it would require implementing a way to go through the nodes of the ONNX graph and to have an implementation for every ONNX operator.
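
A very small Python sketch of that idea, covering only three operators; a real runtime, in Rust or any other language, is essentially this loop plus a kernel for every operator it supports:

```python
import numpy

# One kernel per ONNX operator the runtime supports.
KERNELS = {
    'Add': lambda x, y: x + y,
    'MatMul': lambda x, y: x @ y,
    'Relu': lambda x: numpy.maximum(x, 0),
}

def run(graph, feeds):
    """Executes an onnx GraphProto whose nodes only use the kernels above."""
    results = dict(feeds)
    for node in graph.node:  # nodes are stored in topological order
        inputs = [results[name] for name in node.input]
        outputs = KERNELS[node.op_type](*inputs)
        if not isinstance(outputs, tuple):
            outputs = (outputs,)
        results.update(zip(node.output, outputs))
    return [results[output.name] for output in graph.output]
```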

post

ONNX, runtime, converters#

2019-06-15

I have been recently working on sklearn-onnx to write converters from scikit-learn operators to the ONNX serialization format. I was talking about that a month ago and somebody asked me if there was a runtime implemented in Rust. Not that I know of, but I said it would not be too complex to implement one.
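
From the user side, those converters look like this; a generic sklearn-onnx example, not taken from the talk:

```python
import numpy
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from skl2onnx import to_onnx

X, y = load_iris(return_X_y=True)
X = X.astype(numpy.float32)
clf = LogisticRegression(max_iter=500).fit(X, y)

# sklearn-onnx picks a converter for every scikit-learn operator
# and writes the equivalent ONNX nodes.
onx = to_onnx(clf, X[:1])
with open('logreg.onnx', 'wb') as f:
    f.write(onx.SerializeToString())
```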

post

