The most important thing this project provides appears to be the .onnx file format, which represents ANN models, pre-trained or not.
Deep learning frameworks can then output such .onnx files for interchangeability and serialization.
Some examples:
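A minimal sketch, assuming PyTorch is installed: a model (trained or not) can be serialized to a .onnx file with torch.onnx.export. The toy model and the "input"/"output" names below are purely illustrative.

```python
import torch
import torch.nn as nn

# A toy model standing in for a real pre-trained network.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)
model.eval()

# Dummy input that fixes the expected input shape of the exported graph.
dummy_input = torch.randn(1, 4)

# Serialize the graph and weights to an .onnx file.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
)
```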
The cool thing is that ONNX can then run inference in a uniform manner on a variety of devices, without installing the deep learning framework that was used to produce the model. It's a bit like having a kind of portable executable. Neat.
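For instance, here is a minimal inference sketch using the onnxruntime package; the file name and the "input" tensor name just match the hypothetical export above.

```python
import numpy as np
import onnxruntime as ort

# Load the serialized model; no deep learning framework is needed here.
session = ort.InferenceSession("model.onnx")

# Feed a NumPy array whose shape matches the exported dummy input.
x = np.random.randn(1, 4).astype(np.float32)
outputs = session.run(None, {"input": x})

print(outputs[0])  # raw output of the toy model
```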
Netron is an ONNX visualizer.
Figure 1. Netron visualization of the activatedgeek/LeNet-5 ONNX output.
