{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "\n# Generating a ModelProto\n\nThis example demonstrates how to use *onnxscript* to define an ONNX model.\n*onnxscript* behaves like a compiler: it converts an annotated Python function into an ONNX model.\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "First, we define a square-loss function in *onnxscript*. The `@script()` decorator marks the\nfunction for conversion, and the `FLOAT[\"N\", 1]` annotations declare the element type and shape\nof its input and output tensors.\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "import numpy as np\nimport onnx\nfrom onnxruntime import InferenceSession\n\nfrom onnxscript import FLOAT, script\nfrom onnxscript import opset15 as op\n\n\n@script()\ndef square_loss(X: FLOAT[\"N\", 1], Y: FLOAT[\"N\", 1]) -> FLOAT[1, 1]:  # noqa: F821\n    diff = X - Y\n    return op.ReduceSum(diff * diff, keepdims=1)"
      ]
    },
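    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The decorated function can also be evaluated eagerly on numpy inputs, which is convenient for\ndebugging before generating a model. (This is a quick sketch, assuming eager-mode evaluation is\navailable in your *onnxscript* installation.)\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "# Eager-mode check: call the script function directly on numpy arrays.\nx = np.array([[1.0], [2.0]], dtype=np.float32)\ny = np.array([[0.5], [1.5]], dtype=np.float32)\nprint(square_loss(x, y))"
      ]
    },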
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "We can convert it to a model (an ONNX *ModelProto*) as follows:\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "model = square_loss.to_model_proto()"
      ]
    },
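    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The resulting *ModelProto* can be serialized to disk with the standard ONNX API, for example:\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "# Save the model in the standard .onnx (protobuf) format.\nonnx.save(model, \"square_loss.onnx\")"
      ]
    },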
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Let's see what the generated model looks like.\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "print(onnx.printer.to_text(model))"
      ]
    },
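    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Besides the textual form, the *ModelProto* can be inspected programmatically. For example, we can\nlist the graph's input names, output names, and the operators it uses:\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "# The graph's inputs and outputs are ValueInfoProto messages; its nodes are NodeProto messages.\nprint(\"inputs:\", [i.name for i in model.graph.input])\nprint(\"outputs:\", [o.name for o in model.graph.output])\nprint(\"ops:\", [n.op_type for n in model.graph.node])"
      ]
    },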
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "We can run shape inference on the model and type-check it using the standard ONNX API.\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "model = onnx.shape_inference.infer_shapes(model)\nonnx.checker.check_model(model)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Finally, we can use *onnxruntime* to compute the model's outputs through its standard API.\nNote that the inputs are transposed to column vectors of shape `(3, 1)`, matching the\n`FLOAT[\"N\", 1]` annotation.\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "sess = InferenceSession(model.SerializeToString(), providers=(\"CPUExecutionProvider\",))\n\nX = np.array([[0, 1, 2]], dtype=np.float32).T\nY = np.array([[0.1, 1.2, 2.3]], dtype=np.float32).T\n\ngot = sess.run(None, {\"X\": X, \"Y\": Y})\nexpected = ((X - Y) ** 2).sum()\n\nprint(expected, got)"
      ]
    }
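    ,
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "We can also compare the two results numerically. This is a minimal sketch: `sess.run` returns a\nlist with one array per model output, so `got[0]` holds the single `(1, 1)` result tensor.\n\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "collapsed": false
      },
      "outputs": [],
      "source": [
        "# Compare the runtime output against the numpy reference, allowing for float32 rounding.\nnp.testing.assert_allclose(got[0], expected, rtol=1e-5)\nprint(\"ok\")"
      ]
    }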
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "Python 3",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.10.16"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}