.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/plot_profiling.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_plot_profiling.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_plot_profiling.py:


.. _l-example-profiling:

Profile the execution of a simple model
=======================================

*ONNX Runtime* can profile the execution of the model.
This example shows how to interpret the results.

.. GENERATED FROM PYTHON SOURCE LINES 14-32

.. code-block:: default

    import onnx
    import onnxruntime as rt
    import numpy
    from onnxruntime.datasets import get_example


    def change_ir_version(filename, ir_version=6):
        "onnxruntime==1.2.0 does not support models with opset <= 7 and ir_version > 6"
        with open(filename, "rb") as f:
            model = onnx.load(f)
        model.ir_version = ir_version
        if model.opset_import[0].version <= 7:
            model.opset_import[0].version = 11
        return model
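
As a quick check, the short sketch below (assuming the ``mul_1.onnx`` example
shipped with *onnxruntime*) prints the IR version and the opset of a model
before and after applying the helper.

.. code-block:: python

    # Sketch: compare IR version and opset before/after change_ir_version.
    import onnx
    from onnxruntime.datasets import get_example

    path = get_example("mul_1.onnx")
    original = onnx.load(path)
    print("before:", original.ir_version, original.opset_import[0].version)

    converted = change_ir_version(path)
    print("after: ", converted.ir_version, converted.opset_import[0].version)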











.. GENERATED FROM PYTHON SOURCE LINES 33-34

Let's load a very simple model and compute some predictions.

.. GENERATED FROM PYTHON SOURCE LINES 34-45

.. code-block:: default


    example1 = get_example("mul_1.onnx")
    onnx_model = change_ir_version(example1)
    onnx_model_str = onnx_model.SerializeToString()
    sess = rt.InferenceSession(onnx_model_str, providers=rt.get_available_providers())
    input_name = sess.get_inputs()[0].name

    x = numpy.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]], dtype=numpy.float32)
    res = sess.run(None, {input_name: x})
    print(res)





.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    [array([[ 1.,  4.],
           [ 9., 16.],
           [25., 36.]], dtype=float32)]
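
Before turning profiling on, it can help to look at the model signature.
The sketch below relies only on the session created above and the standard
``get_inputs`` / ``get_outputs`` API.

.. code-block:: python

    # Sketch: list the inputs and outputs exposed by the session.
    for inp in sess.get_inputs():
        print("input :", inp.name, inp.shape, inp.type)
    for out in sess.get_outputs():
        print("output:", out.name, out.shape, out.type)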




.. GENERATED FROM PYTHON SOURCE LINES 46-48

We need to enable profiling
before running the predictions.

.. GENERATED FROM PYTHON SOURCE LINES 48-60

.. code-block:: default


    options = rt.SessionOptions()
    options.enable_profiling = True
    sess_profile = rt.InferenceSession(onnx_model_str, options, providers=rt.get_available_providers())
    input_name = sess_profile.get_inputs()[0].name

    x = numpy.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]], dtype=numpy.float32)

    sess_profile.run(None, {input_name: x})
    prof_file = sess_profile.end_profiling()
    print(prof_file)





.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    onnxruntime_profile__2022-03-16_00-27-12.json
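
The file name is built from a timestamp. Recent versions of *onnxruntime*
also expose ``SessionOptions.profile_file_prefix`` to pick a more descriptive
prefix; the sketch below assumes that attribute is available and uses a
hypothetical prefix name.

.. code-block:: python

    # Sketch, assuming SessionOptions.profile_file_prefix is available
    # in the installed onnxruntime version.
    options2 = rt.SessionOptions()
    options2.enable_profiling = True
    options2.profile_file_prefix = "mul_1_profile"  # hypothetical prefix
    sess2 = rt.InferenceSession(
        onnx_model_str, options2, providers=rt.get_available_providers())
    sess2.run(None, {input_name: x})
    print(sess2.end_profiling())  # e.g. mul_1_profile_<timestamp>.json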




.. GENERATED FROM PYTHON SOURCE LINES 61-63

The results are stored in a file in JSON format.
Let's see what it contains.

.. GENERATED FROM PYTHON SOURCE LINES 63-71

.. code-block:: default

    import json
    import pprint

    with open(prof_file, "r") as f:
        sess_time = json.load(f)
    pprint.pprint(sess_time)




.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    [{'args': {},
      'cat': 'Session',
      'dur': 68,
      'name': 'model_loading_array',
      'ph': 'X',
      'pid': 3028,
      'tid': 3028,
      'ts': 1},
     {'args': {},
      'cat': 'Session',
      'dur': 194,
      'name': 'session_initialization',
      'ph': 'X',
      'pid': 3028,
      'tid': 3028,
      'ts': 86}]
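
Each entry follows the Chrome tracing format: ``dur`` is a duration in
microseconds and ``cat`` a category. A small aggregation, sketched below with
the standard library only, gives the total time spent per category.

.. code-block:: python

    # Sketch: sum the reported durations (microseconds) per category.
    from collections import defaultdict

    totals = defaultdict(int)
    for event in sess_time:
        totals[event.get("cat", "unknown")] += event.get("dur", 0)

    for cat, dur in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{cat:<10} {dur} us")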





.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes  0.006 seconds)


.. _sphx_glr_download_auto_examples_plot_profiling.py:


.. only:: html

 .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example



  .. container:: sphx-glr-download sphx-glr-download-python

     :download:`Download Python source code: plot_profiling.py <plot_profiling.py>`



  .. container:: sphx-glr-download sphx-glr-download-jupyter

     :download:`Download Jupyter notebook: plot_profiling.ipynb <plot_profiling.ipynb>`


.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_