The Xentara ONNX Engine v2.0
User Manual
Installation

Installation of the Plugin

The Xentara plugin containing the ONNX engine skill is installed like any other Xentara plugin. See Installation for details.

Installation of the ONNX Runtime

See also
Getting Started at https://onnxruntime.ai
Install ONNX Runtime (ORT) in the ONNX Runtime documentation at https://onnxruntime.ai

Windows

Under Windows, the installer for the Xentara ONNX plugin includes an option to install the DirectML variant of the ONNX Runtime DLL. The installed DLL comes from the official ONNX Runtime 1.23.0 DirectML NuGet package, Microsoft.ML.OnnxRuntime.DirectML, which can be downloaded from the NuGet Gallery.

If you want to install your own ONNX runtime DLL, or if you prefer to install the full package from https://github.com/microsoft/onnxruntime/releases, you can deselect the component “ONNX DirectML Runtime” in the installer, and install the ONNX runtime yourself.

If you install your own ONNX runtime DLL, it must be called “onnxruntime.dll”, and must support ONNX runtime API version 23.

Linux

Any Linux Installation

Under Linux, Xentara does not provide a custom installation package for the ONNX runtime, as the installation depends heavily on the hardware acceleration support required.

Installation instructions for different combinations of operating systems and hardware can be found in the Getting Started guide at https://onnxruntime.ai. Prebuilt binaries are also available from the ONNX Runtime GitHub release page.

The installed ONNX runtime library must be called “libonnxruntime.so.1”, and must support ONNX runtime API version 21. Usually, “libonnxruntime.so.1” will not be the actual shared library file, but a symbolic link to the exact version of the library, for example “libonnxruntime.so.1.21.0”.
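The expected naming scheme can be illustrated with a small sketch. This creates the file and the link in a temporary scratch directory purely for demonstration; on a real system, the files live in the system library directory (for example /usr/lib/x86_64-linux-gnu), and the link is normally created by the package manager or by update-alternatives(1).

```shell
# Illustration only: recreate the expected naming scheme in a scratch
# directory. On a real system these files are created by the package
# manager, not by hand.
dir="$(mktemp -d)"

# The actual shared library file carries the full version number...
touch "$dir/libonnxruntime.so.1.21.0"

# ...and "libonnxruntime.so.1" is a symbolic link pointing to it.
ln -s libonnxruntime.so.1.21.0 "$dir/libonnxruntime.so.1"

# The link resolves to the versioned library file:
readlink "$dir/libonnxruntime.so.1"

# Clean up the scratch directory.
rm -r "$dir"
```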

Debian 13 “trixie”

Debian 13 “trixie” provides an installation package for the ONNX runtime version 1.21 called libonnxruntime1.21, which is in principle suitable for use with the Xentara ONNX Engine. Please note, however, that this package contains only three execution providers.

Most notably, GPU execution providers like CUDA, TensorRT, ROCm, or MIGraphX, are missing. If you want to use GPU acceleration, or if you want to use any of the other execution providers the ONNX runtime supports, you must build the corresponding provider libraries yourself, or download a suitable pre-built binary from the ONNX Runtime GitHub release page.

You can install the ONNX runtime libraries under Debian 13 “trixie” using the following command:

sudo apt install libonnxruntime1.21
Note
The user installing the ONNX runtime needs the privileges necessary to execute the sudo(8) command. This is usually accomplished by adding the user to the sudo user group. This can only be done by the root user, or by another user that already has sudo access.
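If you are unsure whether your account already has sudo access, you can check its group membership. Note that “sudo” is the group name used by Debian; this sketch assumes that default, and some other distributions use “wheel” instead.

```shell
# Check whether the current user is a member of the "sudo" group.
# ("sudo" is the Debian default group name; some distributions
# use "wheel" instead.)
if id -nG | tr ' ' '\n' | grep -qx sudo; then
    echo "current user is in the sudo group"
else
    echo "current user is NOT in the sudo group"
    # As root, a user can be added to the group like this
    # ("alice" is a placeholder for the actual user name):
    #     usermod -aG sudo alice
    # The user must log out and back in for the change to take effect.
fi
```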

The libonnxruntime1.21 package does not include a symbolic link named “libonnxruntime.so.1”. You can manually add the link using update-alternatives(1). To add the link, use the appropriate command for your architecture:

For Intel/AMD-based systems (x86_64):

sudo update-alternatives --install /usr/lib/x86_64-linux-gnu/libonnxruntime.so.1 libonnxruntime /usr/lib/x86_64-linux-gnu/libonnxruntime.so.1.21 1

For 64-Bit ARM systems (aarch64):

sudo update-alternatives --install /usr/lib/aarch64-linux-gnu/libonnxruntime.so.1 libonnxruntime /usr/lib/aarch64-linux-gnu/libonnxruntime.so.1.21 1
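The two commands differ only in the multiarch library directory, so the correct one can be derived from the machine's architecture. The following sketch prints the matching command for the current machine instead of executing it, so you can review it before running it; the paths and the “1.21” version suffix match the commands above.

```shell
# Print (rather than run) the update-alternatives command matching the
# current machine's architecture, as reported by uname -m. Review the
# printed command, then run it manually.
arch="$(uname -m)"
case "$arch" in
    x86_64)  libdir=/usr/lib/x86_64-linux-gnu ;;   # Intel/AMD systems
    aarch64) libdir=/usr/lib/aarch64-linux-gnu ;;  # 64-bit ARM systems
    *) echo "unsupported architecture: $arch" >&2; exit 1 ;;
esac

cmd="sudo update-alternatives --install $libdir/libonnxruntime.so.1 libonnxruntime $libdir/libonnxruntime.so.1.21 1"
echo "$cmd"
```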