Toolkit Add-Ons

Take advantage of add-ons that extend the capabilities of the toolkit and complement the functionality available in the core toolkit.

Estimate deep learning inference performance on supported devices.
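To make concrete what estimating inference performance involves, here is a minimal, stdlib-only sketch of the usual measurement loop: warm up, time repeated inference calls, then report median latency and throughput. The function and the dummy workload are hypothetical stand-ins, not this add-on's API; the add-on runs real models on real devices.

```python
import statistics
import time

def estimate_performance(infer, n_warmup=10, n_iters=100):
    """Time repeated calls to `infer`; return (median latency in ms, throughput).

    `infer` is a stand-in for a model's inference call. This is a conceptual
    sketch of the measurement, not the add-on's actual interface.
    """
    for _ in range(n_warmup):              # warm-up runs exclude one-time costs
        infer()
    samples = []
    for _ in range(n_iters):
        start = time.perf_counter()
        infer()
        samples.append(time.perf_counter() - start)
    latency_ms = statistics.median(samples) * 1000.0
    throughput = n_iters / sum(samples)    # inferences per second
    return latency_ms, throughput

# Dummy workload standing in for model inference.
dummy = lambda: sum(i * i for i in range(10_000))
latency_ms, fps = estimate_performance(dummy)
print(f"median latency: {latency_ms:.3f} ms, throughput: {fps:.1f} infer/s")
```

Reporting the median rather than the mean makes the latency figure robust to occasional scheduler hiccups during the run.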
Use this add-on to build, transform, and analyze datasets.
This cross-platform, command-line tool facilitates the transition between training and deployment environments, performs static model analysis, and adjusts deep learning models for optimal performance on end-point target devices.
Use this PyTorch-based framework for quantization-aware training.
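The core idea behind quantization-aware training is to insert "fake quantize" operations into the model so the network learns weights that survive low-precision inference. The sketch below illustrates that operation in plain Python (quantize to a clamped integer range, then dequantize back to float); it is a conceptual illustration with made-up values, not this framework's API.

```python
def fake_quantize(x, scale, zero_point=0, num_bits=8):
    """Simulate low-precision storage of a value during training.

    Quantize `x` to a clamped signed integer range, then dequantize it back
    to float, so downstream computation sees the rounding error that real
    int8 inference would introduce.
    """
    qmin = -(2 ** (num_bits - 1))          # e.g. -128 for 8 bits
    qmax = 2 ** (num_bits - 1) - 1         # e.g.  127 for 8 bits
    q = round(x / scale) + zero_point
    q = max(qmin, min(qmax, q))            # clamp to the representable range
    return (q - zero_point) * scale        # dequantize back to float

# Hypothetical weight values passed through the simulated quantization.
weights = [0.42, -1.337, 0.0051]
quantized = [fake_quantize(w, scale=0.05) for w in weights]
```

Because the rounding happens inside the forward pass, training can adjust the weights to compensate for it, which is what distinguishes quantization-aware training from quantizing a model after the fact.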
Hugging Face* hosts a repository for the OpenVINO toolkit that provides resources and models for optimizing deep learning inference on Intel hardware.
This scalable inference server is for serving models optimized with the Intel® Distribution of OpenVINO™ toolkit.
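An inference server like this is typically called over REST or gRPC. As an illustration, the sketch below builds a KServe-v2-style REST request body using only the standard library; the endpoint URL, model name, and input name are hypothetical placeholders you would replace with your own deployment's values.

```python
import json

# Hypothetical endpoint; adjust host, port, and model name for your deployment.
URL = "http://localhost:9000/v2/models/my_model/infer"

def build_infer_request(input_name, data, shape):
    """Build a KServe-v2-style JSON inference request body."""
    return json.dumps({
        "inputs": [{
            "name": input_name,      # must match the model's input tensor name
            "shape": shape,
            "datatype": "FP32",
            "data": data,            # flattened tensor values
        }]
    })

body = build_infer_request("input_1", [0.1, 0.2, 0.3, 0.4], [1, 4])

# Sending it requires a running server, so the HTTP call is left as a sketch:
# import urllib.request
# req = urllib.request.Request(URL, data=body.encode(),
#                              headers={"Content-Type": "application/json"})
# response = json.load(urllib.request.urlopen(req))
```

Because the protocol is plain JSON over HTTP, any language with an HTTP client can act as an inference client without linking against the toolkit itself.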