Transform your Windows application with the power of artificial intelligence.

DirectML is a high-performance machine learning API developed by Microsoft: fast, versatile, and consistent across a wide range of hardware. It is supported on every DirectX 12-capable GPU, with NPU support coming soon, so developers can use DirectML to power AI experiences on almost any Windows device!

DirectML comes pre-installed on many Windows 10+ devices and is also available as a NuGet package. With bindings for a range of languages (C++, Python, C#) and ONNX integration for converting, tuning, and integrating machine learning models, it's easy to get started.

Get started with these three steps:

1. Convert

The ONNX format enables you to leverage ONNX Runtime with DirectML, giving you cross-platform support and hardware-accelerated inference.

To convert your model to the ONNX format, you can use ONNXMLTools or Olive.

2. Optimize

Once you have an .onnx model, use Olive, powered by DirectML, to optimize it. You'll see significant performance improvements that you can deploy across the Windows hardware ecosystem.
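Olive workflows are driven by a JSON configuration. The fragment below is an illustrative sketch only: the exact schema, pass names, and accelerator fields vary by Olive version (consult the Olive documentation), and `model.onnx` is a placeholder path. The key idea is targeting the DirectML execution provider:

```json
{
  "input_model": {
    "type": "ONNXModel",
    "model_path": "model.onnx"
  },
  "systems": {
    "local_system": {
      "type": "LocalSystem",
      "accelerators": [
        { "device": "gpu", "execution_providers": ["DmlExecutionProvider"] }
      ]
    }
  },
  "passes": {
    "transformer_opt": { "type": "OrtTransformersOptimization" }
  },
  "engine": { "host": "local_system", "target": "local_system" }
}
```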

3. Integrate

Once your model is ready, it’s time to bring hardware-accelerated inferencing to your app with ONNX Runtime and DirectML.

We have built samples that show how to use DirectML from a range of languages.

Helpful Links

For some developers, this guide is just the beginning. For documentation, more advanced samples, and other helpful content, see our Links page.

Connect with us on YouTube and GitHub.