How To Configure System

A system is the environment (OS, hardware spec, device platform, supported execution providers) in which a Pass is run or a Model is evaluated. It can thus serve as the host of a Pass or the target of an evaluation. This document describes how to configure the different types of systems.
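
In a workflow config, systems are typically defined by name under the systems section and then referenced as the engine's host and target. A minimal sketch, assuming a single local CPU system (the name local_system is an arbitrary placeholder):

{
    "systems": {
        "local_system": {
            "type": "LocalSystem",
            "config": {
                "accelerators": [{"device": "cpu"}]
            }
        }
    },
    "engine": {
        "host": "local_system",
        "target": "local_system"
    }
}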

Accelerator Configuration

Each host or target can specify multiple accelerators. Each accelerator can have the following attributes:

  • device: The device type of the accelerator, e.g. "cpu", "gpu", or "npu". Please refer to the API documentation for the full list of supported devices.

  • execution_providers: The list of execution providers supported by the accelerator, e.g. ["CUDAExecutionProvider", "CPUExecutionProvider"].

Note:

  • The accelerators field is optional for local and Python environment systems. If it is not provided, Olive detects the execution providers installed on the current local machine and infers the corresponding device.

  • For a local or Python environment system, if accelerators are specified, either device or execution_providers may be omitted, but not both. Olive will infer the missing field where possible (see the sketch after these notes).

  • For Docker and AzureML systems, both device and execution_providers are mandatory; Olive raises an error if either is missing.
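
For instance, the following local system specifies only execution_providers and leaves the device to be inferred. This is a minimal sketch; with CUDAExecutionProvider listed, Olive would infer a "gpu" device:

{
    "type": "LocalSystem",
    "config": {
        "accelerators": [
            {
                "execution_providers": [
                    "CUDAExecutionProvider",
                    "CPUExecutionProvider"
                ]
            }
        ]
    }
}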

Local System

{
    "type": "LocalSystem",
    "config": {
        "accelerators": [{"device": "cpu"}]
    }
}

Please refer to LocalTargetUserConfig for more details on the config options.

AzureML System

Prerequisites

  1. azureml extra dependencies installed.

    pip install olive-ai[azureml]
    

    or

    pip install azure-ai-ml azure-identity
    

  2. AzureML Workspace with the necessary compute created. Refer to this for more details. Download the workspace config json.
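
    The downloaded workspace config json is a small file that typically looks like the following (all values below are placeholders):

    {
        "subscription_id": "<subscription-id>",
        "resource_group": "<resource-group>",
        "workspace_name": "<workspace-name>"
    }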

System Configuration

{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "aml_docker_config": {
            "base_image": "mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
            "conda_file_path": "conda.yaml"
        }
    }
}

If you provide an aml_docker_config, Olive will create a new Azure ML Environment from that configuration. Alternatively, you can use an existing Azure ML Environment by providing aml_environment_config:

{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "aml_environment_config": {
            "name": "myenv",
            "version": "1"
        }
    }
}

Olive can also manage the environment by setting olive_managed_env = True:

{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "accelerators": [
            {
                "device": "cpu",
                "execution_providers": [
                    "CPUExecutionProvider",
                    "OpenVINOExecutionProvider"
                ]
            }
        ],
        "olive_managed_env": true,
    }
}

Please refer to this example for "conda.yaml".

Important

The AzureML environment must have olive-ai installed if olive_managed_env = False!

Please refer to AzureMLTargetUserConfig for more details on the config options.

AzureML Readymade Systems

There are some readymade systems available for AzureML. These systems are pre-configured with the appropriate accelerators for the underlying compute.

{
    "type": "AzureNDV2System",
    "config": {
        "aml_compute": "gpu-cluster",
        "aml_docker_config": {
            "base_image": "mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
            "conda_file_path": "conda.yaml"
        }
    }
}

Please refer to System Alias for the list of supported AzureML readymade systems.

Docker System

Prerequisites

  1. Docker Engine installed on the host machine.

  2. docker extra dependencies installed.

    pip install olive-ai[docker]
    

    or

    pip install docker
    

System Configuration

{
    "type": "Docker",
    "config": {
        "local_docker_config": {
            "image_name": "olive",
            "build_context_path": "docker",
            "dockerfile": "Dockerfile"
        }
    }
}

Olive can manage the environment by setting olive_managed_env = True:

{
    "type": "Docker",
    "config": {
        "accelerators": [
            {
                "device": "cpu",
                "execution_providers": [
                    "CPUExecutionProvider",
                    "OpenVINOExecutionProvider"
                ]
            }
        ],
        "olive_managed_env": true,
        "requirements_file": "mnist_requirements.txt"
    }
}

Please refer to this example for "docker" and "Dockerfile".

Important

The Docker container must have olive-ai installed!

Please refer to DockerTargetUserConfig for more details on the config options.

Python Environment System

{
    "type": "PythonEnvironment",
    "config": {
        "python_environment_path": "/home/user/.virtualenvs/myenv/bin",
        "accelerators": [
            {
                "device": "cpu",
                "execution_providers": [
                    "CPUExecutionProvider",
                    "OpenVINOExecutionProvider"
                ]
            }
        ]
    }
}

Olive can also manage the environment by setting olive_managed_env = True. This feature works best when used from a Conda environment:

{
    "type": "PythonEnvironment",
    "config": {
        "accelerators": [{"device": "cpu"}]
        "olive_managed_env": true,
    }
}

Important

The Python environment must have olive-ai installed if olive_managed_env = False!

Please refer to PythonEnvironmentTargetUserConfig for more details on the config options.

Isolated ORT System

{
    "type": "IsolatedORT",
    "config": {
        "python_environment_path": "/home/user/.virtualenvs/myenv/bin",
        "accelerators": [{"device": "cpu"}]
    }
}

IsolatedORTSystem does not support olive_managed_env and can only be used to evaluate ONNX models.

Important

The Python environment must have the relevant ONNX Runtime package installed!

Please refer to IsolatedORTTargetUserConfig for more details on the config options.