How To Configure System¶
A system is the environment (OS, hardware spec, device platform, supported execution providers) in which a Pass is run or a Model is evaluated. It can thus be the host of a Pass or the target of an evaluation. This document describes how to configure the different types of Systems.
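Systems are typically declared in the systems section of an Olive run configuration and then referenced by name as the engine's host and/or evaluation target. A minimal sketch, assuming the standard run-config layout (the system name local_system is a placeholder):

```json
{
    "systems": {
        "local_system": {
            "type": "LocalSystem",
            "config": { "accelerators": ["cpu"] }
        }
    },
    "engine": {
        "host": "local_system",
        "target": "local_system"
    }
}
```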
Local System¶
{
    "type": "LocalSystem",
    "config": {
        "accelerators": ["cpu"]
    }
}
from olive.systems.local import LocalSystem
from olive.systems.common import Device

local_system = LocalSystem(
    accelerators=[Device.CPU],
)
Please refer to LocalTargetUserConfig for more details on the config options.
AzureML System¶
Prerequisites¶
1. azureml extra dependencies installed.

pip install olive-ai[azureml]

or

pip install azure-ai-ml azure-identity

2. AzureML Workspace with necessary compute created. Refer to this for more details. Download the workspace config json.
System Configuration¶
{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "aml_docker_config": {
            "base_image": "mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
            "conda_file_path": "conda.yaml"
        }
    }
}
from olive.systems.azureml import AzureMLDockerConfig, AzureMLSystem

docker_config = AzureMLDockerConfig(
    base_image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
    conda_file_path="conda.yaml",
)
aml_system = AzureMLSystem(
    aml_compute="cpu-cluster",
    aml_docker_config=docker_config,
)
If you provide an aml_docker_config, Olive will create a new Azure ML Environment from that configuration. Alternatively, you can reference an existing Azure ML Environment using aml_environment_config:
{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "aml_environment_config": {
            "name": "myenv",
            "version": "1"
        }
    }
}
from olive.systems.azureml import AzureMLEnvironmentConfig, AzureMLSystem

aml_environment_config = AzureMLEnvironmentConfig(
    name="myenv",
    version="1",
)
aml_system = AzureMLSystem(
    aml_compute="cpu-cluster",
    aml_environment_config=aml_environment_config,
)
Olive can also manage the environment by setting olive_managed_env = True:
{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "accelerators": ["cpu"],
        "olive_managed_env": true
    }
}
from olive.systems.azureml import AzureMLSystem

aml_system = AzureMLSystem(
    aml_compute="cpu-cluster",
    accelerators=["cpu"],
    olive_managed_env=True,
)
Please refer to this example for "conda.yaml".
Important

The AzureML environment must have olive-ai installed if olive_managed_env = False!
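For reference, a minimal "conda.yaml" for the base image above might look like the following. This is an illustrative sketch, not the linked example; package versions are assumptions, and the olive-ai pip dependency is included because the environment must have it when olive_managed_env is false:

```yaml
# Illustrative conda environment file for an AzureML system
name: olive-aml-env
channels:
  - defaults
dependencies:
  - python=3.10
  - pip
  - pip:
      - olive-ai
      - onnxruntime
```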
Please refer to AzureMLTargetUserConfig for more details on the config options.
AzureML Readymade Systems¶
There are some readymade systems available for AzureML. These systems come pre-configured with the settings appropriate to their underlying hardware.
{
    "type": "AzureNDV2System",
    "config": {
        "aml_compute": "gpu-cluster",
        "aml_docker_config": {
            "base_image": "mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
            "conda_file_path": "conda.yaml"
        }
    }
}
Please refer to System Alias for the list of supported AzureML readymade systems.
Docker System¶
Prerequisites¶
Docker Engine installed on the host machine.
docker extra dependencies installed.
pip install olive-ai[docker]
or
pip install docker
System Configuration¶
{
    "type": "Docker",
    "config": {
        "local_docker_config": {
            "image_name": "olive",
            "build_context_path": "docker",
            "dockerfile": "Dockerfile"
        }
    }
}
from olive.systems.docker import DockerSystem, LocalDockerConfig

local_docker_config = LocalDockerConfig(
    image_name="olive",
    build_context_path="docker",
    dockerfile="Dockerfile",
)
docker_system = DockerSystem(local_docker_config=local_docker_config)
Olive can manage the environment by setting olive_managed_env = True:
{
    "type": "Docker",
    "config": {
        "accelerators": ["cpu"],
        "olive_managed_env": true,
        "requirements_file": "mnist_requirements.txt"
    }
}
from olive.systems.docker import DockerSystem

docker_system = DockerSystem(
    accelerators=["cpu"],
    olive_managed_env=True,
    requirements_file="mnist_requirements.txt",
)
Please refer to this example for "docker" and "Dockerfile".
Important

The docker container must have olive-ai installed!
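As a rough sketch, a Dockerfile satisfying this requirement could be as simple as the following. The base image and package versions are assumptions for illustration, not the linked example:

```dockerfile
# Illustrative only: any image with Python and olive-ai installed will do
FROM python:3.10-slim

RUN pip install --no-cache-dir olive-ai onnxruntime
```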
Please refer to DockerTargetUserConfig for more details on the config options.
Python Environment System¶
{
    "type": "PythonEnvironment",
    "config": {
        "python_environment_path": "/home/user/.virtualenvs/myenv/bin",
        "accelerators": ["cpu"]
    }
}
from olive.systems.python_environment import PythonEnvironmentSystem
from olive.systems.common import Device

python_environment_system = PythonEnvironmentSystem(
    python_environment_path="/home/user/.virtualenvs/myenv/bin",
    device=Device.CPU,
)
Olive can also manage the environment by setting olive_managed_env = True. This feature works best when used from within a Conda environment.
{
    "type": "PythonEnvironment",
    "config": {
        "accelerators": ["cpu"],
        "olive_managed_env": true
    }
}
from olive.systems.python_environment import PythonEnvironmentSystem
from olive.systems.common import Device

python_environment_system = PythonEnvironmentSystem(
    olive_managed_env=True,
    device=Device.CPU,
)
Important

The python environment must have olive-ai installed if olive_managed_env = False!
Please refer to PythonEnvironmentTargetUserConfig for more details on the config options.
Ort Environment System¶
{
    "type": "IsolatedORT",
    "config": {
        "python_environment_path": "/home/user/.virtualenvs/myenv/bin",
        "accelerators": ["cpu"]
    }
}
from olive.systems.ort_environment import IsolatedORTSystem
from olive.systems.common import Device

ort_system = IsolatedORTSystem(
    python_environment_path="/home/user/.virtualenvs/myenv/bin",
    device=Device.CPU,
)
IsolatedORTSystem does not support olive_managed_env and can only be used to evaluate ONNX models.
Important

The python environment must have the relevant ONNX runtime package installed!
Please refer to IsolatedORTTargetUserConfig for more details on the config options.