How To Configure System¶
A system is the environment (OS, hardware spec, device platform, supported execution providers) that a Pass is run in or a Model is evaluated on. It can therefore be the host of a Pass or the target of an evaluation. This document describes how to configure the different types of Systems.
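Once defined, a system is attached to a workflow by name. The snippet below is a minimal sketch, assuming a workflow configuration where named systems are declared under a top-level "systems" section and referenced as the engine's "host" and "target"; the system name "local_system" is illustrative.
{
    "systems": {
        "local_system": {
            "type": "LocalSystem",
            "config": {
                "accelerators": ["cpu"]
            }
        }
    },
    "engine": {
        "host": "local_system",
        "target": "local_system"
    }
}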
Local System¶
{
    "type": "LocalSystem",
    "config": {
        "accelerators": ["cpu"]
    }
}
from olive.systems.local import LocalSystem
from olive.systems.common import Device

local_system = LocalSystem(
    accelerators=[Device.CPU]
)
Please refer to LocalTargetUserConfig for more details on the config options.
AzureML System¶
Prerequisites¶
1. azureml extra dependencies installed.
pip install olive-ai[azureml]
or
pip install azure-ai-ml azure-identity
2. AzureML Workspace with the necessary compute created. Refer to this for more details. Download the workspace config json (see the sketch below).
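The downloaded workspace config json follows the standard AzureML format; a sketch with placeholder values:
{
    "subscription_id": "<subscription-id>",
    "resource_group": "<resource-group>",
    "workspace_name": "<workspace-name>"
}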
System Configuration¶
{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "aml_docker_config": {
            "base_image": "mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
            "conda_file_path": "conda.yaml"
        }
    }
}
from olive.systems.azureml import AzureMLDockerConfig, AzureMLSystem

docker_config = AzureMLDockerConfig(
    base_image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
    conda_file_path="conda.yaml",
)
aml_system = AzureMLSystem(
    aml_compute="cpu-cluster",
    aml_docker_config=docker_config,
)
Olive can also manage the environment by setting olive_managed_env = True
{
    "type": "AzureML",
    "config": {
        "aml_compute": "cpu-cluster",
        "accelerators": ["cpu"],
        "olive_managed_env": true
    }
}
from olive.systems.azureml import AzureMLSystem

aml_system = AzureMLSystem(
    aml_compute="cpu-cluster",
    accelerators=["cpu"],
    olive_managed_env=True,
)
Please refer to this example for "conda.yaml".
Important
The AzureML environment must have olive-ai installed if olive_managed_env = False.
Please refer to AzureMLTargetUserConfig for more details on the config options.
AzureML Readymade Systems¶
There are some readymade systems available for AzureML. These systems come pre-configured with the necessary settings for common AzureML compute targets.
{ "type": "AzureNDV2System", "config": { "aml_compute": "gpu-cluster", "aml_docker_config": { "base_image": "mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04", "conda_file_path": "conda.yaml" } } }
Please refer to System Alias for the list of supported AzureML readymade systems.
Docker System¶
Prerequisites¶
1. Docker Engine installed on the host machine.
2. docker extra dependencies installed.
pip install olive-ai[docker]
or
pip install docker
System Configuration¶
{
    "type": "Docker",
    "config": {
        "local_docker_config": {
            "image_name": "olive",
            "build_context_path": "docker",
            "dockerfile": "Dockerfile"
        }
    }
}
from olive.systems.docker import DockerSystem, LocalDockerConfig

local_docker_config = LocalDockerConfig(
    image_name="olive",
    build_context_path="docker",
    dockerfile="Dockerfile",
)
docker_system = DockerSystem(local_docker_config=local_docker_config)
Olive can manage the environment by setting olive_managed_env = True
{
    "type": "Docker",
    "config": {
        "accelerators": ["cpu"],
        "olive_managed_env": true,
        "requirements_file": "mnist_requirements.txt"
    }
}
from olive.systems.docker import DockerSystem

docker_system = DockerSystem(
    accelerators=["cpu"],
    olive_managed_env=True,
    requirements_file="mnist_requirements.txt",
)
Please refer to this example for "docker" and "Dockerfile".
Important
The Docker container must have olive-ai installed!
Please refer to DockerTargetUserConfig for more details on the config options.
Python Environment System¶
{
    "type": "PythonEnvironment",
    "config": {
        "python_environment_path": "/home/user/.virtualenvs/myenv",
        "accelerators": ["cpu"]
    }
}
from olive.systems.python_environment import PythonEnvironmentSystem
from olive.systems.common import Device

python_environment_system = PythonEnvironmentSystem(
    python_environment_path="/home/user/.virtualenvs/myenv",
    device=Device.CPU,
)
Olive can also manage the environment by setting olive_managed_env = True. This feature works best when used from Conda.
{
    "type": "PythonEnvironment",
    "config": {
        "accelerators": ["cpu"],
        "olive_managed_env": true
    }
}
from olive.systems.python_environment import PythonEnvironmentSystem
from olive.systems.common import Device

python_environment_system = PythonEnvironmentSystem(
    olive_managed_env=True,
    device=Device.CPU,
)
Important
The Python environment system can only be used to evaluate ONNX models. It must have onnxruntime installed if olive_managed_env = False!
Please refer to PythonEnvironmentTargetUserConfig for more details on the config options.