OliveModels
The following models are available in Olive.
ONNX Model
- class olive.model.ONNXModel(model_path: str | None = None, name: str | None = None, version: int | None = None, aml_storage_name: str | None = None, model_storage_kind: str | ModelStorageKind = ModelStorageKind.LocalFile, inference_settings: dict | None = None, use_ort_extensions: bool = False, hf_config: Dict[str, Any] | HFConfig | None = None)
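In an Olive workflow configuration, an input model of this type is typically described by a dictionary whose `config` keys mirror the constructor parameters above. A minimal sketch, assuming a local ONNX file (the path is a hypothetical stand-in):

```python
# Sketch of an ONNXModel entry; keys mirror the ONNXModel constructor
# parameters shown above. The model path is hypothetical.
input_model = {
    "type": "ONNXModel",
    "config": {
        "model_path": "models/resnet50.onnx",  # hypothetical local file
        "name": "resnet50",
        "use_ort_extensions": False,  # enable only if the model uses
                                      # onnxruntime-extensions custom ops
    },
}
```

Parameters left out of `config` fall back to the defaults shown in the signature, e.g. `model_storage_kind` defaults to a local file.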
CompositeOnnxModel Model
- class olive.model.CompositeOnnxModel(model_components: List[str], name: str | None = None, version: int | None = None, aml_storage_name: str | None = None, hf_config: Dict[str, Any] | HFConfig | None = None)
CompositeOnnxModel represents a multi-component model as a collection of ONNXModels. Whisper is an example of a composite model: it has separate encoder and decoder components.
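Following the Whisper example above, a composite model entry lists its components in order via `model_components`. A minimal sketch (component paths are hypothetical):

```python
# Sketch of a CompositeOnnxModel entry: an ordered list of component
# ONNX models, here Whisper-style encoder and decoder. Paths are hypothetical.
composite_model = {
    "type": "CompositeOnnxModel",
    "config": {
        "model_components": [
            "whisper/encoder.onnx",  # hypothetical encoder component
            "whisper/decoder.onnx",  # hypothetical decoder component
        ],
        "name": "whisper",
    },
}
```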
DistributedOnnxModel Model
OpenVINO Model
PyTorch Model
- class olive.model.PyTorchModel(model_path: str | None = None, model_file_format: ModelFileFormat = ModelFileFormat.PYTORCH_ENTIRE_MODEL, name: str | None = None, version: int | None = None, aml_storage_name: str | None = None, model_storage_kind: str | ModelStorageKind = ModelStorageKind.LocalFolder, model_loader: str | Callable | None = None, model_script: Path | str | None = None, script_dir: Path | str | None = None, io_config: Dict[str, Any] | IOConfig | None = None, dummy_inputs_func: str | Callable | None = None, hf_config: Dict[str, Any] | HFConfig | None = None)
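Unlike ONNXModel, PyTorchModel can load its model through a user-provided callable (`model_loader` defined in `model_script`) instead of a serialized file, and can carry I/O metadata via `io_config`. A minimal sketch, where the script name, loader name, and io_config field names are hypothetical stand-ins for the parameters in the signature above:

```python
# Sketch of a PyTorchModel entry loaded through a user script.
# model_script / model_loader / io_config keys are hypothetical examples
# mirroring the constructor parameters shown above.
pytorch_model = {
    "type": "PyTorchModel",
    "config": {
        "model_script": "user_script.py",   # hypothetical script defining the loader
        "model_loader": "load_my_model",    # hypothetical callable in model_script
        "io_config": {
            "input_names": ["input_ids"],   # hypothetical I/O metadata
            "input_shapes": [[1, 128]],
            "output_names": ["logits"],
        },
    },
}
```

When `model_loader` is used, `model_path` may be omitted; `model_file_format` matters only when loading from a serialized file.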