mlos_core

mlos_core is a wrapper around other OSS tuning libraries to provide a consistent interface for autotuning experimentation.

mlos_core focuses on the optimization portion of the autotuning process.

Overview

mlos_core can be installed from pypi with pip install mlos-core and provides the main Optimizer portions of the MLOS project for use with autotuning purposes. Although it is generally intended to be used with mlos_bench to help automate the generation of (config, score) pairs (which we call Observations) to register() with the Optimizer, it can be used independently as well. In that case, a Suggestion is returned from a suggest() call. The caller is expected to score the associated config manually (or provide a historical value) and complete() it to convert it to an Observation that can be registered with the Optimizer before repeating. In doing so, the Optimizer will attempt to find the best configuration to minimize the score provided, ideally learning from the previous observations in order to converge to the best possible configuration as quickly as possible.
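For illustration, that loop can be sketched roughly as follows. The optimizer setup mirrors the Examples section below; the evaluate() function is a hypothetical placeholder for however the caller actually measures a config (e.g., by running a benchmark):

import pandas
from ConfigSpace import ConfigurationSpace, UniformIntegerHyperparameter
from mlos_core.optimizers import OptimizerFactory, OptimizerType

# Declare the search space (a single integer parameter here).
cs = ConfigurationSpace(seed=1234)
cs.add(UniformIntegerHyperparameter("x", lower=0, upper=10))

# Create an Optimizer (SMAC in this sketch, as in the Examples below).
opt = OptimizerFactory.create(
    parameter_space=cs,
    optimization_targets=["y"],
    optimizer_type=OptimizerType.SMAC,
    optimizer_kwargs={"seed": 1234, "max_trials": 100},
)

def evaluate(config: pandas.Series) -> float:
    # Hypothetical scoring function; lower is better by convention.
    return float((config["x"] - 7) ** 2)

for _ in range(10):
    suggestion = opt.suggest()                       # propose a config to try
    score = evaluate(suggestion.config)              # caller scores it
    observation = suggestion.complete(pandas.Series({"y": score}))
    opt.register(observation)                        # feed the result back
best = opt.get_best_observations()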

To do this, mlos_core provides a small set of wrapper classes around other OSS tuning libraries (e.g., SmacOptimizer, FlamlOptimizer, etc.) in order to provide a consistent interface, so that the rest of the code using it can easily exchange one optimizer for another (or even stack them). This allows for easy experimentation with different optimizers, each of which has its own strengths and weaknesses.

When used with mlos_bench, swapping optimizers is as simple as a one-line JSON change to the mlos_bench Optimizer config.

Data Classes

The Suggestion and Observation data classes in mlos_core.data_classes mentioned above internally use pandas, the acknowledged lingua franca of the data science tasks that the mlos_core package focuses on.
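For example, an Observation can also be constructed directly from pandas objects (e.g., to register a historical result rather than a fresh Suggestion). The keyword arguments below are an assumption for illustration; check the mlos_core.data_classes API reference for the exact constructor signature:

import pandas
from mlos_core.data_classes import Observation

# Assumed constructor signature -- verify against mlos_core.data_classes.
historical = Observation(
    config=pandas.Series({"x": 5}),    # the config that was evaluated
    score=pandas.Series({"y": 21.0}),  # the score it achieved
)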

Spaces

In mlos_core, the parameter spaces that tell the optimizers which configs to search over are specified using ConfigSpace.ConfigurationSpace instances, which provide features like

  • log sampling

  • quantization

  • weighted distributions

  • etc.

Refer to the ConfigSpace documentation for additional details.
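For instance, a space using a couple of those features might be declared as follows (a minimal sketch; the parameter names are made up for illustration):

from ConfigSpace import (
    CategoricalHyperparameter,
    ConfigurationSpace,
    UniformFloatHyperparameter,
)

cs = ConfigurationSpace(seed=1234)
# Log-scale sampling for a parameter spanning several orders of magnitude.
cs.add(UniformFloatHyperparameter("learning_rate", lower=1e-5, upper=1e-1, log=True))
# Weighted categorical distribution to bias sampling toward "lru".
cs.add(CategoricalHyperparameter("eviction_policy", choices=["lru", "fifo"], weights=[0.7, 0.3]))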

Internally, converters are used to adapt those to whatever the underlying Optimizer needs (in case it isn’t using ConfigSpace).

However, note that in mlos_bench, a separate TunableGroups configuration language is currently used instead (which is then internally converted into a ConfigSpace.ConfigurationSpace).

Space Adapters

MLOS also provides space adapters to help transform one space to another.

This can be done for a variety of reasons.

One example is automatic search space reduction (e.g., using llamatune) in order to try to improve search efficiency (see the llamatune and space adapters modules for additional documentation).

As with the Optimizers, the Space Adapters are designed to be easily swappable, especially in the mlos_bench Optimizer config.
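For example, enabling the LlamaTune space adapter when creating an optimizer might look roughly like the following sketch. The num_low_dims option shown in space_adapter_kwargs is an assumed illustration of a LlamaTune setting; see the space adapters module documentation for the exact supported arguments:

from ConfigSpace import ConfigurationSpace, UniformIntegerHyperparameter
from mlos_core.optimizers import OptimizerFactory, OptimizerType
from mlos_core.spaces.adapters import SpaceAdapterType

# A (tiny) original search space to be projected into fewer dimensions.
cs = ConfigurationSpace(seed=1234)
cs.add(UniformIntegerHyperparameter("x", lower=0, upper=10))
cs.add(UniformIntegerHyperparameter("z", lower=0, upper=100))

opt = OptimizerFactory.create(
    parameter_space=cs,
    optimization_targets=["y"],
    optimizer_type=OptimizerType.SMAC,
    optimizer_kwargs={"seed": 1234, "max_trials": 100},
    # Swap IDENTITY for LLAMATUNE to search a reduced, lower-dimensional space.
    space_adapter_type=SpaceAdapterType.LLAMATUNE,
    space_adapter_kwargs={"num_low_dims": 1},  # assumed LlamaTune option, for illustration
)

Switching back to the identity adapter (or another adapter) then only requires changing the space_adapter_type and its kwargs.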


Examples

>>> # Import the necessary classes.
>>> import pandas
>>> from ConfigSpace import ConfigurationSpace, UniformIntegerHyperparameter
>>> from mlos_core.optimizers import OptimizerFactory, OptimizerType
>>> from mlos_core.spaces.adapters import SpaceAdapterFactory, SpaceAdapterType
>>> # Create a simple ConfigurationSpace with a single integer hyperparameter.
>>> cs = ConfigurationSpace(seed=1234)
>>> _ = cs.add(UniformIntegerHyperparameter("x", lower=0, upper=10))
>>> # Create a new optimizer instance using the SMAC optimizer.
>>> opt_args = {"seed": 1234, "max_trials": 100}
>>> space_adapters_kwargs = {} # no additional args for this example
>>> opt = OptimizerFactory.create(
...     parameter_space=cs,
...     optimization_targets=["y"],
...     optimizer_type=OptimizerType.SMAC,  # or FLAML, etc.
...     optimizer_kwargs=opt_args,
...     space_adapter_type=SpaceAdapterType.IDENTITY,   # or LLAMATUNE
...     space_adapter_kwargs=space_adapters_kwargs,
... )
>>> # Get a new configuration suggestion.
>>> suggestion = opt.suggest()
>>> # Examine the suggested configuration.
>>> assert len(suggestion.config) == 1
>>> suggestion.config
x    3
dtype: object
>>> # Register the configuration and its corresponding target value
>>> score = 42 # a made up score
>>> scores_sr = pandas.Series({"y": score})
>>> opt.register(suggestion.complete(scores_sr))
>>> # Get a new configuration suggestion.
>>> suggestion = opt.suggest()
>>> suggestion.config
x    10
dtype: object
>>> score = 7 # a better made up score
>>> # Optimizers minimize by convention, so a lower score is better
>>> # You can use a negative score to maximize values instead
>>> #
>>> # Convert it to a Series again
>>> scores_sr = pandas.Series({"y": score})
>>> opt.register(suggestion.complete(scores_sr))
>>> # Get the best observations.
>>> observations = opt.get_best_observations()
>>> # The default is to only return one
>>> assert len(observations) == 1
>>> observations.configs
    x
0  10
>>> observations.scores
   y
0  7

Notes

See mlos_core/README.md for additional documentation and examples in the source tree.


Attributes

__version__

Package Contents

mlos_core.__version__ = '0.6.1'