#
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
#
"""
mlos_core is a wrapper around other OSS tuning libraries to provide a consistent
interface for autotuning experimentation.

``mlos_core`` focuses on the optimization portion of the autotuning process.

.. contents:: Table of Contents
    :depth: 3

Overview
++++++++

:py:mod:`mlos_core` can be installed from `pypi <https://pypi.org/project/mlos-core>`_
with ``pip install mlos-core`` and provides the main
:py:mod:`Optimizer <mlos_core.optimizers>` portions of the MLOS project for
autotuning purposes.
Although it is generally intended to be used with :py:mod:`mlos_bench` to help
automate the generation of ``(config, score)`` pairs (which we call
:py:class:`~mlos_core.data_classes.Observations`) to
:py:meth:`~mlos_core.optimizers.optimizer.BaseOptimizer.register` with the
Optimizer, it can be used independently as well.
In that case, a :py:class:`~mlos_core.data_classes.Suggestion` is returned from a
:py:meth:`~mlos_core.optimizers.optimizer.BaseOptimizer.suggest` call.
The caller is expected to score the associated config manually (or provide a
historical value) and :py:meth:`~mlos_core.data_classes.Suggestion.complete` it to
convert it to an :py:class:`~mlos_core.data_classes.Observation` that can be
registered with the Optimizer before repeating.
In doing so, the Optimizer will attempt to find the configuration that minimizes
the score provided, ideally learning from the previous observations in order to
converge to the best possible configuration as quickly as possible.
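
In code, that loop might look something like the following minimal sketch, assuming
an ``opt`` optimizer instance created as in the Examples section below (the
``evaluate`` function here is a hypothetical stand-in for however the caller
actually scores a config):

.. code-block:: python

    import pandas

    def evaluate(config: pandas.Series) -> float:
        """Hypothetical stand-in for the caller's real benchmark/scoring step."""
        return float(config["x"])

    for _ in range(10):
        suggestion = opt.suggest()
        score = evaluate(suggestion.config)
        # Completing the Suggestion with a scores Series yields an Observation.
        opt.register(suggestion.complete(pandas.Series({"y": score})))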

To do this ``mlos_core`` provides a small set of wrapper classes around other OSS
tuning libraries (e.g.,
:py:mod:`~mlos_core.optimizers.bayesian_optimizers.smac_optimizer.SmacOptimizer`,
:py:mod:`~mlos_core.optimizers.flaml_optimizer.FlamlOptimizer`, etc.) in order to
provide a consistent interface so that the rest of the code using it can easily
exchange one optimizer for another (or even stack them).
This allows for easy experimentation with different optimizers, each of which has
its own strengths and weaknesses.

When used with :py:mod:`mlos_bench`, swapping optimizers is as simple as a one-line
json config change for the ``mlos_bench``
:py:class:`~mlos_bench.optimizers.base_optimizer.Optimizer` config.

Data Classes
++++++++++++

The :py:class:`~mlos_core.data_classes.Suggestion` and
:py:class:`~mlos_core.data_classes.Observation` classes in
:py:mod:`mlos_core.data_classes` mentioned above internally use
:external:py:mod:`pandas`, the acknowledged lingua franca of the data science
tasks that the ``mlos_core`` package focuses on.
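
For example, a ``Suggestion`` exposes its config as a
:external:py:class:`pandas.Series`, and scores are supplied the same way (a minimal
sketch, reusing the ``opt`` instance from the sketch above):

.. code-block:: python

    suggestion = opt.suggest()
    assert isinstance(suggestion.config, pandas.Series)
    # complete() pairs the suggested config with its score as an Observation.
    observation = suggestion.complete(pandas.Series({"y": 42.0}))
    opt.register(observation)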

Spaces
++++++

In ``mlos_core``, the parameter :py:mod:`~mlos_core.spaces` that tell the
optimizers which configs to search over are specified using
:external:py:class:`ConfigSpace.ConfigurationSpace` instances, which provide
features like:

- log sampling
- quantization
- weighted distributions
- etc.

Refer to the `ConfigSpace documentation <https://automl.github.io/ConfigSpace/>`_
for additional details.
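
For instance, here is a minimal sketch of a space using log sampling and a weighted
categorical distribution (the parameter names are made up for illustration, and the
``Float``/``Categorical`` helpers assume a recent ConfigSpace version):

.. code-block:: python

    from ConfigSpace import Categorical, ConfigurationSpace, Float

    cs = ConfigurationSpace(seed=1234)
    # Sample the learning rate on a log scale between 1e-5 and 1e-1.
    _ = cs.add(Float("learning_rate", (1e-5, 1e-1), log=True))
    # Weight the sampling to prefer "lru" over the other policies.
    _ = cs.add(Categorical("policy", ["lru", "mru", "fifo"], weights=[0.5, 0.25, 0.25]))
    config = cs.sample_configuration()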

Internally, :py:mod:`~mlos_core.spaces.converters` are used to adapt those to
whatever the underlying Optimizer needs (in case it isn't using ConfigSpace).

*However*, note that in :py:mod:`mlos_bench`, a separate
:py:mod:`~mlos_bench.tunables.tunable_groups.TunableGroups` configuration language
is currently used instead (which is then internally converted into a
:py:class:`ConfigSpace.ConfigurationSpace`).

Space Adapters
^^^^^^^^^^^^^^

MLOS also provides :py:mod:`space adapters <mlos_core.spaces.adapters>` to help
transform one space to another.

This can be done for a variety of reasons.

One example is automatic search space reduction (e.g., using
:py:mod:`~mlos_core.spaces.adapters.llamatune`) in order to try and improve search
efficiency (see the :py:mod:`~mlos_core.spaces.adapters.llamatune` and
:py:mod:`space adapters <mlos_core.spaces.adapters>` modules for additional
documentation).
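
A minimal sketch of wiring that up through the factory (this assumes ``cs`` is a
suitably large input space and that ``num_low_dims`` is the LlamaTune kwarg
controlling the target dimensionality; see the module documentation for the actual
parameters):

.. code-block:: python

    from mlos_core.optimizers import OptimizerFactory, OptimizerType
    from mlos_core.spaces.adapters import SpaceAdapterType

    opt = OptimizerFactory.create(
        parameter_space=cs,
        optimization_targets=["y"],
        optimizer_type=OptimizerType.FLAML,
        space_adapter_type=SpaceAdapterType.LLAMATUNE,
        space_adapter_kwargs={"num_low_dims": 2},  # assumed kwarg
    )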

As with the Optimizers, the Space Adapters are designed to be easily swappable,
especially in the :py:mod:`mlos_bench`
:py:class:`~mlos_bench.optimizers.base_optimizer.Optimizer` config.

Classes Overview
++++++++++++++++

- :py:class:`~mlos_core.optimizers.optimizer.BaseOptimizer` is the base class for all Optimizers

  Its core methods are:

  - :py:meth:`~mlos_core.optimizers.optimizer.BaseOptimizer.suggest` which returns a
    new configuration to evaluate
  - :py:meth:`~mlos_core.optimizers.optimizer.BaseOptimizer.register` which registers
    a "score" for an evaluated configuration with the Optimizer

  Each operates on Pandas :py:class:`DataFrames <pandas.DataFrame>` as the lingua
  franca for data science.

- :py:meth:`mlos_core.optimizers.OptimizerFactory.create` is a factory function
  that creates a new :py:type:`~mlos_core.optimizers.ConcreteOptimizer` instance

  To do this it uses the :py:class:`~mlos_core.optimizers.OptimizerType` enum to
  specify which underlying optimizer to use (e.g.,
  :py:class:`~mlos_core.optimizers.OptimizerType.FLAML` or
  :py:class:`~mlos_core.optimizers.OptimizerType.SMAC`).

Examples
--------
>>> # Import the necessary classes.
>>> import pandas
>>> from ConfigSpace import ConfigurationSpace, UniformIntegerHyperparameter
>>> from mlos_core.optimizers import OptimizerFactory, OptimizerType
>>> from mlos_core.spaces.adapters import SpaceAdapterFactory, SpaceAdapterType
>>> # Create a simple ConfigurationSpace with a single integer hyperparameter.
>>> cs = ConfigurationSpace(seed=1234)
>>> _ = cs.add(UniformIntegerHyperparameter("x", lower=0, upper=10))
>>> # Create a new optimizer instance using the SMAC optimizer.
>>> opt_args = {"seed": 1234, "max_trials": 100}
>>> space_adapters_kwargs = {}  # no additional args for this example
>>> opt = OptimizerFactory.create(
...     parameter_space=cs,
...     optimization_targets=["y"],
...     optimizer_type=OptimizerType.SMAC,  # or FLAML, etc.
...     optimizer_kwargs=opt_args,
...     space_adapter_type=SpaceAdapterType.IDENTITY,  # or LLAMATUNE
...     space_adapter_kwargs=space_adapters_kwargs,
... )
>>> # Get a new configuration suggestion.
>>> suggestion = opt.suggest()
>>> # Examine the suggested configuration.
>>> assert len(suggestion.config) == 1
>>> suggestion.config
x    3
dtype: object
>>> # Register the configuration and its corresponding target value.
>>> score = 42  # a made up score
>>> scores_sr = pandas.Series({"y": score})
>>> opt.register(suggestion.complete(scores_sr))
>>> # Get a new configuration suggestion.
>>> suggestion = opt.suggest()
>>> suggestion.config
x    10
dtype: object
>>> score = 7  # a better made up score
>>> # Optimizers minimize by convention, so a lower score is better.
>>> # You can use a negative score to maximize values instead.
>>> #
>>> # Convert it to a Series again.
>>> scores_sr = pandas.Series({"y": score})
>>> opt.register(suggestion.complete(scores_sr))
>>> # Get the best observations.
>>> observations = opt.get_best_observations()
>>> # The default is to only return one.
>>> assert len(observations) == 1
>>> observations.configs
    x
0  10
>>> observations.scores
   y
0  7

Notes
-----
See `mlos_core/README.md
<https://github.com/microsoft/MLOS/tree/main/mlos_core/>`_
for additional documentation and examples in the source tree.
"""
from mlos_core.version import VERSION

__version__ = VERSION


if __name__ == "__main__":
    # Run the docstring's doctest examples when this module is executed directly.
    import doctest

    doctest.testmod()