Coverage for mlos_bench/mlos_bench/optimizers/__init__.py: 100%
7 statements
#
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
#
"""
Interfaces and wrapper classes for optimizers to be used in :py:mod:`mlos_bench` for
autotuning or benchmarking.

Overview
++++++++

One of the main purposes of the mlos_bench :py:class:`.Optimizer` class is to
provide a wrapper for the :py:mod:`mlos_core.optimizers` via the
:py:class:`.MlosCoreOptimizer` in order to perform autotuning.

However, several other *config suggesters* that conform to the Optimizer APIs are
also available for use:

- :py:class:`.GridSearchOptimizer` :
  Useful for exhaustive search of a *small* parameter space.
- :py:class:`.OneShotOptimizer` :
  Useful for one-off config experimentation and benchmarking.
- :py:class:`.ManualOptimizer` :
  Useful for repeatedly testing a small set of known configs.

API
+++

Like the mlos_core :py:class:`~mlos_core.optimizers.optimizer.BaseOptimizer`, the
core APIs here are :py:meth:`.Optimizer.suggest` and :py:meth:`.Optimizer.register`.

The :py:meth:`.Optimizer.bulk_register` method is also available to pre-warm a new
Optimizer instance using observations from a prior set of
:py:class:`~mlos_bench.storage.base_storage.Storage.Trial` runs (e.g., from the
:py:mod:`mlos_bench.storage`).

.. note::
    We also refer to this as "merging". This only makes sense if the past Trials
    were run from a set of Experiments *compatible* with this one (e.g., same
    software, workload, VM size, overlapping parameter spaces, etc.).
    Automatically determining whether that makes sense to do is challenging and
    is left to the user to ensure for now.
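
For example, pre-warming a new Optimizer with prior data might look roughly like
the following sketch. The observations here are made up purely for illustration;
in practice the configs and scores would come from a previous Experiment's Trials
in :py:mod:`mlos_bench.storage`, and the exact argument shapes are documented on
:py:meth:`.Optimizer.bulk_register`.

.. code-block:: python

    from mlos_bench.environments.status import Status

    # Hypothetical prior observations (e.g., exported from an earlier Experiment).
    configs = [
        {"flags": "on", "int_param": 1, "float_param": 10.0},
        {"flags": "off", "int_param": 2, "float_param": 20.0},
    ]
    scores = [
        {"throughput": 42, "cost": 19},
        {"throughput": 7, "cost": 17.2},
    ]
    status = [Status.SUCCEEDED, Status.SUCCEEDED]

    # `optimizer` is assumed to have been built already (see the Examples below).
    optimizer.bulk_register(configs=configs, scores=scores, status=status)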

Stopping Conditions
^^^^^^^^^^^^^^^^^^^
Currently the :py:meth:`.Optimizer.not_converged` method only checks that the number
of suggestions is less than the ``max_suggestions`` property of the Optimizer
config.

However, in the future we intend to implement more sophisticated stopping conditions
(e.g., total time, convergence, cost budget, etc.).
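
In other words, the basic optimization loop (driven by the
:py:mod:`mlos_bench.schedulers` in practice) looks roughly like the sketch below,
where ``run_trial`` is a hypothetical stand-in for executing a Trial:

.. code-block:: python

    while optimizer.not_converged():
        tunables = optimizer.suggest()
        # Run the suggested config somehow and obtain a Status and a score dict.
        # (run_trial is hypothetical; the Scheduler/Environment does this for real.)
        (status, score) = run_trial(tunables)
        optimizer.register(tunables, status, score)

    (best_score, best_config) = optimizer.get_best_observation()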

Spaces
++++++

Unlike mlos_core, the :py:mod:`mlos_bench.optimizers` operate on
:py:mod:`~mlos_bench.tunables` instead of :py:class:`ConfigSpace.ConfigurationSpace`
instances, so mlos_bench handles conversions internally (see
:py:mod:`mlos_bench.optimizers.convert_configspace`).
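
A rough sketch of what that conversion looks like is below. The helper name used
here is an assumption for illustration, so check
:py:mod:`mlos_bench.optimizers.convert_configspace` for the actual API.

.. code-block:: python

    # Convert mlos_bench TunableGroups into a ConfigSpace.ConfigurationSpace.
    # (Helper name assumed; see the convert_configspace module for details.)
    from mlos_bench.optimizers.convert_configspace import tunable_groups_to_configspace

    # `tunables` is a TunableGroups instance (e.g., loaded as in the Examples below).
    config_space = tunable_groups_to_configspace(tunables)
    print(config_space)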

Space Adapters
^^^^^^^^^^^^^^

When using the :py:class:`.MlosCoreOptimizer`, you can also specify a
``space_adapter_type`` to use for manipulating the configuration space into
something that may help the Optimizer find better configurations more quickly
(e.g., by automatically doing space reduction).

See the :py:mod:`mlos_core.spaces.adapters` module for more information.

Config
++++++

Typically these tunables are combined from the individual Environments they are
associated with and loaded via JSON config files.

In the Examples used within this module's documentation, we will simply represent
them as JSON strings for explanatory purposes.

Several properties are common to all Optimizers, but some are specific to the
Optimizer being used.
The JSON schemas control what is considered a valid configuration for an Optimizer.
In the case of an :py:class:`.MlosCoreOptimizer`, the valid options can often be
inferred from the constructor arguments of the corresponding class in
:py:mod:`mlos_core.optimizers`.

Similarly, for the SpaceAdapterType, the valid options can be inferred from the
individual :py:mod:`mlos_core.spaces.adapters` class constructors.

Generally speaking, though, the JSON config for an Optimizer will look something
like the following:

.. code-block:: json

    {
      // One of the mlos_bench Optimizer classes from this module.
      "class": "mlos_bench.optimizers.mlos_core_optimizer.MlosCoreOptimizer",

      "description": "MlosCoreOptimizer",

      // Optional configuration properties for the selected Optimizer class.
      "config": {
        // Common properties for all Optimizers:
        "max_suggestions": 1000,
        "optimization_targets": {
          // Your optimization target(s) mapped to their respective
          // optimization goals.
          "throughput": "max",
          "cost": "min",
        },
        "start_with_defaults": true,
        "seed": 42,

        // Now starts a collection of key-value pairs that are specific to
        // the Optimizer class chosen.

        // Override the default optimizer type.
        // Must be one of the mlos_core OptimizerType enum values.
        "optimizer_type": "SMAC", // e.g., "RANDOM", "FLAML", "SMAC"

        // Optionally provide some additional configuration options for the optimizer.
        // Note: these are optimizer-specific and may not be supported by all optimizers.
        // For instance, the following example is only supported by the SMAC optimizer.
        // In general, for MlosCoreOptimizers you can look at the arguments
        // to the corresponding OptimizerType in the mlos_core module.
        "n_random_init": 20,
        "n_random_probability": 0.25, // increased to prioritize exploration

        // In the case of an MlosCoreOptimizer, override the default space
        // adapter type.
        // Must be one of the mlos_core SpaceAdapterType enum values.
        // e.g., LlamaTune is a method for automatically doing space reduction
        // from the original space.
        "space_adapter_type": "LLAMATUNE",
        "space_adapter_config": {
          // Optional space adapter configuration.
          // The JSON schema controls the valid properties here.
          // In general check the constructor arguments of the specified
          // SpaceAdapterType.
          "num_low_dims": 10,
          "max_unique_values_per_param": 20,
        },
      }
    }

However, it can also be as simple as the following, and sane defaults will be
used for the rest.

.. code-block:: json

    {
      "class": "mlos_bench.optimizers.MlosCoreOptimizer"
    }

Or to only override the space adapter type:

.. code-block:: json

    {
      "class": "mlos_bench.optimizers.MlosCoreOptimizer",
      "config": {
        "space_adapter_type": "LLAMATUNE"
      }
    }

Or, to use a different class for suggesting configurations:

.. code-block:: json

    {
      "class": "mlos_bench.optimizers.GridSearchOptimizer"
    }

Notes
-----
The full set of supported properties is specified in the `JSON schemas for optimizers
<https://github.com/microsoft/MLOS/blob/main/mlos_bench/mlos_bench/config/schemas/optimizers/>`_
and can be seen in some of the `test examples in the source tree
<https://github.com/microsoft/MLOS/tree/main/mlos_bench/mlos_bench/tests/config/schemas/optimizers/test-cases/good/>`_.

See Also
--------
:py:mod:`mlos_bench.config` :
    For more information about the mlos_bench configuration system.

Examples
--------
Note: All of the examples in this module are expressed in Python for testing
purposes.

Load tunables from a JSON string.
Note: normally these would be automatically loaded from the
:py:class:`~mlos_bench.environments.base_environment.Environment`'s
``include_tunables`` config parameter.

>>> import json5 as json
>>> from mlos_bench.environments.status import Status
>>> from mlos_bench.services.config_persistence import ConfigPersistenceService
>>> service = ConfigPersistenceService()
>>> json_config = '''
... {
...   "group_1": {
...     "cost": 1,
...     "params": {
...       "flags": {
...         "type": "categorical",
...         "values": ["on", "off", "auto"],
...         "default": "auto",
...       },
...       "int_param": {
...         "type": "int",
...         "range": [1, 100],
...         "default": 10,
...       },
...       "float_param": {
...         "type": "float",
...         "range": [0, 100],
...         "default": 50.0,
...       }
...     }
...   }
... }
... '''
>>> tunables = service.load_tunables(jsons=[json_config])
>>> # Here are the defaults:
>>> tunables.get_param_values()
{'flags': 'auto', 'int_param': 10, 'float_param': 50.0}

Next we'll load an Optimizer from a JSON string.

At a minimum, the JSON config must specify the Optimizer ``class`` to use, i.e.,
one of the classes from this module
(e.g., ``"class": "mlos_bench.optimizers.MlosCoreOptimizer"``).

>>> # All optimizers support the following optional config properties at a
>>> # minimum:
>>> sorted(Optimizer.BASE_SUPPORTED_CONFIG_PROPS)
['max_suggestions', 'optimization_targets', 'seed', 'start_with_defaults']

When using the :py:class:`.MlosCoreOptimizer`, we can also specify some
additional properties, for instance the ``optimizer_type``, which is one of the
mlos_core :py:data:`~mlos_core.optimizers.OptimizerType` enum values:

>>> import mlos_core.optimizers
>>> print([member.name for member in mlos_core.optimizers.OptimizerType])
['RANDOM', 'FLAML', 'SMAC']

These may also include their own configuration options, which can be specified
as additional key-value pairs in the ``config`` section, where each key-value pair
corresponds to an argument of the respective OptimizerType's constructor.
See :py:meth:`mlos_core.optimizers.OptimizerFactory.create` for more details.
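
One illustrative way to see which optimizer-specific arguments are available is to
inspect the corresponding mlos_core optimizer class directly. This sketch assumes
the concrete classes (e.g., ``SmacOptimizer``) are exported by
:py:mod:`mlos_core.optimizers`; the JSON schemas remain the authoritative list of
valid config properties.

.. code-block:: python

    import inspect

    from mlos_core.optimizers import SmacOptimizer

    # Print the constructor signature to see the accepted keyword arguments.
    print(inspect.signature(SmacOptimizer.__init__))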

Other Optimizers may also have their own configuration options.
See each class' documentation for details.

When using :py:class:`.MlosCoreOptimizer`, we can also specify an optional
``space_adapter_type``, which can sometimes help reshape the configuration
space into something more manageable. It should be one of the following
:py:data:`~mlos_core.spaces.adapters.SpaceAdapterType` enum values:

>>> import mlos_core.spaces.adapters
>>> print([member.name for member in mlos_core.spaces.adapters.SpaceAdapterType])
['IDENTITY', 'LLAMATUNE']

These may also include their own configuration options, which can be specified
as additional key-value pairs in the optional ``space_adapter_config`` section,
where each key-value pair corresponds to an argument of the respective
SpaceAdapterType's constructor. See
:py:meth:`mlos_core.spaces.adapters.SpaceAdapterFactory.create` for more details.

Here's an example JSON config for an :py:class:`.MlosCoreOptimizer`.

>>> optimizer_json_config = '''
... {
...   "class": "mlos_bench.optimizers.mlos_core_optimizer.MlosCoreOptimizer",
...   "description": "MlosCoreOptimizer",
...   "config": {
...     "max_suggestions": 1000,
...     "optimization_targets": {
...       "throughput": "max",
...       "cost": "min",
...     },
...     "start_with_defaults": true,
...     "seed": 42,
...     // Override the default optimizer type
...     // Must be one of the mlos_core OptimizerType enum values.
...     "optimizer_type": "SMAC",
...     // Optionally provide some additional configuration options for the optimizer.
...     // Note: these are optimizer-specific and may not be supported by all optimizers.
...     "n_random_init": 25,
...     "n_random_probability": 0.01,
...     // Optionally override the default space adapter type
...     // Must be one of the mlos_core SpaceAdapterType enum values.
...     // LlamaTune is a method for automatically doing space reduction
...     // from the original space.
...     /* Not enabled for this example:
...     "space_adapter_type": "LLAMATUNE",
...     "space_adapter_config": {
...       // Note: these values are probably too low,
...       // but it's just for demonstration.
...       "num_low_dims": 2,
...       "max_unique_values_per_param": 10,
...     },
...     */
...   }
... }
... '''

That config will typically be loaded via the ``--optimizer`` command-line
argument to the :py:mod:`mlos_bench <mlos_bench.run>` CLI.
However, for demonstration purposes, we can load it directly here:

>>> config = json.loads(optimizer_json_config)
>>> optimizer = service.build_optimizer(
...     tunables=tunables,
...     service=service,
...     config=config,
... )

Now the :py:mod:`mlos_bench.schedulers` can use the selected
:py:class:`.Optimizer` to :py:meth:`.Optimizer.suggest` a new config to test in
a Trial and then :py:meth:`.Optimizer.register` the results.

A stripped down example of how this might look in practice is something like
this:

>>> suggested_config_1 = optimizer.suggest()
>>> # Default should be suggested first, per the JSON config ("start_with_defaults": true).
>>> suggested_config_1.get_param_values()
{'flags': 'auto', 'int_param': 10, 'float_param': 50.0}
>>> # Get another suggestion.
>>> # Note that multiple suggestions can be pending prior to
>>> # registering their scores, supporting parallel trial execution.
>>> suggested_config_2 = optimizer.suggest()
>>> suggested_config_2.get_param_values()
{'flags': 'auto', 'int_param': 99, 'float_param': 5.8570134453475}
>>> # Register some scores.
>>> # Note: for maximization targets, scores are negated internally so that the
>>> # optimizer always solves a minimization problem.
>>> optimizer.register(suggested_config_1, Status.SUCCEEDED, {"throughput": 42, "cost": 19})
{'throughput': -42.0, 'cost': 19.0}
>>> optimizer.register(suggested_config_2, Status.SUCCEEDED, {"throughput": 7, "cost": 17.2})
{'throughput': -7.0, 'cost': 17.2}
>>> (best_score, best_config) = optimizer.get_best_observation()
>>> best_score
{'throughput': 42.0, 'cost': 19.0}
>>> assert best_config == suggested_config_1
347"""
349from mlos_bench.optimizers.base_optimizer import Optimizer
350from mlos_bench.optimizers.grid_search_optimizer import GridSearchOptimizer
351from mlos_bench.optimizers.manual_optimizer import ManualOptimizer
352from mlos_bench.optimizers.mlos_core_optimizer import MlosCoreOptimizer
353from mlos_bench.optimizers.mock_optimizer import MockOptimizer
354from mlos_bench.optimizers.one_shot_optimizer import OneShotOptimizer
356__all__ = [
357 "GridSearchOptimizer",
358 "ManualOptimizer",
359 "MlosCoreOptimizer",
360 "MockOptimizer",
361 "OneShotOptimizer",
362 "Optimizer",
363]