Coverage for mlos_core/mlos_core/__init__.py: 100%
2 statements
coverage.py v7.6.7, created at 2024-11-22 01:18 +0000
#
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.
#
"""
mlos_core is a wrapper around other OSS tuning libraries to provide a consistent
interface for autotuning experimentation.

:py:mod:`mlos_core` can be installed from `pypi <https://pypi.org/project/mlos-core>`_
with ``pip install mlos-core`` and provides the main
:py:mod:`Optimizer <mlos_core.optimizers>` portions of the MLOS project for use with
autotuning purposes.
Although it is generally intended to be used with :py:mod:`mlos_bench` to help
automate the generation of ``(config, score)`` pairs to register with the Optimizer,
it can be used independently as well.

To do this it provides a small set of wrapper classes around other OSS tuning
libraries in order to provide a consistent interface so that the rest of the code
using it can easily exchange one optimizer for another (or even stack them).

Specifically:

- :py:class:`~mlos_core.optimizers.optimizer.BaseOptimizer` is the base class for all Optimizers

  Its core methods are:

  - :py:meth:`~mlos_core.optimizers.optimizer.BaseOptimizer.suggest` which returns a
    new configuration to evaluate
  - :py:meth:`~mlos_core.optimizers.optimizer.BaseOptimizer.register` which registers
    a "score" for an evaluated configuration with the Optimizer

  Each operates on Pandas :py:class:`DataFrames <pandas.DataFrame>` as the lingua
  franca for data science.

- :py:meth:`mlos_core.optimizers.OptimizerFactory.create` is a factory function
  that creates a new :py:type:`~mlos_core.optimizers.ConcreteOptimizer` instance

  To do this it uses the :py:class:`~mlos_core.optimizers.OptimizerType` enum to
  specify which underlying optimizer to use (e.g.,
  :py:class:`~mlos_core.optimizers.OptimizerType.FLAML` or
  :py:class:`~mlos_core.optimizers.OptimizerType.SMAC`).

Examples
--------
>>> # Import the necessary classes.
>>> import pandas
>>> from ConfigSpace import ConfigurationSpace, UniformIntegerHyperparameter
>>> from mlos_core.optimizers import OptimizerFactory, OptimizerType
>>> from mlos_core.spaces.adapters import SpaceAdapterFactory, SpaceAdapterType
>>> # Create a simple ConfigurationSpace with a single integer hyperparameter.
>>> cs = ConfigurationSpace(seed=1234)
>>> _ = cs.add(UniformIntegerHyperparameter("x", lower=0, upper=10))
>>> # Create a new optimizer instance using the SMAC optimizer.
>>> opt_args = {"seed": 1234, "max_trials": 100}
>>> space_adapters_kwargs = {}  # no additional args for this example
>>> opt = OptimizerFactory.create(
...     parameter_space=cs,
...     optimization_targets=["y"],
...     optimizer_type=OptimizerType.SMAC,
...     optimizer_kwargs=opt_args,
...     space_adapter_type=SpaceAdapterType.IDENTITY,  # or LLAMATUNE
...     space_adapter_kwargs=space_adapters_kwargs,
... )
>>> # Get a new configuration suggestion.
>>> (config_df, _metadata_df) = opt.suggest()
>>> # Examine the suggested configuration.
>>> assert len(config_df) == 1
>>> config_df.iloc[0]
x    3
Name: 0, dtype: int64
>>> # Register the configuration and its corresponding target value
>>> score = 42  # a made up score
>>> scores_df = pandas.DataFrame({"y": [score]})
>>> opt.register(configs=config_df, scores=scores_df)
>>> # Get a new configuration suggestion.
>>> (config_df, _metadata_df) = opt.suggest()
>>> config_df.iloc[0]
x    10
Name: 0, dtype: int64
>>> score = 7  # a better made up score
>>> # Optimizers minimize by convention, so a lower score is better
>>> # You can use a negative score to maximize values instead
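>>> # (e.g., a hypothetical sketch: to maximize a throughput metric you could
>>> # register pandas.DataFrame({"y": [-throughput]}) as the scores instead)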
>>> #
>>> # Convert it to a DataFrame again
>>> scores_df = pandas.DataFrame({"y": [score]})
>>> opt.register(configs=config_df, scores=scores_df)
>>> # Get the best observations.
>>> (configs_df, scores_df, _contexts_df) = opt.get_best_observations()
>>> # The default is to only return one
>>> assert len(configs_df) == 1
>>> assert len(scores_df) == 1
>>> configs_df.iloc[0]
x    10
Name: 1, dtype: int64
>>> scores_df.iloc[0]
y    7
Name: 1, dtype: int64

Notes
-----
See `mlos_core/README.md
<https://github.com/microsoft/MLOS/tree/main/mlos_core/>`_
for additional documentation and examples in the source tree.
"""
from mlos_core.version import VERSION

__version__ = VERSION
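# A minimal usage sketch (hypothetical invocation, assuming the package is
# installed): the re-exported ``__version__`` attribute can be used to check
# which version of mlos_core is installed, e.g.:
#
#   python -c "import mlos_core; print(mlos_core.__version__)"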

if __name__ == "__main__":
    import doctest

    doctest.testmod()
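
# Note: running this module directly (e.g., ``python mlos_core/mlos_core/__init__.py -v``
# from a source checkout) executes the doctest examples above; the ``-v`` flag
# enables doctest's verbose output.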