PATATUNE

A Framework for Metaheuristic Multi-Objective Optimization for High Energy Physics
Documentation: https://cms-patatrack.github.io/patatune
Source code: https://github.com/cms-patatrack/patatune
PATATUNE is a Python package that provides a framework for multi-objective optimization algorithms, including the Multi-Objective Particle Swarm Optimization (MOPSO) method. Its primary purpose is to automate the optimization of the parameters of user-defined functions. The package has been developed with the needs of CMS and Patatrack in mind.
The key features are:
- Easy to use and learn.
- Pluggable multi-objective optimization model, with Multi-Objective Particle Swarm Optimization implemented via the MOPSO class.
- Support for multiple objectives, defined by any user-provided objective functions.
- Support for different parameter types (int, float, bool).
- Built-in metrics for convergence/quality assessment: Generational Distance, Inverted Generational Distance, Hypervolume (a short illustration of these metrics follows this list).
- Persistence and checkpointing via FileManager (save/load to pickle, CSV, Zarr); supports resuming runs and exporting per-iteration history.
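To make the convergence metrics concrete, here is a minimal, self-contained NumPy sketch of one common definition of Generational Distance and Inverted Generational Distance (Hypervolume is omitted for brevity). It only illustrates what these metrics measure; it is not the patatune API, and the toy fronts below are made up for the example:

```python
import numpy as np

def generational_distance(front, reference):
    """Mean Euclidean distance from each point of `front` to its nearest point in `reference`."""
    front = np.asarray(front, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Pairwise distances (len(front) x len(reference)), then the closest reference point per row.
    dists = np.linalg.norm(front[:, None, :] - reference[None, :, :], axis=-1)
    return dists.min(axis=1).mean()

def inverted_generational_distance(front, reference):
    """IGD is GD with the roles of the two fronts swapped."""
    return generational_distance(reference, front)

# Toy 2-objective fronts, invented for this illustration.
obtained = [[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]]
true_front = [[0.0, 0.9], [0.45, 0.45], [0.9, 0.0]]
print(generational_distance(obtained, true_front))
print(inverted_generational_distance(obtained, true_front))
```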
Installation
PATATUNE is available on PyPI.
To install it, simply run:

```bash
pip install patatune
```
If you want to use the latest development version on the main branch:

- Clone this repository.
- Navigate into the project directory.
- Install the package and its dependencies using pip:

  ```bash
  pip install .
  ```

You can also install the project in "editable" ("develop") mode while you are working on it; when installed as editable, the project can be edited in place without reinstallation:

```bash
pip install -e .
```
Requirements
PATATUNE is written for Python 3.9+ and depends on a small set of scientific Python packages. The required dependencies are declared in pyproject.toml and are installed automatically by pip (see Installation above).

Optional functionality is provided by extras. If you need the optional extras, install with:

```bash
pip install patatune[extra]
```

The examples require additional libraries (matplotlib, pandas). If you want to run them, install with the tests extra, or install every extra with all:

```bash
pip install patatune[tests]
pip install patatune[all]
```
Example
Currently the package provides the patatune module that defines an optimization algorithm: MOPSO.
PATATUNE relies on a few helper classes to handle configuration and the objective functions. To use this module in your Python projects:
- Import the required modules:

  ```python
  import patatune
  ```

- Define the objective functions to be optimized, e.g.:

  ```python
  def f1(x):
      return 4 * x[0]**2 + 4 * x[1]**2

  def f2(x):
      return (x[0] - 5)**2 + (x[1] - 5)**2

  objectives = patatune.ElementWiseObjective([f1, f2])
  ```

- Define the boundaries of the parameters:

  ```python
  lb = [0.0, 0.0]
  ub = [5.0, 3.0]
  ```

- Create the MOPSO object with the configuration of the algorithm:

  ```python
  mopso = patatune.MOPSO(objectives,
                         lower_bounds=lb,
                         upper_bounds=ub,
                         num_particles=50,
                         inertia_weight=0.4,
                         cognitive_coefficient=1.5,
                         social_coefficient=2)
  ```

- Run the optimization algorithm:

  ```python
  pareto = mopso.optimize(num_iterations=100)
  ```
The output is the archive of optimal solutions found by the algorithm after 100 iterations: a Python list of Particle objects (instances of patatune.mopso.particle.Particle).
You can easily extract a compact representation from the returned list. For example:

```python
for p in pareto:
    print("position:", p.position, "fitness:", p.fitness)
```

Example printed output:

```text
id: 0 position: [1.8 1.6] fitness: [ 23.5 21.6]
id: 49 position: [0.0 0.0] fitness: [ 0. 50. ]
id: 18 position: [1.0 1.0] fitness: [ 8.3 31.7]
id: 29 position: [5.0 3.0] fitness: [ 136. 4. ]
id: 16 position: [0.0 0.0] fitness: [ 0. 50. ]
...
```
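Since positions and fitness values are plain numeric arrays, the archive is easy to post-process with the optional example dependencies (matplotlib, pandas). Below is a minimal sketch, assuming the pareto list returned by mopso.optimize above; it saves the front to CSV and plots the two objectives (the file names are arbitrary):

```python
import matplotlib.pyplot as plt
import pandas as pd

# Collect the objective values and parameters of every particle in the archive.
df = pd.DataFrame(
    [{"f1": p.fitness[0], "f2": p.fitness[1],
      "x0": p.position[0], "x1": p.position[1]} for p in pareto]
)
df.to_csv("pareto_front.csv", index=False)

# Scatter plot of the two objectives: the Pareto front found by the run above.
plt.scatter(df["f1"], df["f2"], s=10)
plt.xlabel("f1")
plt.ylabel("f2")
plt.title("Pareto front")
plt.savefig("pareto_front.png")
```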
Contributing
Contributions are welcome. If you want to contribute, please follow the Contribution guidelines.
License
PATATUNE is distributed under the MPL 2.0 License. Feel free to use, modify, and distribute the code following the terms of the license.