evox.algorithms.so.es_variants.asebo¶
Module Contents¶
Classes¶
ASEBO: The implementation of the ASEBO algorithm.
API¶
- class evox.algorithms.so.es_variants.asebo.ASEBO(pop_size: int, center_init: torch.Tensor, optimizer: Literal["adam"] | None = None, lr: float = 0.05, lr_decay: float = 1.0, lr_limit: float = 0.001, sigma: float = 0.03, sigma_decay: float = 1.0, sigma_limit: float = 0.01, subspace_dims: int | None = None, device: torch.device | None = None)[source]¶
Bases:
evox.core.Algorithm
The implementation of the ASEBO algorithm.
Reference: From Complexity to Simplicity: Adaptive ES-Active Subspaces for Blackbox Optimization (https://arxiv.org/abs/1903.04268)
This implementation is inspired by, and draws on, the algorithmic implementation from evosax. More information about evosax is available on GitHub: https://github.com/RobertTLange/evosax
Initialization
Initialize the ASEBO algorithm with the given parameters.
- Parameters:
pop_size – The size of the population.
center_init – The initial center of the population. Must be a 1D tensor.
optimizer – The optimizer to use. Defaults to None. Currently, only “adam” or None is supported.
lr – The learning rate for the optimizer. Defaults to 0.05.
lr_decay – The decay factor for the learning rate. Defaults to 1.0.
lr_limit – The minimum value for the learning rate. Defaults to 0.001.
sigma – The standard deviation of the noise. Defaults to 0.03.
sigma_decay – The decay factor for the standard deviation. Defaults to 1.0.
sigma_limit – The minimum value for the standard deviation. Defaults to 0.01.
subspace_dims – The dimension of the subspace. Defaults to None.
device – The device to use for the tensors. Defaults to None.
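As a standalone sketch of how the decay/limit parameters above typically interact, the following pure-Python snippet applies a multiplicative decay per generation, clamped at the corresponding floor. The variable names mirror the constructor arguments, but this is an illustration of the schedule semantics, not the evox implementation.

```python
# Sketch (not the evox API): per-generation annealing of the learning
# rate and noise scale, each decayed multiplicatively and clamped at
# its configured lower limit.
lr, lr_decay, lr_limit = 0.05, 0.99, 0.001
sigma, sigma_decay, sigma_limit = 0.03, 0.99, 0.01

for generation in range(200):
    # ... sample population, evaluate fitness, update center ...
    lr = max(lr * lr_decay, lr_limit)            # anneal toward lr_limit
    sigma = max(sigma * sigma_decay, sigma_limit)  # anneal toward sigma_limit
```

With the defaults `lr_decay=1.0` and `sigma_decay=1.0`, both quantities stay constant, so decay is opt-in.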
- step()[source]¶
The main step of the ASEBO algorithm.
This function first estimates an active subspace from the history of past gradient estimates, then draws perturbations aligned with that subspace to form a projected gradient estimate. It uses this estimate as the step direction and updates the center and standard deviation of the search distribution.
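The step described above can be sketched in standalone NumPy. This is a simplified illustration of the idea, not the evox code: the toy sphere objective, the fixed-length gradient archive, and the way subspace noise is mixed with a small amount of full-space noise are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Toy objective to minimize: sphere function, summed over the last axis.
    return np.sum(x ** 2, axis=-1)

dim, pop_size, k = 10, 8, 3          # k plays the role of subspace_dims
sigma, lr = 0.03, 0.05
center = rng.normal(size=dim)
f0 = fitness(center)                  # starting fitness, for comparison
grad_archive = []                     # history of past gradient estimates

for generation in range(300):
    if len(grad_archive) >= k:
        # PCA of the gradient history: the top right-singular vectors
        # span the current "active" subspace.
        _, _, vt = np.linalg.svd(np.asarray(grad_archive), full_matrices=False)
        basis = vt[:k]                                    # (k, dim) orthonormal rows
        z = rng.normal(size=(pop_size, k)) @ basis        # subspace-aligned noise
        noise = z + 0.1 * rng.normal(size=(pop_size, dim))  # mix in full-space noise
    else:
        # Not enough history yet: fall back to isotropic sampling.
        noise = rng.normal(size=(pop_size, dim))

    # Antithetic evaluation and the standard ES gradient estimate.
    f_plus = fitness(center + sigma * noise)
    f_minus = fitness(center - sigma * noise)
    grad = (f_plus - f_minus) @ noise / (2 * sigma * pop_size)

    grad_archive.append(grad)
    grad_archive = grad_archive[-2 * k:]  # keep only a short history
    center = center - lr * grad           # descend along the estimate
```

On this toy problem the center's fitness drops well below its starting value; in the actual algorithm the subspace dimension and sampling mixture are adapted rather than fixed.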