helios.optim.utils¶
Attributes¶
- OPTIMIZER_REGISTRY – Global instance of the registry for optimizers.
Functions¶
- create_optimizer – Create the optimizer for the given type.
Module Contents¶
- helios.optim.utils.OPTIMIZER_REGISTRY¶
Global instance of the registry for optimizers.
By default, the registry contains the following optimizers:
| Optimizer | Name |
| --- | --- |
| torch.optim.Adadelta | Adadelta |
| torch.optim.Adagrad | Adagrad |
| torch.optim.Adam | Adam |
| torch.optim.AdamW | AdamW |
| torch.optim.SparseAdam | SparseAdam |
| torch.optim.Adamax | Adamax |
| torch.optim.ASGD | ASGD |
| torch.optim.LBFGS | LBFGS |
| torch.optim.NAdam | NAdam |
| torch.optim.RAdam | RAdam |
| torch.optim.RMSprop | RMSprop |
| torch.optim.Rprop | Rprop |
| torch.optim.SGD | SGD |
Example
```python
import helios.optim as hlo

# This automatically registers your optimizer.
@hlo.OPTIMIZER_REGISTRY.register
class MyOptimizer:
    ...

# Alternatively, you can manually register an optimizer like this:
hlo.OPTIMIZER_REGISTRY.register(MyOptimizer)
```
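Once a custom optimizer is registered, it can be constructed by name through create_optimizer (documented below). The following sketch is illustrative only: it assumes the class is registered under its class name, "MyOptimizer", and that MyOptimizer accepts a parameters iterable plus keyword arguments in the same way as the built-in torch optimizers.

```python
import torch

from helios.optim.utils import create_optimizer

# Hypothetical model whose parameters the optimizer will update.
model = torch.nn.Linear(4, 2)

# Assumes the decorator above registered the class under the name
# "MyOptimizer" and that MyOptimizer(params, lr=...) is a valid call.
optimizer = create_optimizer("MyOptimizer", model.parameters(), lr=1e-3)
```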
- helios.optim.utils.create_optimizer(type_name: str, *args: Any, **kwargs: Any) → torch.optim.Optimizer [source]¶
Create the optimizer for the given type.
- Parameters:
type_name – The name of the optimizer type to create (one of the names registered in OPTIMIZER_REGISTRY).
args – Positional arguments to pass to the optimizer's constructor.
kwargs – Keyword arguments to pass to the optimizer's constructor.
- Returns:
The optimizer.
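A minimal usage sketch, assuming the positional and keyword arguments are forwarded unchanged to the underlying torch.optim constructor (so the parameters iterable comes first, just as it would when calling torch.optim.Adam directly):

```python
import torch

from helios.optim.utils import create_optimizer

# Hypothetical model; any iterable of parameters works here.
model = torch.nn.Linear(10, 1)

# "Adam" is one of the names registered by default (see the table above).
# The remaining arguments are forwarded to torch.optim.Adam.
optimizer = create_optimizer("Adam", model.parameters(), lr=1e-3, weight_decay=1e-4)

# Under the assumptions above, the result is a regular torch optimizer.
assert isinstance(optimizer, torch.optim.Adam)
```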