Additional optimizers that conform to the Keras API.
Classes
class AdaBelief: Variant of the Adam optimizer.
class AdamW: Optimizer that implements the Adam algorithm with weight decay.
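A minimal sketch of dropping AdamW into a Keras training setup (the model and hyperparameter values below are illustrative, not recommendations):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# AdamW applies weight decay decoupled from the gradient-based update.
optimizer = tfa.optimizers.AdamW(learning_rate=1e-3, weight_decay=1e-4)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=optimizer, loss="mse")
```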
class AveragedOptimizerWrapper: Base class for optimizers that maintain an average of the model variables (e.g. MovingAverage, SWA).
class COCOB: Optimizer that implements the COCOB Backprop algorithm.
class ConditionalGradient: Optimizer that implements the Conditional Gradient optimization algorithm.
class CyclicalLearningRate: A LearningRateSchedule that uses a cyclical schedule driven by a user-supplied scaling function.
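A minimal sketch of the generic cyclical schedule; the constant scale_fn and the bounds and step_size shown here are illustrative choices:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# scale_fn=lambda x: 1.0 keeps the cycle amplitude constant, i.e. a plain
# triangular oscillation between the two learning-rate bounds.
lr_schedule = tfa.optimizers.CyclicalLearningRate(
    initial_learning_rate=1e-4,
    maximal_learning_rate=1e-2,
    step_size=2000,            # half-cycle length, in training steps
    scale_fn=lambda x: 1.0,
    scale_mode="cycle",
)
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)
```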
class DecoupledWeightDecayExtension: This class allows optimizers to be extended with decoupled weight decay.
class ExponentialCyclicalLearningRate: A LearningRateSchedule that uses an exponential cyclical schedule.
class LAMB: Optimizer that implements the Layer-wise Adaptive Moments (LAMB) algorithm.
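A minimal sketch of constructing LAMB, which is typically used for large-batch training; the decay rate and the excluded variable-name patterns are illustrative assumptions:

```python
import tensorflow_addons as tfa

# Weight decay is commonly skipped for bias and normalization parameters.
optimizer = tfa.optimizers.LAMB(
    learning_rate=1e-3,
    weight_decay_rate=0.01,
    exclude_from_weight_decay=["bias", "LayerNorm"],
)
```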
class LazyAdam: Variant of the Adam optimizer that handles sparse updates more efficiently.
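A minimal sketch of a setting where LazyAdam tends to pay off: a model dominated by a large embedding table that receives sparse gradient updates (the model and sizes are illustrative):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# LazyAdam only updates the moment-estimate slices that actually receive
# gradients, which is cheaper for large, sparsely updated embeddings.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=100_000, output_dim=64),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=tfa.optimizers.LazyAdam(learning_rate=1e-3),
              loss="binary_crossentropy")
```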
class Lookahead: This class allows optimizers to be extended with the lookahead mechanism.
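A minimal sketch wrapping an inner optimizer with Lookahead; pairing it with RectifiedAdam gives the combination often called "Ranger" (the sync_period and slow_step_size values shown are the usual defaults):

```python
import tensorflow_addons as tfa

# The wrapper keeps a slow copy of the weights and, every sync_period steps,
# interpolates the fast weights toward it.
inner = tfa.optimizers.RectifiedAdam(learning_rate=1e-3)
optimizer = tfa.optimizers.Lookahead(inner, sync_period=6, slow_step_size=0.5)
```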
class MovingAverage: Optimizer that computes a moving average of the variables.
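A minimal sketch, assuming the usual pattern of training with the wrapper and then copying the averaged weights back into the model before evaluation or export:

```python
import tensorflow as tf
import tensorflow_addons as tfa

optimizer = tfa.optimizers.MovingAverage(
    tf.keras.optimizers.SGD(learning_rate=0.01), average_decay=0.99)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=optimizer, loss="mse")
# ... fit the model ...
# Overwrite the model variables with their moving averages before saving.
optimizer.assign_average_vars(model.variables)
```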
class MultiOptimizer: Multi-optimizer wrapper for discriminative layer training.
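A minimal sketch of discriminative layer training: each (optimizer, layers) pair gets its own learning rate. The model, split points, and learning rates are illustrative:

```python
import tensorflow as tf
import tensorflow_addons as tfa

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Lower learning rate for the earlier layers, higher for the head.
optimizers_and_layers = [
    (tf.keras.optimizers.Adam(learning_rate=1e-4), model.layers[:2]),
    (tf.keras.optimizers.Adam(learning_rate=1e-2), model.layers[2]),
]
model.compile(optimizer=tfa.optimizers.MultiOptimizer(optimizers_and_layers),
              loss="mse")
```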
class NovoGrad: Optimizer that implements the NovoGrad algorithm.
class ProximalAdagrad: Optimizer that implements the Proximal Adagrad algorithm.
class RectifiedAdam: Variant of the Adam optimizer whose adaptive learning rate is rectified so as to have a consistent variance.
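A minimal sketch that also enables the optional built-in warmup; total_steps, warmup_proportion, and min_lr are illustrative values:

```python
import tensorflow_addons as tfa

# With total_steps > 0, the learning rate warms up over the first
# warmup_proportion of steps and then decays toward min_lr.
optimizer = tfa.optimizers.RectifiedAdam(
    learning_rate=1e-3,
    total_steps=10_000,
    warmup_proportion=0.1,
    min_lr=1e-5,
)
```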
class SGDW: Optimizer that implements the momentum algorithm with decoupled weight decay.
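A minimal sketch of momentum SGD with decoupled weight decay (the values are illustrative):

```python
import tensorflow_addons as tfa

optimizer = tfa.optimizers.SGDW(
    learning_rate=0.01, weight_decay=1e-4, momentum=0.9, nesterov=True)
```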
class SWA: This class extends optimizers with Stochastic Weight Averaging (SWA).
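A minimal sketch, assuming averaging starts partway through training; start_averaging and average_period are illustrative step counts:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Weight snapshots are averaged every `average_period` steps once step
# `start_averaging` has been reached.
optimizer = tfa.optimizers.SWA(
    tf.keras.optimizers.SGD(learning_rate=0.01),
    start_averaging=1000,
    average_period=10,
)
# As with MovingAverage, call optimizer.assign_average_vars(model.variables)
# after training to load the averaged weights into the model.
```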
class Triangular2CyclicalLearningRate: A LearningRateSchedule that uses a triangular2 cyclical schedule, halving the cycle amplitude after each cycle.
class TriangularCyclicalLearningRate: A LearningRateSchedule that uses a triangular cyclical schedule.
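A minimal sketch of the triangular policy attached to a Keras optimizer; the bounds and step_size are illustrative:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# The learning rate ramps linearly between the two bounds; step_size is the
# half-cycle length in training steps.
lr_schedule = tfa.optimizers.TriangularCyclicalLearningRate(
    initial_learning_rate=1e-4,
    maximal_learning_rate=1e-2,
    step_size=2000,
)
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)
```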
class Yogi: Optimizer that implements the Yogi algorithm in Keras.
Functions
extend_with_decoupled_weight_decay(...): Factory function returning an optimizer class with decoupled weight decay.
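A minimal sketch of the factory pattern, here applied to RMSprop as an arbitrary example; the resulting class accepts a weight_decay argument in addition to the base optimizer's arguments:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Build an RMSprop variant with decoupled weight decay.
RMSpropW = tfa.optimizers.extend_with_decoupled_weight_decay(
    tf.keras.optimizers.RMSprop)
optimizer = RMSpropW(weight_decay=1e-4, learning_rate=1e-3)
```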