  1. Optimizers - Keras

    Keras documentation: Optimizers. Abstract optimizer base class. If you intend to create your own optimization algorithm, please inherit from this class and override the following methods: build: …
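The snippet above describes subclassing the optimizer base class and overriding hooks such as `build`. A rough, framework-free sketch of that pattern — the class names and signatures here are illustrative stand-ins, not the real Keras API:

```python
class ToyOptimizer:
    """Illustrative base class: subclasses override build() / update_step().
    This loosely mimics the documented pattern; it is NOT the Keras API."""

    def __init__(self, learning_rate=0.01):
        self.learning_rate = learning_rate
        self.built = False

    def build(self, variables):
        # Hook for creating per-variable state (momentum slots, etc.).
        self.built = True

    def update_step(self, gradient, variable):
        raise NotImplementedError

    def apply(self, grads_and_vars):
        if not self.built:
            self.build([v for _, v in grads_and_vars])
        for grad, var in grads_and_vars:
            self.update_step(grad, var)


class ToySGD(ToyOptimizer):
    def update_step(self, gradient, variable):
        # Variables are one-element lists so updates are visible in place.
        variable[0] -= self.learning_rate * gradient


w = [1.0]
opt = ToySGD(learning_rate=0.1)
opt.apply([(0.5, w)])
print(w[0])  # 0.95
```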

  2. Optimizers - Keras

    Apply gradients to variables. Arguments grads_and_vars: List of (gradient, variable) pairs. name: string, defaults to None. The name of the namescope to use when creating variables. If None, self.name will …
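The snippet above describes `apply_gradients` consuming a list of `(gradient, variable)` pairs. A hypothetical pure-Python stand-in showing just the pairing and the descent step — not the real Keras implementation:

```python
def apply_gradient_pairs(grads_and_vars, learning_rate=0.01):
    """Toy analogue of apply_gradients: one descent step per (grad, var) pair."""
    return [var - learning_rate * grad for grad, var in grads_and_vars]

grads = [0.2, -0.4]
variables = [1.0, 2.0]
updated = apply_gradient_pairs(list(zip(grads, variables)), learning_rate=0.1)
print(updated)  # [0.98, 2.04]
```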

  3. Adam - Keras

    Optimizer that implements the Adam algorithm. Adam optimization is a stochastic gradient descent method that is based on adaptive estimation of first-order and second-order moments. According to …
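The "adaptive estimation of first-order and second-order moments" can be sketched for a scalar parameter in a few lines of plain Python (the real optimizer is `keras.optimizers.Adam`; the defaults below are illustrative):

```python
import math

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-7):
    """One Adam update on a scalar parameter."""
    m = beta1 * m + (1 - beta1) * grad         # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)               # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

p, m, v = 0.5, 0.0, 0.0
p, m, v = adam_step(p, grad=0.1, m=m, v=v, t=1, lr=0.01)
print(round(p, 4))  # 0.49
```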

  4. Muon - Keras

    Arguments learning_rate: A float, a keras.optimizers.schedules.LearningRateSchedule instance, or a callable that takes no arguments and returns the actual value to use. The learning rate. Defaults to …
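As the snippet notes, `learning_rate` may also be a zero-argument callable that returns the value to use. A toy illustration of that form (the halving policy and names are invented for the example):

```python
state = {"step": 0}

def current_lr():
    # Zero-argument callable: halve the base rate every 100 steps (illustrative).
    return 0.1 * (0.5 ** (state["step"] // 100))

print(current_lr())  # 0.1
state["step"] = 250
print(current_lr())  # 0.025
```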

  5. SGD - Keras

    Arguments learning_rate: A float, a keras.optimizers.schedules.LearningRateSchedule instance, or a callable that takes no arguments and returns the actual value to use. The learning rate. Defaults to …
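A bare-bones sketch of the SGD-with-momentum update rule (`keras.optimizers.SGD` implements this plus options such as Nesterov momentum; the version below is the plain rule):

```python
def sgd_momentum_step(param, grad, velocity, lr=0.01, momentum=0.9):
    """velocity <- momentum * velocity - lr * grad;  param <- param + velocity."""
    velocity = momentum * velocity - lr * grad
    return param + velocity, velocity

p, vel = 1.0, 0.0
for _ in range(2):
    p, vel = sgd_momentum_step(p, grad=1.0, velocity=vel, lr=0.1)
print(round(p, 4))  # 0.71
```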

  6. LearningRateSchedule - Keras

    A LearningRateSchedule instance can be passed in as the learning_rate argument of any optimizer. To implement your own schedule object, you should implement the __call__ method, which takes a step …
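The interface described above — an object whose `__call__` maps the current step to a learning rate — can be mimicked in plain Python. `StepDecay` and its parameters are invented for illustration:

```python
class StepDecay:
    """Toy schedule object: __call__(step) returns the learning rate."""

    def __init__(self, initial_lr=0.1, drop=0.5, every=1000):
        self.initial_lr = initial_lr
        self.drop = drop
        self.every = every

    def __call__(self, step):
        # Drop the rate by a fixed factor every `every` steps.
        return self.initial_lr * (self.drop ** (step // self.every))

schedule = StepDecay()
print(schedule(0))     # 0.1
print(schedule(2500))  # 0.025
```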

  7. Learning rate schedules API - Keras

    Keras documentation: Learning rate schedules API — LearningRateSchedule, ExponentialDecay, PiecewiseConstantDecay, PolynomialDecay, InverseTimeDecay, CosineDecay …
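Taking ExponentialDecay as one example from that list: its continuous-decay formula is `initial_lr * decay_rate ** (step / decay_steps)`. A standalone sketch of just that formula (defaults chosen for illustration):

```python
def exponential_decay(step, initial_lr=0.1, decay_steps=1000, decay_rate=0.96):
    # Continuous (non-staircase) exponential decay of the learning rate.
    return initial_lr * decay_rate ** (step / decay_steps)

print(exponential_decay(0))  # 0.1
print(exponential_decay(1000))
```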

  8. Adamax - Keras

    Optimizer that implements the Adamax algorithm. Adamax, a variant of Adam based on the infinity norm, is a first-order gradient-based optimization method. Due to its capability of adjusting the learning rate …
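The "infinity norm" variant replaces Adam's second-moment estimate with an exponentially weighted running max of gradient magnitudes. A scalar sketch (defaults illustrative, not the Keras implementation):

```python
def adamax_step(param, grad, m, u, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-7):
    """Adamax: exponentially weighted infinity norm u replaces Adam's v."""
    m = beta1 * m + (1 - beta1) * grad
    u = max(beta2 * u, abs(grad))  # infinity-norm update: no bias correction needed
    param = param - (lr / (1 - beta1 ** t)) * m / (u + eps)
    return param, m, u

p, m, u = 0.5, 0.0, 0.0
p, m, u = adamax_step(p, grad=0.1, m=m, u=u, t=1, lr=0.01)
print(round(p, 4))  # 0.49
```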

  9. AdamW - Keras

    Optimizer that implements the AdamW algorithm. AdamW optimization is a stochastic gradient descent method that is based on adaptive estimation of first-order and second-order moments with an added …
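The "added" ingredient in AdamW is decoupled weight decay: the decay is applied directly to the parameter rather than folded into the gradient as L2 regularization. A scalar sketch (defaults and update ordering here are illustrative):

```python
import math

def adamw_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999,
               eps=1e-7, weight_decay=0.004):
    """Adam moment updates plus decoupled weight decay on the parameter."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)  # Adam step
    param = param - lr * weight_decay * param              # decoupled decay
    return param, m, v

p, m, v = 0.5, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.1, m=m, v=v, t=1, lr=0.01)
print(round(p, 5))  # 0.48998
```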

  10. Training & evaluation with the built-in methods - Keras

    Mar 1, 2019 · Many built-in optimizers, losses, and metrics are available In general, you won't have to create your own losses, metrics, or optimizers from scratch, because what you need is likely to be …