# Adam Optimizer
Configuration of the gradient-based Adam optimizer.
## Stopping Criterion
Number of Iterations: Number of steps taken by the gradient-descent optimizer. The optimization stops after performing the number of iterations set here, as sketched below.
Type: integer
- Unit: unitless
- Constraint: Greater than 0
- Default: 10
- Required field
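As an illustration of how a fixed iteration count acts as the sole stopping rule, here is a minimal Python sketch; the toy objective and all variable names are hypothetical, not part of this configuration schema.

```python
num_iterations = 10  # "Number of Iterations" field; integer, must be > 0

x = 5.0  # hypothetical scalar design parameter
for _ in range(num_iterations):
    grad = 2.0 * x       # gradient of a toy objective f(x) = x**2
    x = x - 0.01 * grad  # one gradient-descent step
# The loop exits unconditionally after exactly num_iterations steps.
```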
## Specification
Optimization Direction: If Maximize, the optimizer maximizes the objective function; if Minimize, it minimizes the objective function.
Type: string
- Constraint: One of Maximize, Minimize
- Default: Maximize
Learning Rate: Step size for the gradient descent optimizer.
Type: floating-point number
- Unit: unitless
- Constraint: Greater than 0
- Default: 0.01
- Required field
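A maximization run can be viewed as gradient ascent, i.e. descent on the negated objective, with Learning Rate scaling each step. The sketch below illustrates this interpretation on a toy objective; it is an assumption about how the direction setting behaves, not the actual implementation.

```python
learning_rate = 0.01  # "Learning Rate" field; float, must be > 0
maximize = True       # "Optimization Direction": Maximize vs. Minimize

x = 0.0  # hypothetical scalar design parameter
for _ in range(100):
    grad = -2.0 * (x - 1.0)           # gradient of toy objective f(x) = -(x - 1)**2
    sign = 1.0 if maximize else -1.0  # ascend for Maximize, descend for Minimize
    x = x + sign * learning_rate * grad
print(x)  # approaches 1.0, the maximizer of the toy objective
```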
## Advanced
Beta1: Beta 1 hyperparameter in the Adam optimization method.
Type: floating-point number
- Unit: unitless
- Default: 0.9
Beta2: Beta 2 hyperparameter in the Adam optimization method.
Type: floating-point number
- Unit: unitless
- Default: 0.999
Eps: Epsilon hyperparameter in the Adam optimization method.
Type: floating-point number
- Unit: unitless
- Default: 1e-8
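The three fields above are the standard hyperparameters of the Adam update rule (Kingma & Ba, 2015): Beta1 and Beta2 are the exponential decay rates of the first- and second-moment gradient estimates, and Eps guards the division in the parameter update. A minimal NumPy sketch of one such optimization loop, with all variable names illustrative:

```python
import numpy as np

beta1, beta2, eps = 0.9, 0.999, 1e-8  # the "Beta1", "Beta2", and "Eps" fields
learning_rate = 0.01

params = np.array([5.0, -3.0])  # hypothetical design parameters
m = np.zeros_like(params)       # first-moment (mean) estimate
v = np.zeros_like(params)       # second-moment (uncentered variance) estimate

for t in range(1, 11):          # t starts at 1 for the bias corrections
    grad = 2.0 * params         # gradient of toy objective f(p) = sum(p**2)
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad**2
    m_hat = m / (1.0 - beta1**t)  # bias-corrected first moment
    v_hat = v / (1.0 - beta2**t)  # bias-corrected second moment
    # eps keeps the denominator nonzero when the second-moment estimate is tiny
    params = params - learning_rate * m_hat / (np.sqrt(v_hat) + eps)
```

The defaults (0.9, 0.999, 1e-8) match the values recommended in the original Adam paper.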
Store Full Results: If True, stores the full per-iteration history of the vector fields, specifically the gradient, params, and optimizer state. For large design regions and many iterations, storing the full history of these fields can lead to large file sizes and high memory usage. In such cases, we recommend setting this field to False, which stores only the last computed state of these variables, as illustrated in the sketch below.
Type: boolean
- Default: True
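To make the trade-off concrete, the sketch below contrasts the two storage modes; it is an illustration of the behavior described above, not the actual storage code.

```python
store_full_results = True  # the "Store Full Results" field

history = []       # grows with every iteration when storing full results
last_state = None  # constant-size record otherwise

for t in range(10):
    # Placeholder values stand in for the real per-iteration vector fields.
    record = {"gradient": [0.0], "params": [0.0], "optimizer_state": {}}
    if store_full_results:
        history.append(record)  # memory and file size grow linearly with iterations
    else:
        last_state = record     # only the most recent state is kept
```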