StochasticOptimizer type
Defines values for StochasticOptimizer.
KnownStochasticOptimizer can be used interchangeably with StochasticOptimizer;
this enum contains the known values that the service supports.
Known values supported by the service
None: No optimizer selected.
Sgd: Stochastic Gradient Descent optimizer.
Adam: Adam is an algorithm that optimizes stochastic objective functions based on adaptive estimates of moments.
Adamw: AdamW is a variant of Adam with an improved implementation of weight decay.
type StochasticOptimizer = string
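A minimal sketch of how an extensible string enum of this kind is typically consumed. The KnownStochasticOptimizer enum below is assumed to mirror the known values listed above, and the TrainingSettings interface is hypothetical, used only to illustrate that enum members and raw strings are interchangeable.

```typescript
// Assumed companion enum mirroring the known values documented above.
export enum KnownStochasticOptimizer {
  None = "None",
  Sgd = "Sgd",
  Adam = "Adam",
  Adamw = "Adamw",
}

// StochasticOptimizer is a plain string alias, so any string is accepted.
export type StochasticOptimizer = string;

// Hypothetical settings shape, for illustration only.
interface TrainingSettings {
  optimizer: StochasticOptimizer;
}

// Using a known value gives editor completion and avoids typos.
const viaKnownValue: TrainingSettings = {
  optimizer: KnownStochasticOptimizer.Adamw,
};

// A raw string also type-checks, which keeps the type forward-compatible
// with values the service may add later.
const viaRawString: TrainingSettings = {
  optimizer: "Adam",
};
```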