The hyperparameters for training image classifiers.

Inherits From: BaseHParams

learning_rate: Learning rate to use for gradient descent training.
batch_size: Batch size for training.
epochs: Number of training iterations over the dataset.
do_fine_tuning: If true, the base module is trained together with the classification layer on top.
l1_regularizer: A regularizer that applies an L1 regularization penalty.
l2_regularizer: A regularizer that applies an L2 regularization penalty.
label_smoothing: Amount of label smoothing to apply. See tf.keras.losses for more details.
do_data_augmentation: A boolean controlling whether the training dataset is augmented by randomly distorting input images (random cropping, flipping, etc.). See the utils.image_preprocessing documentation for details.
decay_samples: Number of training samples used to calculate the decay steps and create the training optimizer.
warmup_steps: Number of warmup steps for a linearly increasing warmup schedule on the learning rate. Used to set up the warmup schedule by model_util.WarmUp.
steps_per_epoch: Dataclass field.
class_weights: Dataclass field.
shuffle: Dataclass field.
export_dir: Dataclass field.
distribution_strategy: Dataclass field.
num_gpus: Dataclass field.
tpu: Dataclass field.
warmup_epochs: Dataclass field.
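
The attribute list above can be sketched as a plain Python dataclass. This is a hypothetical stand-in for illustration only, not the library's actual BaseHParams/HParams definitions; field types are inferred from the defaults listed further down on this page:

```python
import tempfile
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class HParams:
    """Hypothetical mirror of the documented hyperparameter fields."""

    learning_rate: float = 0.001
    batch_size: int = 2
    epochs: int = 10
    do_fine_tuning: bool = False
    l1_regularizer: float = 0.0
    l2_regularizer: float = 0.0001
    label_smoothing: float = 0.1
    do_data_augmentation: bool = True
    decay_samples: int = 2_560_000
    steps_per_epoch: Optional[int] = None
    class_weights: Optional[Dict[int, float]] = None
    shuffle: bool = False
    # The docs page shows a temp directory as the default, so a fresh temp
    # dir is created per instance here (an assumption about the intent).
    export_dir: str = field(default_factory=tempfile.mkdtemp)
    distribution_strategy: str = "off"
    num_gpus: int = 0
    tpu: str = ""
    warmup_epochs: int = 2


# Usage: override only the fields you want to change.
hparams = HParams(learning_rate=0.01, epochs=5, do_fine_tuning=True)
```

Because it is a dataclass, unspecified fields keep their defaults, so `hparams.batch_size` above is still 2.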





Default values:

batch_size: 2
class_weights: None
decay_samples: 2560000
distribution_strategy: 'off'
do_data_augmentation: True
do_fine_tuning: False
epochs: 10
export_dir: '/tmpfs/tmp/tmpt8l382tf'
l1_regularizer: 0.0
l2_regularizer: 0.0001
label_smoothing: 0.1
learning_rate: 0.001
num_gpus: 0
shuffle: False
steps_per_epoch: None
tpu: ''
warmup_epochs: 2
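
As a worked illustration of how decay_samples and warmup_epochs might translate into optimizer step counts, the arithmetic below divides sample counts by the batch size. Whether the real library computes its schedule exactly this way is an assumption, and steps_per_epoch here is a made-up dataset size:

```python
# Hypothetical conversion of sample counts to optimizer step counts.
batch_size = 2
decay_samples = 2_560_000
warmup_epochs = 2
steps_per_epoch = 1_000  # hypothetical: dataset size // batch_size

# One optimizer step consumes one batch, so samples // batch_size gives steps.
decay_steps = decay_samples // batch_size        # 1_280_000
warmup_steps = warmup_epochs * steps_per_epoch   # 2_000
print(decay_steps, warmup_steps)
```

With the documented defaults, the learning rate would thus warm up over the first 2,000 steps and decay over 1,280,000 steps under these assumptions.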