meerqat.train.optim module#
Loss functions, optimizers, and schedulers.
- class meerqat.train.optim.LinearLRWithWarmup(*args, warmup_steps, total_steps, **kwargs)[source]#
Bases:
LambdaLR
Linear learning rate scheduler with linear warmup. Adapted from huggingface/transformers
- Parameters:
*args (additional positional arguments are passed to LambdaLR) –
**kwargs (additional keyword arguments are passed to LambdaLR) –
warmup_steps (int) –
total_steps (int) –
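Since the docstring says the scheduler is adapted from huggingface/transformers, its multiplicative factor presumably matches the linear-warmup-then-linear-decay shape of transformers' linear schedule: the factor ramps from 0 to 1 over warmup_steps, then decays linearly to 0 at total_steps. A minimal sketch of that factor as a pure function (the name `linear_lr_with_warmup_factor` is illustrative, not part of the module's API):

```python
def linear_lr_with_warmup_factor(step: int, warmup_steps: int, total_steps: int) -> float:
    """Multiplicative LR factor at a given step.

    Ramps linearly from 0 to 1 over the first warmup_steps,
    then decays linearly back to 0 at total_steps.
    """
    if step < warmup_steps:
        # warmup phase: factor grows linearly toward 1.0
        return step / max(1, warmup_steps)
    # decay phase: factor shrinks linearly toward 0.0, clamped at 0
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

Passing such a function as the `lr_lambda` of `torch.optim.lr_scheduler.LambdaLR` (via `functools.partial` to bind warmup_steps and total_steps) would reproduce the described behavior.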