Training
training
High-level orchestrator for model training.
Warning
This module is incomplete; it currently contains only annotations for future use.
Attributes:
Name | Type | Description |
---|---|---|
Epoch | TypeAlias | The number of times the model has seen the entire training dataset. |
TrainingBatchSize | TypeAlias | Number of training examples (audio chunks) processed before a weight update. |
GradientAccumulationSteps | TypeAlias | Number of batches to process before performing a weight update. |
OptimizerName | TypeAlias | Algorithm used to update the model weights to minimize the loss function. |
LearningRateSchedulerName | TypeAlias | Algorithm used to adjust the learning rate during training. |
UseAutomaticMixedPrecision | TypeAlias | Whether to use automatic mixed precision (AMP) during training. |
UseLoRA | TypeAlias | Whether to use Low-Rank Adaptation (LoRA) for efficient fine-tuning. |
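Since the module currently carries only annotations, the aliases might be declared roughly as in the sketch below; the underlying types and literal values are assumptions, not taken from the source.

```python
from typing import Literal, TypeAlias

# Sketch of the annotations this module might declare.
# Concrete types and allowed literal values are assumptions.
Epoch: TypeAlias = int
TrainingBatchSize: TypeAlias = int
GradientAccumulationSteps: TypeAlias = int
OptimizerName: TypeAlias = Literal["adamw", "adam", "sgd"]
LearningRateSchedulerName: TypeAlias = Literal["reduce_lr_on_plateau", "cosine"]
UseAutomaticMixedPrecision: TypeAlias = bool
UseLoRA: TypeAlias = bool
```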
Epoch
module-attribute
The number of times the model has seen the entire training dataset.
TrainingBatchSize
module-attribute
Number of training examples (audio chunks) processed before a weight update.
Larger batches may offer more stable gradients but require more memory.
GradientAccumulationSteps
module-attribute
Number of batches to process before performing a weight update.
This simulates a larger batch size without increasing memory, e.g., a batch size of 4 with 8 accumulation steps has an effective batch size of 32.
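A minimal PyTorch-style sketch of gradient accumulation; the model, data, and step counts are illustrative stand-ins, not part of this module.

```python
import torch
from torch import nn

# Illustrative values; batch size 4 with 8 accumulation steps -> effective batch of 32.
batch_size = 4
accumulation_steps = 8
model = nn.Linear(16, 1)                 # stand-in for the real model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

optimizer.zero_grad()
for step in range(32):                   # stand-in for iterating a DataLoader
    x = torch.randn(batch_size, 16)
    y = torch.randn(batch_size, 1)
    loss = loss_fn(model(x), y)
    # Scale so accumulated gradients average over the effective batch.
    (loss / accumulation_steps).backward()
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()                 # weight update every `accumulation_steps` batches
        optimizer.zero_grad()
```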
OptimizerName
module-attribute
Algorithm used to update the model weights to minimize the loss function.
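As a sketch, an optimizer name could be resolved to a concrete PyTorch optimizer roughly like this; the mapping and supported names are assumptions, since the module does not define them yet.

```python
import torch
from torch import nn

def build_optimizer(name: str, model: nn.Module, lr: float = 1e-3) -> torch.optim.Optimizer:
    # Hypothetical name-to-class mapping for illustration only.
    optimizers = {
        "adamw": torch.optim.AdamW,
        "adam": torch.optim.Adam,
        "sgd": torch.optim.SGD,
    }
    return optimizers[name](model.parameters(), lr=lr)

opt = build_optimizer("adamw", nn.Linear(16, 1))
```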
LearningRateSchedulerName
module-attribute
Algorithm used to adjust the learning rate during training.
For example, ReduceLROnPlateau can reduce the learning rate when a metric stops improving.
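A minimal sketch of wiring up ReduceLROnPlateau in PyTorch; the model and the validation-loss values are placeholders.

```python
import torch
from torch import nn

model = nn.Linear(16, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
# Halve the learning rate if the monitored metric has not improved for 2 epochs.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=2
)

for epoch in range(10):
    val_loss = 1.0 / (epoch + 1)   # placeholder for a real validation loss
    scheduler.step(val_loss)       # adjust the learning rate based on the metric
```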
UseAutomaticMixedPrecision
module-attribute
Whether to use automatic mixed precision (AMP) during training.
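A minimal AMP sketch using autocast and a gradient scaler; the model and data are stand-ins, and AMP is only enabled when CUDA is available.

```python
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"
model = nn.Linear(16, 1).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

x = torch.randn(4, 16, device=device)
y = torch.randn(4, 1, device=device)

optimizer.zero_grad()
# Run the forward pass in mixed precision where supported.
with torch.autocast(device_type=device, enabled=use_amp):
    loss = loss_fn(model(x), y)
scaler.scale(loss).backward()   # scale the loss to avoid fp16 gradient underflow
scaler.step(optimizer)
scaler.update()
```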
UseLoRA
module-attribute
Whether to use Low-Rank Adaptation for efficient fine-tuning.
This freezes pre-trained weights and injects smaller, trainable low-rank matrices, dramatically reducing the number of trainable parameters.
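A minimal sketch of the LoRA idea applied to a single linear layer; the rank and scaling are illustrative, and a real setup would typically use a library such as peft rather than this hand-rolled wrapper.

```python
import torch
from torch import nn

class LoRALinear(nn.Module):
    """Wrap a frozen linear layer with trainable low-rank matrices A and B."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)        # freeze pre-trained weights
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus the low-rank update (B @ A) applied to the input.
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scaling

layer = LoRALinear(nn.Linear(256, 256), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)  # only the low-rank matrices (A and B) are trainable
```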