LoraConfig¶
- class verta.finetune.LoraConfig(alpha: int = 32, dropout: float = 0.0, r: int = 8)¶
LoRA fine-tuning configuration, for use with
RegisteredModelVersion.finetune().
- Parameters:
alpha (positive int, default 32) – Scaling factor for update matrices (in standard LoRA, the updates are scaled by alpha / r).
dropout (float between 0.0 and 1.0 inclusive, default 0.0) – Dropout probability for LoRA layers.
r (positive int, default 8) – Rank of update matrices.
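A minimal sketch of constructing a LoraConfig with non-default values. The values here are illustrative, not recommendations, and the exact way the config is passed to RegisteredModelVersion.finetune() is assumed rather than shown in this section:

```python
from verta.finetune import LoraConfig

# Illustrative configuration: higher rank and mild dropout
# (defaults are alpha=32, dropout=0.0, r=8).
config = LoraConfig(
    alpha=32,      # scaling factor for update matrices
    dropout=0.05,  # dropout probability for LoRA layers
    r=16,          # rank of update matrices
)

# Supply the config when launching fine-tuning; `model_version` is
# assumed to be an existing RegisteredModelVersion:
# model_version.finetune(..., config)
```

Raising r increases the capacity (and parameter count) of the low-rank update matrices, while alpha controls how strongly those updates influence the base model's weights.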