📄️ Llama Trainer
The Llama Trainer allows you to create and manage training jobs using Transformer Lab. This plugin currently supports CUDA environments and offers flexibility in setting up custom training tasks.
📄️ GRPO Trainer
The GRPO Trainer plugin allows you to create and manage GRPO training jobs using Transformer Lab. Installation steps are the same as for other training plugins.
📄️ Pre-Training
The Nanotron Pre-training Framework plugin allows you to pre-train models on a single-GPU or multi-GPU setup using Transformer Lab. It uses Nanotron for pre-training. After training, the model will be available in the Foundation tab for further preference training or chatting.
📄️ Diffusion Trainer
The Diffusion Trainer allows you to create and manage LoRA training jobs for diffusion models using Transformer Lab. This plugin enables training custom adaptors that can be used with Text-to-Image, Image-to-Image, and Inpainting workflows. The trainer supports CUDA environments and offers flexibility in setting up custom diffusion training tasks.