Flux.1 Lite 8B Alpha
Introduction
Flux.1 Lite is an 8B-parameter transformer model distilled by Freepik from FLUX.1-dev. It is designed for text-to-image generation and uses 7 GB less RAM while running 23% faster than the original model.
Architecture
Flux.1 Lite maintains the same precision (bfloat16) as the original FLUX.1-dev model. The model features a distilled architecture that reduces resource usage without compromising on image generation quality.
Training
The model's development involved analyzing the mean squared error (MSE) between each transformer block's input and output. This analysis revealed that skipping certain early or late blocks noticeably degrades performance, while skipping blocks in between does not significantly affect image quality. These insights guided the distillation process of the model.
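As an illustration of this kind of analysis, the sketch below computes the MSE between each block's input and output in a toy residual network and ranks blocks by how much they change the signal. The ToyBlock class, dimensions, and random inputs are placeholders introduced here for illustration only; the actual analysis would hook FLUX.1-dev's transformer blocks while denoising real prompts.

# Illustrative sketch (not Freepik's analysis code): rank blocks by how much
# each one changes its input. Blocks with very low input-to-output MSE alter
# the signal little and are candidates for skipping during distillation.
import torch
import torch.nn as nn

class ToyBlock(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        # Residual update, loosely mimicking a transformer block.
        return x + 0.1 * torch.tanh(self.proj(x))

dim, n_blocks = 64, 12
blocks = nn.ModuleList(ToyBlock(dim) for _ in range(n_blocks))

x = torch.randn(4, 16, dim)  # (batch, tokens, channels) stand-in for latent tokens
mse_per_block = []
with torch.inference_mode():
    for block in blocks:
        y = block(x)
        mse_per_block.append(torch.mean((y - x) ** 2).item())
        x = y

# Blocks whose output barely differs from their input contribute the least.
ranked = sorted(range(n_blocks), key=lambda i: mse_per_block[i])
print("Blocks from least to most influential:", ranked)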
Guide: Running Locally
To run Flux.1 Lite locally, follow these steps:
- Install Dependencies: Ensure you have PyTorch and the diffusers library installed.
- Prepare Environment: Use a CUDA-enabled device for optimal performance.
- Load Model: Use the following code snippet to load and run the model:
import torch
from diffusers import FluxPipeline

base_model_id = "Freepik/flux.1-lite-8B-alpha"
torch_dtype = torch.bfloat16
device = "cuda"

# Load the distilled pipeline in bfloat16 and move it to the GPU.
pipe = FluxPipeline.from_pretrained(base_model_id, torch_dtype=torch_dtype).to(device)

prompt = "A close-up image of a green alien with fluorescent skin in the middle of a dark purple forest"
guidance_scale = 3.5
n_steps = 28
seed = 11

with torch.inference_mode():
    image = pipe(
        prompt=prompt,
        generator=torch.Generator(device="cpu").manual_seed(seed),
        num_inference_steps=n_steps,
        guidance_scale=guidance_scale,
        height=1024,
        width=1024,
    ).images[0]
image.save("output.png")
- Cloud GPU Recommendation: For enhanced performance, consider using cloud GPU services such as AWS, Google Cloud, or Azure. If local GPU memory is limited, see the offloading sketch after this list.
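If a local GPU does not have enough memory for the full pipeline, diffusers can offload submodules to the CPU and move them to the GPU only when needed. A minimal sketch, assuming the accelerate package is installed and using the same model ID as above:

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "Freepik/flux.1-lite-8B-alpha", torch_dtype=torch.bfloat16
)
# Keep submodules on the CPU and move each one to the GPU only while it runs,
# trading some speed for a much smaller peak VRAM footprint.
pipe.enable_model_cpu_offload()

image = pipe(
    "A close-up image of a green alien with fluorescent skin in the middle of a dark purple forest",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("output_offloaded.png")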
License
Flux.1 Lite is released under the FLUX.1 [dev] Non-Commercial License. All model weights and usage are subject to the terms outlined in this license. For detailed license information, visit the license page.