Canopus-Cute-Kawaii-Flux-LoRA
by prithivMLmods

Introduction
The Canopus-Cute-Kawaii-Flux-LoRA model is a text-to-image generation model designed to produce animated, kawaii-style images. It utilizes the LoRA (Low-Rank Adaptation) technique and is built on top of the FLUX.1-dev base model. The model is still under development, which may result in variable performance.
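LoRA adapts the base model by learning a low-rank update to selected weight matrices instead of fine-tuning them directly. A minimal pure-Python sketch of the idea (the matrices and sizes are toy values, not the model's actual weights):

```python
def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def lora_update(W, A, B, alpha):
    """Return W + (alpha / r) * (B @ A), the LoRA-adapted weight.

    W: (d_out x d_in) frozen base weight
    B: (d_out x r) and A: (r x d_in) are the trained low-rank factors;
    r is inferred from A, and alpha scales the update (the "network alpha").
    """
    r = len(A)
    delta = matmul(B, A)
    scale = alpha / r
    return [[w + scale * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, delta)]

# Toy example: d_out = d_in = 2, rank r = 1, alpha = 2
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [1.0]]   # (2 x 1)
A = [[0.5, 0.5]]     # (1 x 2)
print(lora_update(W, A, B, alpha=2.0))  # → [[2.0, 1.0], [1.0, 2.0]]
```

Because only the small factors A and B are trained, a LoRA checkpoint is far smaller than the base model and can be loaded on top of it at inference time.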
Architecture
The model operates using several key parameters:
- Base Model: black-forest-labs/FLUX.1-dev
- Optimizer: AdamW8bit
- Learning Rate Scheduler: Constant
- Network Dimensions: 64
- Network Alpha: 32
- Epochs: 17
- Noise Offset: 0.03
- Multires Noise Discount: 0.1
- Multires Noise Iterations: 10
- Repeat & Steps: 25 & 2000+
- Trigger Word: cute-kawaii
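The parameters above can be collected into a single training configuration. The sketch below uses a plain dictionary with illustrative field names (my own, not any specific trainer's schema); the values are taken directly from the list above:

```python
training_config = {
    "base_model": "black-forest-labs/FLUX.1-dev",
    "optimizer": "AdamW8bit",
    "lr_scheduler": "constant",
    "network_dim": 64,        # rank r of the LoRA matrices
    "network_alpha": 32,      # scaling factor for the low-rank update
    "epochs": 17,
    "noise_offset": 0.03,
    "multires_noise_discount": 0.1,
    "multires_noise_iterations": 10,
    "repeats": 25,
    "steps": 2000,            # "2000+" in the card; lower bound used here
    "trigger_word": "cute-kawaii",
}

# The effective LoRA scale applied to the update is alpha / dim:
print(training_config["network_alpha"] / training_config["network_dim"])  # → 0.5
```

An alpha of half the network dimension, as here, down-weights the learned update relative to the rank, a common choice to keep the LoRA effect moderate.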
Training
The training of the Canopus-Cute-Kawaii-Flux-LoRA involved over 70 high-resolution images. The training used specific prompts to generate kawaii-style images, including animated ice cream cones, bears, and watermelons. The training process is ongoing, and improvements are expected as more data is utilized.
Guide: Running Locally
To run the model locally, follow these steps:
- Install Dependencies: Ensure you have PyTorch installed along with the diffusers library and its prerequisites for diffusion pipelines.

- Import Libraries:

  import torch
  from diffusers import DiffusionPipeline

- Load Base Model:

  base_model = "black-forest-labs/FLUX.1-dev"
  pipe = DiffusionPipeline.from_pretrained(base_model, torch_dtype=torch.bfloat16)

- Load LoRA Weights:

  lora_repo = "prithivMLmods/Canopus-Cute-Kawaii-Flux-LoRA"
  pipe.load_lora_weights(lora_repo)

- Set Device:

  device = torch.device("cuda")
  pipe.to(device)

- Generate Images: Include the trigger word cute-kawaii in your prompt to generate images.
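Since the LoRA only produces its style when the trigger word is present, it can help to prepend it automatically. A small hypothetical helper (not part of the model repo), shown with the pipeline call it would feed:

```python
TRIGGER = "cute-kawaii"

def with_trigger(prompt: str, trigger: str = TRIGGER) -> str:
    """Prepend the trigger word unless the prompt already contains it."""
    return prompt if trigger in prompt else f"{trigger}, {prompt}"

prompt = with_trigger("an animated ice cream cone with a smiling face")
print(prompt)  # → "cute-kawaii, an animated ice cream cone with a smiling face"

# Used with the pipeline loaded in the steps above (requires a CUDA GPU
# and access to the FLUX.1-dev weights):
# image = pipe(prompt).images[0]
# image.save("kawaii.png")
```

The generation call itself is commented out because it needs a GPU and the downloaded model weights; the helper runs anywhere.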
Suggestion: Cloud GPUs
For optimal performance, consider using cloud GPU services such as AWS EC2 with GPU instances, Google Cloud's AI Platform, or Azure Machine Learning.
License
This model is released under the CreativeML OpenRAIL-M license, which permits creative use and sharing subject to certain conditions. Please review the license terms for compliance.