Describe the bug
When following https://github.com/huggingface/diffusers/blob/main/examples/advanced_diffusion_training/README.md,
I also had to install and set up the following to make it work:
!pip install wandb prodigyopt datasets
!pip install --upgrade peft
- log in to Hugging Face
- set up and log in to wandb
These steps are not mentioned in the README and not reflected in requirements.txt, which makes it hard to get started.
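Concretely, the two logins boil down to something like the following (a minimal sketch assuming the standard huggingface-cli and wandb CLIs; tokens can also be provided via environment variables instead):

# authenticate with the Hugging Face Hub (needed e.g. for --push_to_hub)
huggingface-cli login
# authenticate with Weights & Biases (needed for --report_to="wandb")
wandb login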
Reproduction
Execute the README preparation steps on a new machine:
cd diffusers
pip install -e .
cd examples/advanced_diffusion_training
pip install -r requirements.txt
accelerate config default
Download the dataset:
from huggingface_hub import snapshot_download

local_dir = "./data/3d_icon"
snapshot_download(
    "LinoyTsaban/3d_icon",
    local_dir=local_dir,
    repo_type="dataset",
    ignore_patterns=".gitattributes",
)
Run the training script:
MODEL_NAME="stabilityai/stable-diffusion-xl-base-1.0"
DATASET_NAME="./data/3d_icon"
OUTPUT_DIR="3d-icon-SDXL-LoRA"
VAE_PATH="madebyollin/sdxl-vae-fp16-fix"
accelerate launch train_dreambooth_lora_sdxl_advanced.py \
--pretrained_model_name_or_path="$MODEL_NAME" \
--pretrained_vae_model_name_or_path="$VAE_PATH" \
--dataset_name="$DATASET_NAME" \
--instance_prompt="3d icon in the style of TOK" \
--validation_prompt="a TOK icon of an astronaut riding a horse, in the style of TOK" \
--output_dir="$OUTPUT_DIR" \
--caption_column="prompt" \
--mixed_precision="fp16" \
--resolution=1024 \
--train_batch_size=3 \
--repeats=1 \
--report_to="wandb"\
--gradient_accumulation_steps=1 \
--gradient_checkpointing \
--learning_rate=1.0 \
--text_encoder_lr=1.0 \
--optimizer="prodigy"\
--train_text_encoder_ti\
--train_text_encoder_ti_frac=0.5\
--snr_gamma=5.0 \
--lr_scheduler="constant" \
--lr_warmup_steps=0 \
--rank=8 \
--max_train_steps=1000 \
--checkpointing_steps=2000 \
--seed="0" \
--push_to_hub
Logs
No response
System Info
- 🤗 Diffusers version: 0.32.0.dev0
- Platform: Linux-6.8.0-49-generic-x86_64-with-glibc2.39
- Running on Google Colab?: No
- Python version: 3.11.10
- PyTorch version (GPU?): 2.5.1+cu124 (True)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Huggingface_hub version: 0.26.3
- Transformers version: 4.46.3
- Accelerate version: 1.1.1
- PEFT version: 0.13.2
- Bitsandbytes version: not installed
- Safetensors version: 0.4.5
- xFormers version: not installed
- Accelerator: NVIDIA RTX 5000 Ada Generation, 32760 MiB
- Using GPU in script?: Yes
- Using distributed or parallel set-up in script?: No