FAST Flux GGUF for Low-VRAM GPUs at High Quality: Installation, Tips & Performance Comparison.
We install the new GGUF node in ComfyUI locally for NVIDIA or AMD GPUs.
The image generation examples show both the output quality and the detailed performance figures, followed by tips & tricks for Flux.1 DEV and Flux.1 SCHNELL.
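Quick install sketch (a minimal example assuming an existing ComfyUI setup; the exact, current steps are in the ComfyUI-GGUF README linked below):
cd ComfyUI/custom_nodes
git clone https://github.com/city96/ComfyUI-GGUF
pip install --upgrade gguf
Then restart ComfyUI so the GGUF loader nodes show up.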
Videos:
Flux Installation on ComfyUI: https://youtu.be/52YAQZ-1nOA
ComfyUI with ZLUDA on Windows: https://youtu.be/X4V3ppyb3zs
ComfyUI with ROCm on Linux: https://youtu.be/QEbI6v2oPvQ
Links:
https://github.com/comfyanonymous
https://github.com/comfyanonymous/ComfyUI_examples/tree/master/flux
https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
The GGUF Models:
https://github.com/city96/ComfyUI-GGUF
https://huggingface.co/city96/FLUX.1-schnell-gguf/tree/main
https://huggingface.co/city96/FLUX.1-dev-gguf/tree/main
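Usage sketch (assuming the default folder layout and the node names from the ComfyUI-GGUF repo): place the downloaded .gguf files in ComfyUI/models/unet, then swap the regular UNet/diffusion model loader in the Flux workflow for the "Unet Loader (GGUF)" node and select the quantized model there. Lower-bit quants (e.g. the Q4 variants) cut file size and VRAM use at some quality cost, while Q8_0 stays closest to the original FP16 weights.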
About Quantization:
https://github.com/ggerganov/llama.cpp/tree/master/examples/quantize
PLEASE CHECK THE PINNED COMMENT FOR UPDATES!
Chapters:
0:00 About Flux and GGUF
1:02 GGUF Installation
2:54 ZLUDA Update
3:46 Adding GGUF Loader
4:33 GGUF Models and Test
6:43 Example Generation
7:45 Result Comparison
9:47 Performance Details
13:16 Key Findings
#comfyui #flux #gguf #stablediffusion