gguf quantized and fp8 scaled aura
setup (once)
- drag aura_flow_0.3_q4_0.gguf [3.95GB] to > ./ComfyUI/models/diffusion_models
- drag aura_flow_0.3_fp8_scaled.safetensors [9.66GB] to > ./ComfyUI/models/checkpoints
- drag aura_vae.safetensors [167MB] to > ./ComfyUI/models/vae (these pulls can also be scripted; see the sketch after this list)
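a minimal sketch of the scripted alternative, assuming the three files are hosted in this repo (calcuis/aura) and that the huggingface_hub package is installed:

```python
# fetch the three files straight into the comfyui model folders
# repo_id and filenames match the setup list above; adjust local_dir if comfyui lives elsewhere
from huggingface_hub import hf_hub_download

targets = {
    "aura_flow_0.3_q4_0.gguf": "./ComfyUI/models/diffusion_models",
    "aura_flow_0.3_fp8_scaled.safetensors": "./ComfyUI/models/checkpoints",
    "aura_vae.safetensors": "./ComfyUI/models/vae",
}

for filename, folder in targets.items():
    path = hf_hub_download(repo_id="calcuis/aura", filename=filename, local_dir=folder)
    print(f"saved {filename} -> {path}")
```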
run it straight (no installation needed)
- run the .bat file in the main directory (assuming you are using the full pack below)
- drag the workflow json file (below) to > your browser (a scripted launcher is sketched after this list)
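if you prefer not to double-click, a minimal launch sketch, assuming a windows portable pack and comfyui's default port 8188 (the .bat name below is hypothetical; use whatever launcher ships in the main directory):

```python
# start the portable pack's launcher, then open the ui in the default browser
# "run_comfyui.bat" is a placeholder for the actual .bat in the main directory
import subprocess
import time
import webbrowser

subprocess.Popen(["cmd", "/c", "run_comfyui.bat"])   # hypothetical launcher name
time.sleep(15)                                       # crude wait for the server to come up
webbrowser.open("http://127.0.0.1:8188")             # drag the workflow json onto this page
```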
workflows
- drag any workflow json file to the active browser tab (or queue it via the api; see the sketch after this list); or
- drag any generated picture (which contains the workflow metadata) to the active browser tab
- workflow for safetensors (fp8 scaled version above is recommended)
- workflow for gguf (opt to use the aura vae; for the text encoder, since comfyui does not yet support the unique format of that separate clip, use the one embedded inside the fp8 scaled safetensors for the time being; this won't affect speed, as the model will still be loaded from the gguf)
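besides dragging files into the browser, a running comfyui instance can also queue a workflow over its http api; a minimal sketch, assuming the default server at 127.0.0.1:8188 and a workflow exported in api format (the filename below is hypothetical):

```python
# post an api-format workflow json to the local comfyui server's /prompt endpoint
# note: the regular drag-and-drop workflow json is not accepted here; export the
# workflow in api format from the comfyui interface first
import json
import urllib.request

with open("aura_gguf_workflow_api.json", "r", encoding="utf-8") as f:  # hypothetical filename
    workflow = json.load(f)

payload = json.dumps({"prompt": workflow}).encode("utf-8")
req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8"))  # a prompt_id comes back when the job is queued
```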
references
- base model from fal
- comfyui from comfyanonymous
- gguf-node beta