GGFModel
#7
by zawminhtutx - opened

README.md CHANGED
@@ -13,7 +13,6 @@ Quantized GGUF versions of the [Z-Image Turbo](https://huggingface.co/Tongyi-MAI
 | Model | Download |
 |--------|--------------|
 | Z-Image Turbo GGUF | [Download](https://huggingface.co/jayn7/Z-Image-Turbo-GGUF/tree/main) |
-| Qwen3-4B (Text Encoder) | [unsloth/Qwen3-4B-GGUF](https://huggingface.co/unsloth/Qwen3-4B-GGUF) |
 
 ### 📷 Example Comparison
 
@@ -30,69 +29,8 @@ Check out the original model card [Z-Image Turbo](https://huggingface.co/Tongyi-
 
 The model can be used with:
 
-- [**ComfyUI-GGUF**](https://github.com/city96/ComfyUI-GGUF) by **city96**
-- [**Diffusers**](https://github.com/huggingface/diffusers)
-
-#### Example Usage
-
-<details>
-<summary>Diffusers</summary>
-
-```sh
-pip install git+https://github.com/huggingface/diffusers
-```
-
-```py
-from diffusers import ZImagePipeline, ZImageTransformer2DModel, GGUFQuantizationConfig
-import torch
-
-prompt = "Young Chinese woman in red Hanfu, intricate embroidery. Impeccable makeup, red floral forehead pattern. Elaborate high bun, golden phoenix headdress, red flowers, beads. Holds round folding fan with lady, trees, bird. Neon lightning-bolt lamp (⚡️), bright yellow glow, above extended left palm. Soft-lit outdoor night background, silhouetted tiered pagoda (西安大雁塔), blurred colorful distant lights."
-height = 1024
-width = 1024
-seed = 42
-
-# hf_path = "https://huggingface.co/jayn7/Z-Image-Turbo-GGUF/blob/main/z_image_turbo-Q3_K_M.gguf"
-local_path = "path/to/local/model/z_image_turbo-Q3_K_M.gguf"
-
-transformer = ZImageTransformer2DModel.from_single_file(
-    local_path,
-    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
-    dtype=torch.bfloat16,
-)
-
-pipeline = ZImagePipeline.from_pretrained(
-    "Tongyi-MAI/Z-Image-Turbo",
-    transformer=transformer,
-    dtype=torch.bfloat16,
-).to("cuda")
-
-# [Optional] Attention Backend
-# Diffusers uses SDPA by default. Switch to a custom attention backend for better efficiency if supported:
-# pipeline.transformer.set_attention_backend("_sage_qk_int8_pv_fp16_triton")  # Enable Sage Attention
-# pipeline.transformer.set_attention_backend("flash")  # Enable Flash-Attention-2
-# pipeline.transformer.set_attention_backend("_flash_3")  # Enable Flash-Attention-3
-
-# [Optional] Model Compilation
-# Compiling the DiT model accelerates inference, but the first run will take longer to compile.
-# pipeline.transformer.compile()
-
-# [Optional] CPU Offloading
-# Enable CPU offloading for memory-constrained devices.
-# pipeline.enable_model_cpu_offload()
-
-image = pipeline(
-    prompt=prompt,
-    num_inference_steps=9,  # This actually results in 8 DiT forwards
-    guidance_scale=0.0,     # Guidance should be 0 for the Turbo models
-    height=height,
-    width=width,
-    generator=torch.Generator("cuda").manual_seed(seed),
-).images[0]
-
-image.save("zimage.png")
-```
-
-</details>
+- [**ComfyUI-GGUF**](https://github.com/city96/ComfyUI-GGUF) by **city96**
+
 
 
 ### Credits
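
The Diffusers example removed above hard-codes a local `local_path` placeholder. As a complementary sketch (not part of this diff), the same quantized checkpoint can be resolved from the Hub first with `huggingface_hub.hf_hub_download`; the repo id and filename below are assumptions taken from the commented `hf_path` in the example, and the rest mirrors the removed snippet:

```py
import torch
from diffusers import ZImageTransformer2DModel, GGUFQuantizationConfig
from huggingface_hub import hf_hub_download

# Assumed repo id and filename, copied from the commented hf_path above.
gguf_path = hf_hub_download(
    repo_id="jayn7/Z-Image-Turbo-GGUF",
    filename="z_image_turbo-Q3_K_M.gguf",
)

# Load the GGUF-quantized transformer exactly as in the removed example,
# just with the downloaded file instead of a hand-written local path.
transformer = ZImageTransformer2DModel.from_single_file(
    gguf_path,
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    dtype=torch.bfloat16,
)
```

The resulting `transformer` can then be passed to `ZImagePipeline.from_pretrained("Tongyi-MAI/Z-Image-Turbo", transformer=transformer, ...)` as shown in the removed snippet.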