---
license: apache-2.0
pipeline_tag: unconditional-image-generation
library_name: diffusers
tags:
- flow-matching
- consistency-models
- latent-diffusion
- imagenet
---

# FACM: Flow-Anchored Consistency Models

🔥 FACM outperforms 2×250-step Lightning-DiT on ImageNet 256 with only 2 steps

FID=1.70 (1-step)      FID=1.32 (2-step)

📄 This is the official implementation of the paper: [Flow-Anchored Consistency Models](https://huggingface.co/papers/2507.03738). Code: [https://github.com/ali-vilab/FACM](https://github.com/ali-vilab/FACM)

## Abstract

Continuous-time Consistency Models (CMs) promise efficient few-step generation but face significant challenges with training instability. We argue this instability stems from a fundamental conflict: by training a network to learn only a shortcut across a probability flow, the model loses its grasp on the instantaneous velocity field that defines the flow. Our solution is to explicitly anchor the model in the underlying flow during training. We introduce the Flow-Anchored Consistency Model (FACM), a simple but effective training strategy that uses a Flow Matching (FM) task as an anchor for the primary CM shortcut objective. This Flow-Anchoring approach requires no architectural modifications and is broadly compatible with standard model architectures. By distilling a pre-trained LightningDiT model, our method achieves a state-of-the-art FID of 1.32 with two steps (NFE=2) and 1.76 with just one step (NFE=1) on ImageNet 256x256, significantly outperforming previous methods. This provides a general and effective recipe for building high-performance, few-step generative models. Our code and pretrained models are available at [https://github.com/ali-vilab/FACM](https://github.com/ali-vilab/FACM).

## ImageNet 256 Performance on 8 × A100 GPUs

| Model | Steps | FID | IS | Epochs (Pretrain) | Epochs (Distill) | Download |
|:-----:|:------:|:----:|:---:|:-----------------:|:----------------:|:--------:|
| FACM | 2-step | 1.32 | 292 | 800 | 100 | [100ep-stg2.pt](https://huggingface.co/Peterande/FACM/blob/main/100ep-stg2.pt) |
| FACM | 1-step | 1.76 | 290 | 800 | 250 | [250ep-stg2.pt](https://huggingface.co/Peterande/FACM/blob/main/250ep-stg2.pt) |
| FACM | 1-step | 1.70 | 295 | 800 | 400 | [400ep-stg2.pt](https://huggingface.co/Peterande/FACM/blob/main/400ep-stg2.pt) |

## Quick Start

### Prerequisites

Download the required model weights and statistics files from [HuggingFace](https://huggingface.co/Peterande/FACM/tree/main) or [ModelScope](https://modelscope.cn/models/Peterande/FACM/files) to `./cache`, including: `fid-50k-256.npz`, `latents_stats.pt`, `vavae-imagenet256-f16d32-dinov2.pt`.

### Data Preparation

```bash
export DATA_PATH="/path/to/imagenet"
export OUTPUT_PATH="/path/to/latents"
bash scripts/extract.sh
```

*Note: You can also download pre-extracted ImageNet latents following [Lightning-DiT](https://github.com/hustvl/LightningDiT/blob/main/docs/tutorial.md).*
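If you consume the extracted latents yourself, a minimal sketch of normalizing them with `latents_stats.pt` might look like the following. The file layout and the key names (`mean`, `std`) are assumptions for illustration, not the repo's documented format; `scripts/extract.sh` and the Lightning-DiT tutorial define the actual pipeline.

```python
# Sketch: standardizing VA-VAE latents before feeding them to the DiT.
# Assumes latents_stats.pt holds per-channel statistics under hypothetical
# keys "mean" and "std" -- check the repo for the real layout.
import torch

stats = torch.load("cache/latents_stats.pt", map_location="cpu")
mean, std = stats["mean"], stats["std"]  # hypothetical keys

def normalize(latent: torch.Tensor) -> torch.Tensor:
    """Standardize a latent with the dataset-level statistics."""
    return (latent - mean) / std

# An f16d32 latent for a 256x256 image is 32 channels at 16x16 resolution.
latent = torch.randn(32, 16, 16)
print(normalize(latent).shape)  # torch.Size([32, 16, 16])
```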
### Inference

Download a pretrained FACM model checkpoint, [100ep-stg2.pt](https://huggingface.co/Peterande/FACM/blob/main/100ep-stg2.pt) or [400ep-stg2.pt](https://huggingface.co/Peterande/FACM/blob/main/400ep-stg2.pt), to `./cache` (see the sampling sketch after this section for what `--sampling-steps` controls).

```bash
bash scripts/test.sh --ckpt-path cache/100ep-stg2.pt --sampling-steps 2
```

```bash
bash scripts/test.sh --ckpt-path cache/400ep-stg2.pt --sampling-steps 1
```

### Training

Download the pretrained FM model checkpoint [800ep-stg1.pt](https://huggingface.co/Peterande/FACM/blob/main/800ep-stg1.pt) to `./cache`.

```bash
export DATA_PATH="/path/to/latents"
bash scripts/train.sh
```

### Pretraining (Optional)

Replace [configs/lightningdit_xl_vavae_f16d32.yaml](https://github.com/hustvl/LightningDiT/blob/main/configs/lightningdit_xl_vavae_f16d32.yaml) and [transport/transport.py](https://github.com/hustvl/LightningDiT/blob/main/transport/transport.py) in [Lightning-DiT](https://github.com/hustvl/LightningDiT) with our `ldit/lightningdit_xl_vavae_f16d32.yaml` and `ldit/transport.py`, then follow the [instructions](https://github.com/hustvl/LightningDiT/blob/main/docs/tutorial.md).
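For intuition on the `--sampling-steps` flag above, here is a conceptual sketch of 1- and 2-step consistency sampling under a linear flow-matching interpolant. The intermediate time (0.5) and the interpolant itself are illustrative assumptions; `scripts/test.sh` implements the actual sampler.

```python
# Sketch: few-step consistency sampling with x_t = (1 - t) * x0 + t * noise.
# f is a stand-in for the distilled FACM network, not the repo's API.
import torch

def sample(f, shape, steps=2, t_mid=0.5):
    x = torch.randn(shape)         # pure noise at t = 1
    x0 = f(x, torch.tensor(1.0))   # consistency shortcut: jump straight to t = 0
    if steps == 2:
        # Re-noise the first prediction to an intermediate time, then jump again.
        x_mid = (1 - t_mid) * x0 + t_mid * torch.randn(shape)
        x0 = f(x_mid, torch.tensor(t_mid))
    return x0

dummy = lambda x, t: x  # stand-in network
print(sample(dummy, (1, 32, 16, 16)).shape)  # torch.Size([1, 32, 16, 16])
```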
## Reproductions

We include reproductions of [MeanFlow](https://arxiv.org/abs/2505.13447) and [sCM](https://arxiv.org/abs/2410.11081). Switch methods by changing the loss function in `train.py` line 81:

```python
facm_loss = FACMLoss()        # FACM (default)
# facm_loss = MeanFlowLoss()  # MeanFlow
# facm_loss = sCMLoss()       # sCM
```
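For intuition on what distinguishes `FACMLoss`, here is a minimal sketch of the Flow-Anchoring idea from the abstract: a flow-matching (FM) loss on the instantaneous velocity anchors the consistency-model (CM) shortcut objective. The two-head interface, the targets, and the weighting are illustrative assumptions; in particular the CM term below is a stand-in, since the real objective enforces self-consistency along the flow rather than regressing x0 directly. See `FACMLoss` in the repo for the actual implementation.

```python
# Sketch of an FM-anchored CM objective (assumptions noted inline).
import torch
import torch.nn.functional as F

def facm_style_loss(model, x0, lam=1.0):
    noise = torch.randn_like(x0)
    t = torch.rand(x0.shape[0], 1, 1, 1)
    x_t = (1 - t) * x0 + t * noise  # linear flow-matching interpolant

    # FM anchor: regress the instantaneous velocity of the probability flow.
    fm = F.mse_loss(model(x_t, t, head="velocity"), noise - x0)

    # CM shortcut: map x_t toward the clean endpoint (stand-in target; the
    # actual CM objective uses a self-consistency condition, not x0 itself).
    cm = F.mse_loss(model(x_t, t, head="shortcut"), x0)

    return cm + lam * fm  # equal weighting is an assumption

dummy = lambda x, t, head: x  # stand-in two-head network
print(facm_style_loss(dummy, torch.randn(4, 32, 16, 16)))
```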
## Citation

If you use `FACM` or its methods in your work, please cite the following BibTeX entry:
```bibtex
@misc{peng2025facm,
      title={Flow-Anchored Consistency Models},
      author={Yansong Peng and Kai Zhu and Yu Liu and Pingyu Wu and Hebei Li and Xiaoyan Sun and Feng Wu},
      year={2025},
      eprint={2507.03738},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```
## Acknowledgements

The model architecture is based on the [Lightning-DiT](https://github.com/hustvl/LightningDiT) repository.

✨ Feel free to contribute and reach out if you have any questions! ✨