Improve model card: Add pipeline tag, abstract, and GitHub link
This PR enhances the model card by:
- Adding `pipeline_tag: text-to-image` to improve discoverability on the Hugging Face Hub, since the model relates to image generation tasks (https://huggingface.co/models?pipeline_tag=text-to-image); see the verification sketch after this description.
- Including the paper's abstract to provide a comprehensive overview of the model and its contributions.
- Providing a direct link to the official GitHub repository for easy access to the code and training instructions.
The `library_name` has been intentionally omitted because the provided materials (GitHub README, paper abstract) give no explicit evidence that the model is compatible with a specific Hugging Face library.
Please review and merge if these improvements align with our goals.
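As a side note for reviewers, the sketch below illustrates why the `pipeline_tag` matters for discoverability: it queries the Hub for models filed under the `text-to-image` pipeline tag. It is not part of the PR itself and assumes a recent `huggingface_hub` release in which `list_models` accepts `pipeline_tag` and `limit` arguments.

```python
from huggingface_hub import HfApi

api = HfApi()

# Browse models filed under the text-to-image pipeline tag.
# Once this PR is merged, the GAS model card should surface in this listing.
for model in api.list_models(pipeline_tag="text-to-image", limit=10):
    print(model.id)
```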
README.md (changed):

````diff
@@ -1,21 +1,31 @@
 ---
-license: mit
 datasets:
 - bayes-group-diffusion/GAS-teachers
+license: mit
 tags:
 - arxiv:2510.17699
+pipeline_tag: text-to-image
 ---

+# GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver
+
+This repository contains the implementation for the paper [GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver](https://huggingface.co/papers/2510.17699).
+
+**Abstract:**
+While diffusion models achieve state-of-the-art generation quality, they still suffer from computationally expensive sampling. Recent works address this issue with gradient-based optimization methods that distill a few-step ODE diffusion solver from the full sampling process, reducing the number of function evaluations from dozens to just a few. However, these approaches often rely on intricate training techniques and do not explicitly focus on preserving fine-grained details. In this paper, we introduce the Generalized Solver: a simple parameterization of the ODE sampler that does not require additional training tricks and improves quality over existing approaches. We further combine the original distillation loss with adversarial training, which mitigates artifacts and enhances detail fidelity. We call the resulting method the Generalized Adversarial Solver and demonstrate its superior performance compared to existing solver training methods under similar resource constraints.
+
+Code: https://github.com/3145tttt/GAS
+
 ## Citation

 ```bibtex
 @misc{oganov2025gasimprovingdiscretizationdiffusion,
       title={GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver},
       author={Aleksandr Oganov and Ilya Bykov and Eva Neudachina and Mishan Aliev and Alexander Tolmachev and Alexander Sidorov and Aleksandr Zuev and Andrey Okhotin and Denis Rakitin and Aibek Alanov},
       year={2025},
       eprint={2510.17699},
       archivePrefix={arXiv},
       primaryClass={cs.CV},
       url={https://arxiv.org/abs/2510.17699},
 }
 ```
````
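Once merged, the updated front matter is also readable programmatically. A minimal sketch, assuming `huggingface_hub` is installed; the repo id below is a placeholder, so substitute the actual model repository:

```python
from huggingface_hub import ModelCard

# Placeholder repo id; replace with the actual model repository on the Hub.
card = ModelCard.load("bayes-group-diffusion/GAS")

# The card's YAML front matter is exposed as ModelCardData.
print(card.data.pipeline_tag)  # expected: "text-to-image"
print(card.data.license)       # expected: "mit"
print(card.data.datasets)      # expected: ["bayes-group-diffusion/GAS-teachers"]
```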