nielsr (HF Staff) committed
Commit f05ac27 · verified · 1 Parent(s): e841e98

Improve model card: Add pipeline tag, description, GitHub link, and sample usage


This PR enhances the model card for the GAS model by:
- Adding `pipeline_tag: unconditional-image-generation` to the metadata, improving discoverability on the Hugging Face Hub.
- Providing a description of the model that summarizes its core contributions from the paper.
- Including a direct link to the GitHub repository for easy access to the code.
- Adding a "How to use" section with the official inference code snippet, making it easier for users to get started with the trained model.

Please review and merge this update.
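With these changes applied, the README front matter would look roughly like the following (a sketch assembled from this PR's diff; the Hub may normalize field order):

```yaml
---
datasets:
- bayes-group-diffusion/GAS-teachers
license: mit
tags:
- arxiv:2510.17699
pipeline_tag: unconditional-image-generation
---
```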

Files changed (1)
  1. README.md +29 -1
README.md CHANGED
@@ -1,11 +1,39 @@
 ---
-license: mit
 datasets:
 - bayes-group-diffusion/GAS-teachers
+license: mit
 tags:
 - arxiv:2510.17699
+pipeline_tag: unconditional-image-generation
 ---
 
+# GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver
+
+This repository contains the implementation for **GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver**, a method presented in the paper [GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver](https://arxiv.org/abs/2510.17699).
+
+The work introduces a novel approach to accelerating sampling in diffusion models without compromising generation quality. The **Generalized Solver (GS)** offers a simpler parameterization of the ODE sampler; combined with adversarial training, it forms the **Generalized Adversarial Solver (GAS)**, which enhances detail fidelity and mitigates artifacts. The method reduces the computational cost of diffusion model sampling from dozens of function evaluations to just a few.
+
+![Teaser image](https://github.com/3145tttt/GAS/raw/main/docs/teaser_1920.jpg)
+
+For detailed code, setup instructions, and examples, please refer to the official GitHub repository: [https://github.com/3145tttt/GAS](https://github.com/3145tttt/GAS)
+
+## How to use
+
+To generate images from a trained **GS** checkpoint, use the `generate.py` script and set the `--checkpoint_path` option to the path of your trained model checkpoint.
+
+```bash
+# Generate 50000 images using 2 GPUs and a checkpoint from checkpoint_path
+torchrun --standalone --nproc_per_node=2 generate.py \
+    --config=configs/edm/cifar10.yaml \
+    --outdir=data/teachers/cifar10 \
+    --seeds=50000-99999 \
+    --batch=1024 \
+    --steps=4 \
+    --checkpoint_path=checkpoint_path
+```
+
+For a fair comparison, and to avoid leakage of test seeds into the training dataset, we recommend using seeds 50000-99999 for all datasets except MS-COCO, which should use seeds 30000-59999.
+
 ## Citation
 
 ```bibtex