---
library_name: diffusers
tags:
  - fp8
  - safetensors
  - delta-compensation
  - diffusion
  - converted-by-gradio
---

# FP8 Model with Delta Compensation

- Source: https://huggingface.co/Kijai/WanVideo_comfy
- File: Wan2_1-T2V-1_3B_fp32.safetensors
- FP8 Format: E5M2
- Delta File: Wan2_1-T2V-1_3B_fp32-fp8-delta.safetensors (see the generation sketch below)
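
For reference, the sketch below shows one way such a delta file could be produced from the original FP32 checkpoint. This is an assumption, not the exact converter used for this repository: it treats the delta as the FP32 weight minus its FP8 round-trip, stored under a `delta.` key prefix so it matches the restore loop in the Usage section. The real converter may store deltas in a lower precision or skip tensors with negligible error.

```python
import torch
from safetensors.torch import load_file, save_file

# Hypothetical sketch (not necessarily the converter used for this repo):
# split an FP32 checkpoint into an FP8 E5M2 file plus a delta file that
# holds the rounding error for each tensor.
fp32_state = load_file("Wan2_1-T2V-1_3B_fp32.safetensors")

fp8_state = {}
delta_state = {}
for key, weight in fp32_state.items():
    if weight.is_floating_point():
        fp8_weight = weight.to(torch.float8_e5m2)  # lossy cast to FP8 E5M2
        # Residual between the original weight and its FP8 round-trip.
        delta_state[f"delta.{key}"] = weight - fp8_weight.to(torch.float32)
        fp8_state[key] = fp8_weight
    else:
        fp8_state[key] = weight  # leave non-float tensors untouched

# Writing FP8 tensors requires a safetensors release with FP8 dtype support.
save_file(fp8_state, "Wan2_1-T2V-1_3B_fp32-fp8-e5m2.safetensors")
save_file(delta_state, "Wan2_1-T2V-1_3B_fp32-fp8-delta.safetensors")
```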

## Usage (Inference)

To restore near-original precision:

```python
import torch
from safetensors.torch import load_file

# Load the FP8 weights and the per-tensor compensation deltas.
fp8_state = load_file("Wan2_1-T2V-1_3B_fp32-fp8-e5m2.safetensors")
delta_state = load_file("Wan2_1-T2V-1_3B_fp32-fp8-delta.safetensors")

restored_state = {}
for key in fp8_state:
    if f"delta.{key}" in delta_state:
        # Upcast to FP32 and add back the stored quantization error.
        fp8_weight = fp8_state[key].to(torch.float32)
        delta = delta_state[f"delta.{key}"]
        restored_state[key] = fp8_weight + delta
    else:
        # No delta stored for this tensor; a plain upcast is the best we can do.
        restored_state[key] = fp8_state[key].to(torch.float32)
```

Restoring the weights requires PyTorch ≥ 2.1 for native FP8 (`torch.float8_e5m2`) support.
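
If downstream loaders (ComfyUI, diffusers, etc.) expect a single checkpoint file, the restored weights can be written back out with `safetensors.torch.save_file`; the output filename below is only an example.

```python
from safetensors.torch import save_file

# Example only: persist the reconstructed FP32 state dict as a new checkpoint.
save_file(restored_state, "Wan2_1-T2V-1_3B_fp32-restored.safetensors")
```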