Tags
6 pages
DiT
DiTFastAttn: Attention Compression for Diffusion Transformer Models
VAR, AR, VAE, GAN, DiT, Depooling
Patch Parallelism
CFG Parallelism
PipeFusion