update readme with flash_attn and diffusers as options. (#412)
feifeibear authored Dec 26, 2024
1 parent 81700db commit 4d6a038
Showing 2 changed files with 18 additions and 8 deletions.
20 changes: 14 additions & 6 deletions README.md
````diff
@@ -203,21 +203,29 @@ Currently, if you need the parallel version of ComfyUI, please fill in this [app
 
 ### 1. Install from pip
 
-We set diffusers as an optional installation requirement.
-First, if you only use the USP interface, you don't need to install diffusers. Second, different models have different requirements for diffusers - for example, the latest models may need to be installed from the diffusers main branch.
+We set `diffusers` and `flash_attn` as two optional installation requirements.
+
+About the `diffusers` version:
+- If you only use the USP interface, `diffusers` is not required. Models are typically released as `nn.Module`
+first, before being integrated into diffusers. xDiT is sometimes applied as a USP plugin to existing projects.
+- Different models may require different diffusers versions. Model implementations can vary between diffusers versions (e.g., Flux), which affects parallel processing. When encountering model execution errors, you may need to try several recent diffusers versions.
+- While we specify a diffusers version in `setup.py`, newer models may require later versions or even installation from the main branch.
+
+About the `flash_attn` version:
+- Without `flash_attn` installed, xDiT falls back to a PyTorch implementation of ring attention, which helps NPU users with compatibility.
+- However, not using `flash_attn` on GPUs may result in suboptimal performance. For best GPU performance, we recommend installing `flash_attn`.
 
 ```
-pip install xfuser
-# Or optionally, with diffusers
-pip install "xfuser[diffusers]"
+pip install xfuser  # Basic installation
+pip install "xfuser[diffusers,flash-attn]"  # With both diffusers and flash attention
 ```
 
 ### 2. Install from source
 
 ```
 pip install -e .
-# Or optionally, with diffusers
-pip install -e ".[diffusers]"
+pip install -e ".[diffusers,flash-attn]"
 ```
 
 Note that we use two self-maintained packages:
````
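The `flash_attn` fallback behavior described in the README change can be sketched with the usual optional-import pattern. This is a minimal illustration over plain Python lists: only the `flash_attn` package and `flash_attn_func` are real names; every other function here is hypothetical and is not xDiT's actual code.

```python
import math

# Optional-import pattern: prefer flash_attn's fused CUDA kernels when the
# package is installed; otherwise fall back to a portable implementation
# (this is what lets NPU users run without flash_attn).
try:
    from flash_attn import flash_attn_func  # noqa: F401
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False


def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]


def attention_fallback(q, k, v):
    # Plain scaled dot-product attention for one query vector:
    # q is [d], k and v are [n][d]. Portable, but much slower than
    # the fused flash_attn kernels on a GPU.
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, row)) / math.sqrt(d) for row in k]
    weights = softmax(scores)
    return [sum(w * row[j] for w, row in zip(weights, v)) for j in range(d)]


def attention(q, k, v):
    # Real code would dispatch to flash_attn_func on CUDA tensors when
    # HAS_FLASH_ATTN is True; this example always takes the portable path
    # so it stays self-contained.
    return attention_fallback(q, k, v)
```

With two identical keys the attention weights are uniform, so `attention([1.0, 0.0], [[1.0, 0.0], [1.0, 0.0]], [[2.0, 0.0], [4.0, 0.0]])` averages the value rows to `[3.0, 0.0]`.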
6 changes: 4 additions & 2 deletions setup.py
```diff
@@ -43,8 +43,10 @@ def get_cuda_version():
     ],
     extras_require={
         "diffusers": [
-            "diffusers>=0.31.0", # NOTE: diffusers>=0.32.0.dev is necessary for CogVideoX and Flux
-            "flash_attn>=2.6.3",
+            "diffusers==0.31.0", # NOTE: diffusers>=0.32.0.dev is necessary for CogVideoX and Flux
         ],
+        "flash-attn": [
+            "flash-attn>=2.6.0",
+        ]
     },
     url="https://github.com/xdit-project/xDiT.",
```
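Since the pinned `diffusers==0.31.0` may be too old for some models (the NOTE in the diff mentions `0.32.0.dev` for CogVideoX and Flux), a project can check the installed version up front instead of failing deep inside model code. A sketch for `X.Y.Z`-style versions; `version_at_least` and `check_diffusers` are hypothetical helpers, not part of xFuser:

```python
from importlib.metadata import PackageNotFoundError, version


def version_at_least(installed: str, required: str) -> bool:
    # Compare dotted versions numerically on the first three components,
    # so "0.32.0.dev0" >= "0.31.0" holds. A plain string comparison would
    # get this wrong (e.g., "0.9.0" > "0.31.0" lexically).
    def as_tuple(v: str):
        return tuple(int(part) for part in v.split(".")[:3])

    return as_tuple(installed) >= as_tuple(required)


def check_diffusers(required: str = "0.31.0") -> bool:
    # diffusers is optional: absence is fine when only the USP interface
    # is used, so report False rather than raising.
    try:
        return version_at_least(version("diffusers"), required)
    except PackageNotFoundError:
        return False
```

For example, `version_at_least("0.32.0.dev0", "0.31.0")` returns `True`, while `check_diffusers()` simply returns `False` when diffusers is not installed at all.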
