Commit 3a06681

bitsandbytes
Borda committed Jan 8, 2025
1 parent 2d8eb16 commit 3a06681
Showing 3 changed files with 22 additions and 4 deletions.
11 changes: 11 additions & 0 deletions .azure/gpu-tests-fabric.yml
@@ -123,6 +123,17 @@ jobs:
       python requirements/pytorch/check-avail-extras.py
     displayName: "Env details"
+  - bash: |
+      # get pytorch version
+      PYTORCH_VERSION=$(python -c "import torch; print(torch.__version__.split('+')[0])")
+      # FixMe: uninstall bitsandbytes for pytorch 2.6 as it is not compatible with `triton.ops`
+      if [[ "${PYTORCH_VERSION}" == "2.6.0" ]]; then
+        pip uninstall -y bitsandbytes
+      else
+        python -c "import bitsandbytes"
+      fi
+    displayName: "Handle bitsandbytes"
   - bash: python -m pytest lightning_fabric
     workingDirectory: src
     # without succeeded this could run even if the job has already failed
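For reference, the gate in the new step relies on torch.__version__ often carrying a local build suffix (for example "2.6.0+cu124"), so it splits on "+" before comparing against "2.6.0". A minimal Python sketch of the same decision, assuming torch (and, on the non-2.6 branch, bitsandbytes) is installed; it mirrors the bash step above and is not part of this commit:

    import torch

    # strip any local build suffix, e.g. "2.6.0+cu124" -> "2.6.0"
    pytorch_version = torch.__version__.split("+")[0]

    if pytorch_version == "2.6.0":
        # the CI step runs `pip uninstall -y bitsandbytes` at this point
        print("PyTorch 2.6.0 detected: bitsandbytes would be uninstalled")
    else:
        # otherwise the step only sanity-checks that bitsandbytes imports cleanly
        import bitsandbytes  # noqa: F401

        print(f"bitsandbytes imports fine with torch {pytorch_version}")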
11 changes: 11 additions & 0 deletions .azure/gpu-tests-pytorch.yml
@@ -137,6 +137,17 @@ jobs:
       python requirements/pytorch/check-avail-extras.py
     displayName: "Env details"
+  - bash: |
+      # get pytorch version
+      PYTORCH_VERSION=$(python -c "import torch; print(torch.__version__.split('+')[0])")
+      # FixMe: uninstall bitsandbytes for pytorch 2.6 as it is not compatible with `triton.ops`
+      if [[ "${PYTORCH_VERSION}" == "2.6.0" ]]; then
+        pip uninstall -y bitsandbytes
+      else
+        python -c "import bitsandbytes"
+      fi
+    displayName: "Handle bitsandbytes"
   - bash: python -m pytest pytorch_lightning
     workingDirectory: src
     # without succeeded this could run even if the job has already failed
4 changes: 0 additions & 4 deletions requirements/pytorch/check-avail-extras.py
@@ -4,7 +4,3 @@
 import matplotlib  # noqa: F401
 import omegaconf  # noqa: F401
 import rich  # noqa: F401
-import torch
-
-if torch.cuda.is_available():
-    import bitsandbytes  # noqa: F401

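The deleted block is the piece that tried to import bitsandbytes whenever CUDA was available; as the FixMe in the new CI step notes, that import is not compatible with `triton.ops` on PyTorch 2.6, so the probe moves into the CI steps above. If a script-level probe were still wanted, a tolerant sketch could look like this (illustrative only, not part of the commit):

    import importlib

    # probe bitsandbytes without letting an incompatible triton/torch stack crash the script
    try:
        importlib.import_module("bitsandbytes")
        print("bitsandbytes is importable")
    except Exception as err:  # e.g. an import error around `triton.ops` on PyTorch 2.6
        print(f"bitsandbytes unavailable or incompatible: {err}")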