Commit 5754422

huggingface/xpu: update transformers test instruction
Signed-off-by: Dmitry Rogozhkin <dmitry.v.rogozhkin@intel.com>
1 parent 0120c7b commit 5754422

2 files changed (+27 -3 lines)

huggingface/accelerate-with-pytorch-xpu.md (+13 -1)

@@ -6,13 +6,25 @@ Huggingface Accelerate supports PyTorch XPU backend starting from version `0.32`.

 # Testing

-Accelerate library can be formally tested for the XPU backend with the tests included in the project, on a system with supported Intel GPU graphics:
+Accelerate library can be formally tested for the XPU backend with the tests included in the project, on a system with supported Intel GPU graphics. To run tests:
+
+* Install Accelerate library and test dependencies:
+
+```
+pip install -e .
+pip install -e ".[dev]"
+```
+
+* Execute tests:

 ```
 cd /path/to/accelerate/clone/copy
 python3 -m pytest tests
 ```

+> [!NOTE]
+> Accelerate depends on the PyTorch `torch` package, which gets installed during installation of `accelerate`. If you need to test against a different version of PyTorch, reinstall it after installing Accelerate.
+
 # Using with PyTorch 2.5

 | Accelerate | Failed | Passed | Skipped |
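
The note above mentions reinstalling PyTorch after Accelerate when a specific build is needed. A minimal sketch of that flow, assuming XPU-enabled `torch` wheels are published on the `https://download.pytorch.org/whl/xpu` index (verify the index and version for your setup):

```
# Sketch: install Accelerate with test dependencies, then swap in a specific torch build.
# Assumption: XPU-enabled torch wheels are available from the index below.
pip install -e ".[dev]"
pip install --force-reinstall torch --index-url https://download.pytorch.org/whl/xpu
# Sanity check that the reinstalled torch still sees the XPU device.
python3 -c "import torch; print(torch.__version__, torch.xpu.is_available())"
```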

huggingface/transformers-with-pytorch-xpu.md (+14 -2)

@@ -6,7 +6,16 @@ Huggingface Transformers support PyTorch XPU backend starting from version `4.42`.

 # Testing

-Transformers library can be formally tested for the XPU backend with the tests included in the project, on a system with supported Intel GPU graphics. To run tests, first create a `spec.py` file with the following contents:
+Transformers library can be formally tested for the XPU backend with the tests included in the project, on a system with supported Intel GPU graphics. To run tests:
+
+* Install Transformers library and test dependencies:
+
+```
+pip install -e .
+pip install -e ".[dev]"
+```
+
+* Create a `spec.py` file with the following contents:

 ```
 import torch
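
The `spec.py` file is split across this hunk and the next one, which only show its first and last lines. A complete file consistent with those lines might look like the sketch below; the `DEVICE_NAME` and `MANUAL_SEED_FN` entries are assumptions based on the Transformers device-spec convention and do not appear in this diff:

```
# Sketch: write spec.py for the XPU device spec.
# Assumption: DEVICE_NAME and MANUAL_SEED_FN follow the usual Transformers spec layout.
cat > spec.py <<'EOF'
import torch

DEVICE_NAME = "xpu"
MANUAL_SEED_FN = torch.xpu.manual_seed
EMPTY_CACHE_FN = torch.xpu.empty_cache
DEVICE_COUNT_FN = torch.xpu.device_count
EOF
```
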
@@ -18,13 +27,16 @@ EMPTY_CACHE_FN = torch.xpu.empty_cache
 DEVICE_COUNT_FN = torch.xpu.device_count
 ```

-After that, tests can be executed as follows:
+* After that, tests can be executed as follows:

 ```
 cd /path/to/transformers/clone/copy
 TRANSFORMERS_TEST_DEVICE_SPEC=spec.py python3 -m pytest --pspec tests/
 ```

+> [!NOTE]
+> Some Transformers tests depend on `torchvision` built with `libPNG`. Install it on Ubuntu with `apt-get install libpng-dev`, then rebuild `torchvision` and rerun the affected Transformers tests.
+
 # Using with PyTorch 2.5

 The XPU backend in PyTorch 2.5 greatly improved coverage of supported aten eager mode operators, with just a few missing for the executed Transformers tests. They are much easier to identify now since [intel/torch-xpu-ops#318] is available in PyTorch 2.5 and adding `PYTORCH_ENABLE_XPU_FALLBACK=1` works smoothly.
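
As a usage sketch, the fallback can be combined with the test invocation shown in the hunk above (same clone path and `spec.py` as in the instructions):

```
# Sketch: rerun the Transformers tests with CPU fallback for aten ops
# that the XPU backend does not yet implement.
cd /path/to/transformers/clone/copy
PYTORCH_ENABLE_XPU_FALLBACK=1 TRANSFORMERS_TEST_DEVICE_SPEC=spec.py \
  python3 -m pytest --pspec tests/
```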

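The torchvision note in the diff above implies a source rebuild; a rough sketch on Ubuntu, assuming a local checkout of `https://github.com/pytorch/vision`:

```
# Sketch: install libPNG headers, then rebuild torchvision against the installed torch.
# Assumption: /path/to/vision is a local clone of the torchvision repository.
sudo apt-get install -y libpng-dev
cd /path/to/vision
pip install --no-build-isolation -v .
```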