Commit 23a7cbc
[CI/Build] Fixed the xla nightly issue reported in vllm-project#12451 (vllm-project#12453)
1 parent 426a5c3 commit 23a7cbc

File tree

1 file changed: +8 −11 lines

requirements-tpu.txt

@@ -10,17 +10,14 @@ wheel
 jinja2
 ray[default]
 
-# Install torch, torch_xla
+# Install torch_xla
+--pre
+--extra-index-url https://download.pytorch.org/whl/nightly/cpu
+--find-links https://storage.googleapis.com/libtpu-wheels/index.html
 --find-links https://storage.googleapis.com/libtpu-releases/index.html
 --find-links https://storage.googleapis.com/jax-releases/jax_nightly_releases.html
 --find-links https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
-# Note: This torch whl can be slightly different from the official torch nightly whl
-# since they are not built on the same commit (but on the same day). This difference may cause C++ undefined symbol issue
-# if some change between the 2 commits introduce some C++ API change.
-# Here we install the exact torch whl from which torch_xla is built from, to avoid potential C++ undefined symbol issue.
-torch @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch-2.7.0.dev20250124-cp39-cp39-linux_x86_64.whl ; python_version == "3.9"
-torch @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch-2.7.0.dev20250124-cp310-cp310-linux_x86_64.whl ; python_version == "3.10"
-torch @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch-2.7.0.dev20250124-cp311-cp311-linux_x86_64.whl ; python_version == "3.11"
-torch_xla[pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev20250124-cp39-cp39-linux_x86_64.whl ; python_version == "3.9"
-torch_xla[pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev20250124-cp310-cp310-linux_x86_64.whl ; python_version == "3.10"
-torch_xla[pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev20250124-cp311-cp311-linux_x86_64.whl ; python_version == "3.11"
+torch==2.6.0.dev20241216+cpu
+torch_xla[tpu, pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev20250124-cp39-cp39-linux_x86_64.whl ; python_version == "3.9"
+torch_xla[tpu, pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev20250124-cp310-cp310-linux_x86_64.whl ; python_version == "3.10"
+torch_xla[tpu, pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.7.0.dev20250124-cp311-cp311-linux_x86_64.whl ; python_version == "3.11"

0 commit comments