
upgrade transformers to 4.49 for patching models #1196

Merged
merged 3 commits into from
Mar 17, 2025

Conversation

jiqing-feng
Collaborator

@jiqing-feng jiqing-feng commented Mar 12, 2025

Hi @echarlaix @IlyasMoutawwakil. I updated the transformers version to 4.49.0 in the ipex backend. Please review it, thanks!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@jiqing-feng jiqing-feng marked this pull request as ready for review March 12, 2025 09:23
Signed-off-by: jiqing-feng <jiqing.feng@intel.com>
@jiqing-feng
Collaborator Author

Hi @echarlaix @IlyasMoutawwakil . Please review this PR, thanks!

@jiqing-feng jiqing-feng mentioned this pull request Mar 14, 2025
**kwargs,
) -> None:
super().__init__()
self.max_batch_size = max_batch_size
default_device = torch.device("xpu") if ipex._C._has_xpu() else torch.device("cpu")
Member

this seems more like a global variable. Also, if I understand correctly: if an xpu device is available and ipex has xpu support, it's okay, but what if an xpu device is available and the current ipex build doesn't have xpu support? Maybe warning the user and then proceeding with the cpu device makes sense.
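The reviewer's suggestion could be sketched roughly as below. This is a hypothetical illustration, not the PR's actual code: the two boolean flags stand in for real probes such as `torch.xpu.is_available()` and the private `ipex._C._has_xpu()` check quoted above.

```python
import warnings

def pick_default_device(xpu_hardware_available: bool, ipex_has_xpu: bool) -> str:
    """Hypothetical sketch: choose "xpu" only when both the hardware and the
    installed ipex build support it; otherwise warn (if applicable) and fall
    back to "cpu" rather than failing."""
    if xpu_hardware_available:
        if ipex_has_xpu:
            return "xpu"
        # XPU hardware is present but this ipex build lacks XPU support:
        # surface the mismatch to the user, then proceed on CPU.
        warnings.warn(
            "XPU device detected, but the installed intel-extension-for-pytorch "
            "build has no XPU support; falling back to CPU."
        )
    return "cpu"
```

In the real code the returned string would feed `torch.device(...)` when building the default device.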

setup.py Outdated
@@ -67,7 +67,7 @@
"nncf": ["nncf>=2.14.0"],
"openvino": ["nncf>=2.14.0", "openvino>=2024.5.0", "openvino-tokenizers>=2024.5.0"],
"neural-compressor": ["neural-compressor[pt]>3.0", "accelerate", "transformers<4.46"],
"ipex": ["intel-extension-for-pytorch>=2.4", "transformers>4.46,<4.48", "accelerate"],
"ipex": ["intel-extension-for-pytorch>=2.4", "transformers>4.48,<4.50", "accelerate"],
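For readers unfamiliar with the pin syntax: `transformers>4.48,<4.50` is an exclusive range, so 4.49.x satisfies it while anything at or below 4.48 and anything from 4.50 up does not. A minimal sketch of that semantics, using plain integer tuples (the hypothetical `transformers_version_ok` helper is for illustration only; real tooling uses PEP 440 comparison via `packaging.version`, which additionally treats 4.48.0 as equal to 4.48):

```python
def _parse(v: str) -> tuple:
    """Naive dotted-version parser for this sketch, e.g. "4.49.0" -> (4, 49, 0)."""
    return tuple(int(part) for part in v.split("."))

def transformers_version_ok(installed: str) -> bool:
    """Check the new pin from setup.py: transformers>4.48,<4.50 (both exclusive)."""
    return _parse("4.48") < _parse(installed) < _parse("4.50")
```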
Member

we are only testing ipex 2.6.0 — shouldn't we test the minimum version as well? If it still works, of course; otherwise we should update the minimum version.


@IlyasMoutawwakil IlyasMoutawwakil left a comment


overall LGTM, just a couple nits.

@jiqing-feng
Collaborator Author

Hi @IlyasMoutawwakil. I have addressed your comments!

@IlyasMoutawwakil IlyasMoutawwakil merged commit 81c1104 into huggingface:main Mar 17, 2025
14 of 16 checks passed

5 participants