add XPU support for IPEXModel.from_pretrained
#704
Conversation
@echarlaix, please have a review. Thanks!
Hi @IlyasMoutawwakil, could you help review this PR? Thanks a lot!
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
To fix the code style test, you can do the following:
no device_map
Co-authored-by: Ella Charlaix <80481427+echarlaix@users.noreply.github.com>
Hi @echarlaix, some CI tests are failing. Do you know what I can do to fix them? Thanks!
LGTM, thanks for the addition @faaany !
What does this PR do?
This PR adds XPU support for loading a TorchScript model using `IPEXModel.from_pretrained`. Below is a test example.
Please note that `faaany/bert-base-uncased-float32-traced` is just a test model, traced from `google-bert/bert-base-uncased`. Since I didn't load any tokenizer config, I pass `model_id_tokenizer` to `AutoTokenizer` in the example above.
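The test example itself was not captured in this page, but the mechanism the PR relies on can be sketched with plain PyTorch: a traced TorchScript module is moved to the `xpu` device when one is available, falling back to CPU otherwise. Note this is a minimal sketch, not the PR's actual test: the `Tiny` module is a hypothetical stand-in for the traced BERT model, and the `torch.xpu` availability check assumes a PyTorch build with Intel XPU support (e.g. via intel-extension-for-pytorch).

```python
import torch

class Tiny(torch.nn.Module):
    # Hypothetical stand-in for a real model such as the traced BERT above.
    def forward(self, x):
        return x * 2.0

# Pick the XPU device when a PyTorch build with Intel XPU support detects one;
# otherwise fall back to CPU so the sketch also runs on machines without an XPU.
device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"

# Trace the module to TorchScript; IPEXModel works with traced models like this.
traced = torch.jit.trace(Tiny().eval(), torch.ones(2))
traced = traced.to(device)

out = traced(torch.ones(2, device=device))
print(out.cpu().tolist())  # [2.0, 2.0]
```

With optimum-intel installed, the point of the PR is that `IPEXModel.from_pretrained(model_id)` can now load such a traced model onto an XPU device directly, instead of being limited to CPU.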