
add sdpa for phi3 openvino model #705

Merged: 5 commits merged into huggingface:main on May 15, 2024
Conversation

@eaidova (Collaborator) commented on May 13, 2024

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
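For context on the change named in the title ("add sdpa for phi3 openvino model"): the idea is to route the Phi-3 attention through torch.nn.functional.scaled_dot_product_attention instead of an explicit matmul-plus-softmax path before OpenVINO export. Below is a minimal sketch of that pattern, with illustrative function and variable names rather than the actual optimum-intel patch:

```python
import torch
import torch.nn.functional as F


def repeat_kv(hidden_states: torch.Tensor, n_rep: int) -> torch.Tensor:
    # Expand key/value heads so they match the number of query heads (grouped-query attention).
    batch, num_kv_heads, seq_len, head_dim = hidden_states.shape
    if n_rep == 1:
        return hidden_states
    hidden_states = hidden_states[:, :, None, :, :].expand(batch, num_kv_heads, n_rep, seq_len, head_dim)
    return hidden_states.reshape(batch, num_kv_heads * n_rep, seq_len, head_dim)


def sdpa_attention(query, key, value, attention_mask=None, num_kv_groups=1):
    # query: (batch, num_heads, q_len, head_dim); key/value: (batch, num_kv_heads, kv_len, head_dim)
    key = repeat_kv(key, num_kv_groups)
    value = repeat_kv(value, num_kv_groups)
    # scaled_dot_product_attention applies the 1/sqrt(head_dim) scaling and the mask internally,
    # which is what an SDPA-based attention forward replaces the eager implementation with.
    return F.scaled_dot_product_attention(query, key, value, attn_mask=attention_mask)
```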

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@eaidova (Collaborator, Author) commented on May 14, 2024

@echarlaix @AlexKoff88 could you please take a look?

Review comment on the lines:

use_cache=use_cache,
)

from transformers.models.llama.modeling_llama import apply_rotary_pos_emb, repeat_kv

A collaborator commented:

Even if they are currently equivalent, shouldn't we replace it with

Suggested change
- from transformers.models.llama.modeling_llama import apply_rotary_pos_emb, repeat_kv
+ from transformers.models.phi3.modeling_phi3 import apply_rotary_pos_emb, repeat_kv

to avoid any issue resulting from potential refactoring?

@eaidova (Collaborator, Author) replied on May 15, 2024:

I believe it will break with the current release, because the phi3 code in transformers has not been released on PyPI yet... Can we add a TODO to change this in the future (currently the model is loaded with trust_remote_code on the stable transformers release)?
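A minimal sketch of what the agreed TODO could look like, keeping the llama import for now (illustrative only, not the exact code merged in this PR):

```python
# TODO: import these helpers from transformers.models.phi3.modeling_phi3 once the
# phi3 modeling code ships in a released transformers version; until then the model
# is loaded via trust_remote_code and the (currently equivalent) llama helpers are reused.
from transformers.models.llama.modeling_llama import apply_rotary_pos_emb, repeat_kv
```

For reference, loading Phi-3 through the OpenVINO frontend with trust_remote_code, as described above, could look like this (the model id is an example, not taken from the PR):

```python
from optimum.intel import OVModelForCausalLM

# export=True converts the PyTorch checkpoint to OpenVINO IR on the fly;
# trust_remote_code pulls the Phi-3 modeling code from the Hub because it is
# not yet part of the installed transformers release.
model = OVModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct",  # example model id (assumption)
    export=True,
    trust_remote_code=True,
)
```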

A collaborator replied:

Yes, works for me, thanks @eaidova!

@echarlaix merged commit 8c2b787 into huggingface:main on May 15, 2024. 11 checks passed.