This repository has been archived by the owner on Feb 18, 2025. It is now read-only.

Commit

Update vllm_ascend/attention.py
Co-authored-by: Mengqing Cao <cmq0113@163.com>
wangxiyuan and MengqingCao authored Jan 13, 2025
1 parent dc63477 commit 98b01b8
Showing 1 changed file with 1 addition and 1 deletion: vllm_ascend/attention.py
@@ -19,7 +19,7 @@
                                      PagedAttentionMetadata)
 
 if TYPE_CHECKING:
-    from vllm_ascend.worker import ModelInputForNPUBuilder
+    from vllm_ascend.model_runner import ModelInputForNPUBuilder
 
 SHARE_MASK_TRIL_PREFIX_CACHE = None
 SHARE_MASK_TRIL = None
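The changed import sits inside an `if TYPE_CHECKING:` block, a standard Python pattern for imports that are needed only by static type checkers (e.g. to break circular imports or avoid a runtime dependency). A minimal sketch of the pattern, using `decimal.Decimal` as a stand-in for `ModelInputForNPUBuilder`:

```python
from typing import TYPE_CHECKING

# TYPE_CHECKING is False at runtime, so this import never executes;
# type checkers like mypy treat it as True and resolve the annotation.
if TYPE_CHECKING:
    from decimal import Decimal  # stand-in for ModelInputForNPUBuilder

def describe(x: "Decimal") -> str:
    # The annotation is a string (a forward reference), so Decimal
    # does not need to exist at runtime for this function to run.
    return str(x)

print(describe(3))
```

Because the guarded import is invisible at runtime, a wrong module path (here, `vllm_ascend.worker` instead of `vllm_ascend.model_runner`) would surface only when running a type checker, not when running the code, which is a plausible reason the mistake slipped through.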
