
Update InferenceClient docstring to reflect that token=False is no longer accepted #2853

Merged · 4 commits merged into main · Feb 27, 2025

Conversation

abidlabs (Member) commented Feb 12, 2025

Just wondering -- is there still any way not to pass in the local hf token?

abidlabs requested a review from Wauplin on February 12, 2025

hanouticelina (Contributor) commented

@abidlabs in practice, the HF Inference API still supports passing token=False. We removed it to simplify the logic and avoid having to handle a boolean token for external providers.

julien-c (Member) commented

but the HF Inference API requires a token, right? (or if not, it will very very soon)

Wauplin (Contributor) commented Feb 13, 2025

I noticed our internal logic for token=False doesn't really work at the moment (local token is still passed). I'll fix this in a follow-up PR.

And yes with the new inference providers, even HF Inference requires a token now.
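For context, a minimal sketch of what authenticated usage looks like now that a token is required; the "hf_xxx" value is a placeholder, and the fallback to the locally saved token reflects the client's default behavior.

```python
from huggingface_hub import InferenceClient

# With inference providers, HF Inference calls need authentication.
# With no explicit token, the client falls back to the locally saved one
# (e.g. the token stored by `huggingface-cli login`).
client = InferenceClient()

# Or pass a token explicitly ("hf_xxx" is a placeholder, not a real token):
client = InferenceClient(token="hf_xxx")
```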

Wauplin (Contributor) left a comment


Hey! Sorry, it took me some time to get back to this PR. I've pushed a new commit so that passing token=False raises an error. This is a breaking change of an already-broken behavior, so it's OK to have it IMO. Previously, passing token=False resulted in the token being sent anyway, which is the exact opposite of the user's intention. As a reminder, authentication is now required for all HF Inference calls, so it no longer makes sense to pass token=False.

I also took the opportunity to remove some tests that were failing due to authentication issues. They were tests around get_model_status and list_deployed_models which are deprecated anyway.
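To illustrate the new behavior described above, a minimal sketch; the exact exception type and message are assumptions (a ValueError is assumed here), not confirmed by this thread.

```python
from huggingface_hub import InferenceClient

# Sketch of the new behavior: explicitly disabling authentication is rejected
# at client creation. The exact exception type is an assumption (ValueError here).
try:
    InferenceClient(token=False)
except ValueError as err:
    print(f"token=False is no longer accepted: {err}")
```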

Wauplin (Contributor) commented Feb 26, 2025

@hanouticelina @julien-c mind re-reviewing it?

hanouticelina (Contributor) left a comment


thanks!

Wauplin merged commit f1d0bf8 into main on Feb 27, 2025 (19 checks passed).
Wauplin deleted the abidlabs-patch-1 branch on February 27, 2025 at 09:53.