f"The task could not be automatically inferred. Please provide the argument --task with the relevant task from {', '.join(TasksManager.get_all_tasks())}. Detailed error: {e}"
f"Hugging Face Hub is not reachable and we cannot infer the task from a cached model. Make sure you are not offline, or otherwise please specify the `task` (or `--task` in command-line) argument ({', '.join(TasksManager.get_all_tasks())})."
Infers the task from the model repo, model instance, or model class.
@@ -1882,7 +1887,9 @@ def infer_task_from_model(
             token (`Optional[Union[bool,str]]`, defaults to `None`):
                 The token to use as HTTP bearer authorization for remote files. If `True`, will use the token generated
                 when running `huggingface-cli login` (stored in `huggingface_hub.constants.HF_TOKEN_PATH`).
-
+            library_name (`Optional[str]`, defaults to `None`):
+                The library name of the model. Can be any of "transformers", "timm", "diffusers", "sentence_transformers". See `TasksManager.infer_library_from_model` for the priority should
+                none be provided.
         Returns:
             `str`: The task name automatically detected from the HF hub repo, model instance, or model class.
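For context, a minimal usage sketch of the documented method, assuming the `optimum.exporters.tasks` import path and an illustrative repo id; only `infer_task_from_model`, `token`, and `library_name` come from the diff above:

from optimum.exporters.tasks import TasksManager

# Resolve the task for a Hub checkpoint, pinning the lookup to the
# `transformers` library instead of letting TasksManager infer it.
task = TasksManager.infer_task_from_model(
    "distilbert-base-uncased-finetuned-sst-2-english",  # illustrative repo id
    library_name="transformers",
)
print(task)  # e.g. "text-classification"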
                 Keyword arguments to pass to the model `.from_pretrained()` method.
+            library_name (`Optional[str]`, defaults to `None`):
+                The library name of the model. Can be any of "transformers", "timm", "diffusers", "sentence_transformers". See `TasksManager.infer_library_from_model` for the priority should
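The docstring defers to `TasksManager.infer_library_from_model` for the priority used when `library_name` is not provided. A small sketch of calling that method directly, with an illustrative repo id and assuming it accepts the model id as its first positional argument:

from optimum.exporters.tasks import TasksManager

# Ask TasksManager which library it would resolve the checkpoint to when
# `library_name` is left unset.
library = TasksManager.infer_library_from_model(
    "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative repo id
)
print(library)  # e.g. "transformers"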