Merge branch 'main' into feat/sorting

AlpinDale authored Feb 28, 2025
2 parents 99da73f + 87c3eaf commit 1a023f3
Showing 161 changed files with 62,181 additions and 51,736 deletions.
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug-report.yml
@@ -24,7 +24,7 @@ body:
id: logs
attributes:
label: Logs
description: "Please include the Python logs if you can."
description: "Please include the Python logs if you can. If possible, run the code with `HF_DEBUG=1` as environment variable."
render: shell
- type: textarea
id: system-info
54 changes: 42 additions & 12 deletions .github/workflows/python-prerelease.yml
@@ -6,48 +6,78 @@ on:
- "v*.rc*"

jobs:
test_on_transformers:
trigger_rc_testing:
runs-on: ubuntu-latest

strategy:
fail-fast: false
matrix:
target-repo: ["transformers", "datasets", "diffusers"]

steps:
- name: Extract version from tag
id: get-version
run: echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT

- name: Checkout transformers
- name: Checkout target repo
uses: actions/checkout@v4
with:
repository: huggingface/transformers
path: transformers
repository: huggingface/${{ matrix.target-repo }}
path: ${{ matrix.target-repo }}
token: ${{ secrets.HUGGINGFACE_HUB_AUTOMATIC_RC_TESTING }}

- name: Configure Git
run: |
cd transformers
cd ${{ matrix.target-repo }}
git config user.name "Hugging Face Bot (RC Testing)"
git config user.email "bot@huggingface.co"
- name: Wait for prerelease to be out on PyPI
run: |
VERSION=${{ steps.get-version.outputs.VERSION }}
echo "Waiting for huggingface-hub==${VERSION} to be available on PyPI"
while ! pip install huggingface-hub==${VERSION}; do
echo "huggingface-hub==${VERSION} not available yet, retrying in 15s"
sleep 15
done
- name: Create test branch and update dependencies
id: create-pr
run: |
cd transformers
cd ${{ matrix.target-repo }}
VERSION=${{ steps.get-version.outputs.VERSION }}
BRANCH_NAME="ci-test-huggingface-hub-${VERSION}"
# Create and checkout new branch
git checkout -b $BRANCH_NAME
# Update dependencies using sed
sed -i -E 's/"huggingface-hub>=0.*"/"huggingface-hub=='${VERSION}'"/' setup.py
sed -i -E 's/"huggingface-hub>=0.*"/"huggingface-hub=='${VERSION}'"/' src/transformers/dependency_versions_table.py
sed -i -E "s/\"huggingface-hub>=0.*\"/\"huggingface-hub==${VERSION}\"/" setup.py
git add setup.py
# Only if the target repo is transformers
if [ "${{ matrix.target-repo }}" = "transformers" ]; then
sed -i -E "s/\"huggingface-hub>=0.*\"/\"huggingface-hub==${VERSION}\"/" src/transformers/dependency_versions_table.py
git add src/transformers/dependency_versions_table.py
fi
# Only if the target repo is diffusers
if [ "${{ matrix.target-repo }}" = "diffusers" ]; then
sed -i -E "s/\"huggingface-hub\":.*/\"huggingface-hub\": \"huggingface-hub==${VERSION}\",/" src/diffusers/dependency_versions_table.py
git add src/diffusers/dependency_versions_table.py
fi
# Any line with `uv pip install` in the `.github/` folder must be updated with the `--prerelease=allow` flag
find .github/workflows/ -type f -exec sed -i 's/uv pip install /uv pip install --prerelease=allow /g' {} +
git add .github/workflows/
# Commit and push changes
git --no-pager diff
git add setup.py src/transformers/dependency_versions_table.py
git --no-pager diff --staged
git commit -m "Test hfh ${VERSION}"
git push --set-upstream origin $BRANCH_NAME
- name: Print URLs for manual check
run: |
VERSION=${{ steps.get-version.outputs.VERSION }}
echo "https://github.com/huggingface/transformers/actions"
echo "https://github.com/huggingface/transformers/compare/main...ci-test-huggingface-hub-${VERSION}"
echo "https://github.com/huggingface/${{ matrix.target-repo }}/actions"
echo "https://github.com/huggingface/${{ matrix.target-repo }}/compare/main...ci-test-huggingface-hub-${VERSION}"
12 changes: 10 additions & 2 deletions .github/workflows/python-tests.yml
@@ -26,6 +26,7 @@ jobs:
[
"Repository only",
"Everything else",
"Inference only"

]
include:
@@ -64,7 +65,7 @@
case "${{ matrix.test_name }}" in
"Repository only" | "Everything else")
"Repository only" | "Everything else" | "Inference only")
sudo apt update
sudo apt install -y libsndfile1-dev
;;
@@ -112,8 +113,15 @@
eval $PYTEST
;;
"Inference only")
# Run inference tests concurrently
PYTEST="$PYTEST ../tests -k 'test_inference' -n 4"
echo $PYTEST
eval $PYTEST
;;
"Everything else")
PYTEST="$PYTEST ../tests -k 'not TestRepository' -n 4"
PYTEST="$PYTEST ../tests -k 'not TestRepository and not test_inference' -n 4"
echo $PYTEST
eval $PYTEST
;;
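For context, a minimal sketch of how the `-k 'test_inference'` selection behaves, using a hypothetical test module (not from this repository); `pytest -k 'test_inference' -n 4` would collect only the first function and distribute the selected tests across 4 workers via pytest-xdist.

```python
# test_example.py -- hypothetical module for illustration only.

def test_inference_text_generation():
    # Collected by `-k 'test_inference'`.
    assert 1 + 1 == 2

def test_repository_clone():
    # Deselected by `-k 'test_inference'`; collected by `-k 'not test_inference'`.
    assert "hub" in "huggingface_hub"
```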
2 changes: 2 additions & 0 deletions .github/workflows/trufflehog.yml
@@ -13,3 +13,5 @@ jobs:
fetch-depth: 0
- name: Secret Scanning
uses: trufflesecurity/trufflehog@main
with:
extra_args: --results=verified,unknown
2 changes: 1 addition & 1 deletion docs/source/de/guides/inference.md
@@ -107,7 +107,7 @@ Das Ziel von [`InferenceClient`] ist es, die einfachste Schnittstelle zum Ausfü
| | [Feature Extraction](https://huggingface.co/tasks/feature-extraction) || [`~InferenceClient.feature_extraction`] |
| | [Fill Mask](https://huggingface.co/tasks/fill-mask) || [`~InferenceClient.fill_mask`] |
| | [Question Answering](https://huggingface.co/tasks/question-answering) || [`~InferenceClient.question_answering`] |
| | [Sentence Similarity](https://huggingface.co/tasks/sentence-similarity) || [`~InferenceClient.sentence_similarity`] |
| | [Sentence Similarity](https://huggingface.co/tasks/sentence-similarity) || [`~InferenceClient.sentence_similarity`] |
| | [Summarization](https://huggingface.co/tasks/summarization) || [`~InferenceClient.summarization`] |
| | [Table Question Answering](https://huggingface.co/tasks/table-question-answering) || [`~InferenceClient.table_question_answering`] |
| | [Text Classification](https://huggingface.co/tasks/text-classification) || [`~InferenceClient.text_classification`] |
60 changes: 30 additions & 30 deletions docs/source/en/guides/inference.md
@@ -248,36 +248,36 @@ You might wonder why using [`InferenceClient`] instead of OpenAI's client? There

[`InferenceClient`]'s goal is to provide the easiest interface to run inference on Hugging Face models, on any provider. It has a simple API that supports the most common tasks. Here is a table showing which providers support which tasks:

| Domain | Task | HF Inference | Replicate | fal-ai | Sambanova | Together |
| ------------------- | --------------------------------------------------- | ------------ | --------- | ------ | --------- | -------- |
| **Audio** | [`~InferenceClient.audio_classification`] || ||||
| | [`~InferenceClient.audio_to_audio`] || ||||
| | [`~InferenceClient.automatic_speech_recognition`] ||| |||
| | [`~InferenceClient.text_to_speech`] | |||||
| **Computer Vision** | [`~InferenceClient.image_classification`] || ||||
| | [`~InferenceClient.image_segmentation`] || ||||
| | [`~InferenceClient.image_to_image`] || ||||
| | [`~InferenceClient.image_to_text`] || ||||
| | [`~InferenceClient.object_detection`] || ||||
| | [`~InferenceClient.text_to_image`] ||||||
| | [`~InferenceClient.text_to_video`] || | |||
| | [`~InferenceClient.zero_shot_image_classification`] || ||||
| **Multimodal** | [`~InferenceClient.document_question_answering`] || ||||
| | [`~InferenceClient.visual_question_answering`] || ||||
| **NLP** | [`~InferenceClient.chat_completion`] || | |||
| | [`~InferenceClient.feature_extraction`] || ||||
| | [`~InferenceClient.fill_mask`] || ||||
| | [`~InferenceClient.question_answering`] || ||||
| | [`~InferenceClient.sentence_similarity`] || ||||
| | [`~InferenceClient.summarization`] || ||||
| | [`~InferenceClient.table_question_answering`] || ||||
| | [`~InferenceClient.text_classification`] || ||||
| | [`~InferenceClient.text_generation`] || | |||
| | [`~InferenceClient.token_classification`] || ||||
| | [`~InferenceClient.translation`] || ||||
| | [`~InferenceClient.zero_shot_classification`] || ||||
| **Tabular** | [`~InferenceClient.tabular_classification`] || ||||
| | [`~InferenceClient.tabular_regression`] || ||||
| Domain | Task | Black Forest Labs | HF Inference | fal-ai | Fireworks AI | Hyperbolic | Nebius AI Studio | Novita AI | Replicate | Sambanova | Together |
| ------------------- | --------------------------------------------------- | ---------------- | ------------ | ------ | --------- | ---------- | ---------------- | ------ | --------- | --------- | ----------- |
| **Audio** | [`~InferenceClient.audio_classification`] | | ||| | | ||| |
| | [`~InferenceClient.audio_to_audio`] | | ||| | | ||| |
| | [`~InferenceClient.automatic_speech_recognition`] | | | | | | | ||| |
| | [`~InferenceClient.text_to_speech`] | | ||| ||||| |
| **Computer Vision** | [`~InferenceClient.image_classification`] | | ||| | | ||| |
| | [`~InferenceClient.image_segmentation`] | | ||| | | ||| |
| | [`~InferenceClient.image_to_image`] | | ||| | | ||| |
| | [`~InferenceClient.image_to_text`] | | ||| | | ||| |
| | [`~InferenceClient.object_detection`] | | ||| | | ||| |
| | [`~InferenceClient.text_to_image`] | | ||| ||||| |
| | [`~InferenceClient.text_to_video`] | || ||||| || |
| | [`~InferenceClient.zero_shot_image_classification`] | | ||| | | ||| |
| **Multimodal** | [`~InferenceClient.document_question_answering`] | | ||| | | ||| |
| | [`~InferenceClient.visual_question_answering`] | | ||| | | ||| |
| **NLP** | [`~InferenceClient.chat_completion`] | | |||||| || |
| | [`~InferenceClient.feature_extraction`] | | ||| | | ||| |
| | [`~InferenceClient.fill_mask`] | | ||| | | ||| |
| | [`~InferenceClient.question_answering`] | | ||| | | ||| |
| | [`~InferenceClient.sentence_similarity`] | | ||| | | ||| |
| | [`~InferenceClient.summarization`] | | ||| | | ||| |
| | [`~InferenceClient.table_question_answering`] | | ||| | | ||| |
| | [`~InferenceClient.text_classification`] | | ||| | | ||| |
| | [`~InferenceClient.text_generation`] | | |||||| || |
| | [`~InferenceClient.token_classification`] | | ||| | | ||| |
| | [`~InferenceClient.translation`] | | ||| | | ||| |
| | [`~InferenceClient.zero_shot_classification`] | | ||| | | ||| |
| **Tabular** | [`~InferenceClient.tabular_classification`] | | ||| | | ||| |
| | [`~InferenceClient.tabular_regression`] | | ||| | | ||| |
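As a complement to the table above, here is a minimal sketch of calling one of these tasks through the client; the provider and model are illustrative assumptions and a valid token for the chosen provider is required.

```python
from huggingface_hub import InferenceClient

# Illustrative only: provider and model are assumptions, not taken from this commit.
client = InferenceClient(provider="hf-inference")

# Chat completion, one of the NLP tasks listed in the table above.
response = client.chat_completion(
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    model="meta-llama/Llama-3.1-8B-Instruct",
    max_tokens=64,
)
print(response.choices[0].message.content)
```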

<Tip>

14 changes: 9 additions & 5 deletions docs/source/en/package_reference/environment_variables.md
@@ -83,6 +83,10 @@ The following environment variables expect a boolean value. The variable will be
as `True` if its value is one of `{"1", "ON", "YES", "TRUE"}` (case-insensitive). Any other value
(or undefined) will be considered as `False`.

### HF_DEBUG

If set, the log level for the `huggingface_hub` logger is set to DEBUG. Additionally, all requests made by HF libraries will be logged as equivalent cURL commands for easier debugging and reproducibility.
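For illustration, a minimal sketch of enabling it from Python; the `gpt2` repo is only an example, and exporting the variable in the shell before launching the process works equally well.

```python
import os

# Set before importing huggingface_hub so the debug configuration is picked up.
os.environ["HF_DEBUG"] = "1"

from huggingface_hub import model_info

# With HF_DEBUG enabled, the underlying HTTP call should also be logged as an
# equivalent cURL command at DEBUG level.
print(model_info("gpt2").sha)
```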

### HF_HUB_OFFLINE

If set, no HTTP calls will be made to the Hugging Face Hub. If you try to download files, only the cached files will be accessed. If no cached file is found, an error is raised. This is useful when your network is slow and you don't care about having the latest version of a file.
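A minimal sketch of offline mode, assuming `config.json` from the `gpt2` repo was already downloaded (and therefore cached) in an earlier, online session:

```python
import os

# Set before importing huggingface_hub so offline mode is picked up.
os.environ["HF_HUB_OFFLINE"] = "1"

from huggingface_hub import hf_hub_download

# Served from the local cache; raises an error if the file was never cached.
path = hf_hub_download(repo_id="gpt2", filename="config.json")
print(path)
```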
@@ -159,11 +163,11 @@ Please note that using `hf_transfer` comes with certain limitations. Since it is
In order to standardize all environment variables within the Hugging Face ecosystem, some variables have been marked as deprecated. Although they remain functional, they no longer take precedence over their replacements. The following table outlines the deprecated variables and their corresponding alternatives:


| Deprecated Variable | Replacement |
| --- | --- |
| `HUGGINGFACE_HUB_CACHE` | `HF_HUB_CACHE` |
| `HUGGINGFACE_ASSETS_CACHE` | `HF_ASSETS_CACHE` |
| `HUGGING_FACE_HUB_TOKEN` | `HF_TOKEN` |
| Deprecated Variable | Replacement |
| --------------------------- | ------------------ |
| `HUGGINGFACE_HUB_CACHE` | `HF_HUB_CACHE` |
| `HUGGINGFACE_ASSETS_CACHE` | `HF_ASSETS_CACHE` |
| `HUGGING_FACE_HUB_TOKEN` | `HF_TOKEN` |
| `HUGGINGFACE_HUB_VERBOSITY` | `HF_HUB_VERBOSITY` |
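To make the precedence rule concrete, a minimal sketch assuming both the deprecated and the replacement token variables are set; `get_token` is expected to return the value of the non-deprecated one.

```python
import os

# Both variables set: the deprecated name should be ignored in favor of its replacement.
os.environ["HUGGING_FACE_HUB_TOKEN"] = "hf_old_dummy_token"  # deprecated
os.environ["HF_TOKEN"] = "hf_new_dummy_token"                # replacement

from huggingface_hub import get_token

print(get_token())  # expected: hf_new_dummy_token
```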

## From external tools
2 changes: 0 additions & 2 deletions docs/source/en/package_reference/inference_types.md
@@ -309,8 +309,6 @@ This part of the lib is still under development and will be improved in future r

[[autodoc]] huggingface_hub.TextToImageParameters

[[autodoc]] huggingface_hub.TextToImageTargetSize



## text_to_speech