
Commit a774d6a

Author: Tyler Titsworth
Merge branch 'main' into tylertitsworth/open-reqs
2 parents 453204e + 18f2760

36 files changed: +838 -67 lines changed

.github/workflows/container-ci.yaml

+1 -1

@@ -155,7 +155,7 @@ jobs:
       - name: Cleanup
         if: always()
         run: docker rmi -f ${{ secrets.REGISTRY }}/${{ secrets.REPO }}:${{ matrix.container }}
-      - uses: github/codeql-action/upload-sarif@429e1977040da7a23b6822b13c129cd1ba93dbb2 # v3.26.2
+      - uses: github/codeql-action/upload-sarif@2c779ab0d087cd7fe7b826087247c2c81f27bfa6 # v3.26.5
         with:
           sarif_file: '${{ matrix.container }}-scan.sarif'
           category: '${{ matrix.container }}'

.github/workflows/scorecard.yaml

+1 -1

@@ -53,6 +53,6 @@ jobs:
           name: SARIF file
           path: results.sarif
           retention-days: 5
-      - uses: github/codeql-action/upload-sarif@429e1977040da7a23b6822b13c129cd1ba93dbb2 # v3.26.2
+      - uses: github/codeql-action/upload-sarif@2c779ab0d087cd7fe7b826087247c2c81f27bfa6 # v3.26.5
         with:
           sarif_file: results.sarif

CONTRIBUTING.md

+2 -2

@@ -1,6 +1,6 @@
 # Contributing

-Thank you for considering contributing to Intel® AI Containers! We welcome your help to make this project better. Contributing to an open source project can be a daunting task, but the Intel AI Containers team is here to help you through the process. If at any point in this process you feel out of your depth or confused by our processes, please don't hesitate to reach out to a maintainer or file an [issue](https://github.com/intel/ai-containers/issues).
+Thank you for considering contributing to AI Containers! We welcome your help to make this project better. Contributing to an open source project can be a daunting task, but the Intel AI Containers team is here to help you through the process. If at any point in this process you feel out of your depth or confused by our processes, please don't hesitate to reach out to a maintainer or file an [issue](https://github.com/intel/ai-containers/issues).

 ## Getting Started

@@ -138,4 +138,4 @@ commit automatically with `git commit -s`.

 ## License

-Intel® AI Containers is licensed under the terms in [LICENSE](./LICENSE). By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
+AI Containers is licensed under the terms in [LICENSE](./LICENSE). By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

README.md

+2 -2

@@ -1,4 +1,4 @@
-# Intel® AI Containers
+# AI Containers

 [![OpenSSF Best Practices](https://www.bestpractices.dev/projects/8270/badge)](https://www.bestpractices.dev/projects/8270)
 [![OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/intel/ai-containers/badge)](https://securityscorecards.dev/viewer/?uri=github.com/intel/ai-containers)

@@ -28,7 +28,7 @@ docker login $REGISTRY
 docker pull $REGISTRY/$REPO:latest
 ```

-The maintainers of Intel® AI Containers use Azure to store containers, but an open source container registry like [harbor](https://github.com/goharbor/harbor) is preferred.
+The maintainers of AI Containers use Azure to store containers, but an open source container registry like [harbor](https://github.com/goharbor/harbor) is preferred.

 > [!WARNING]
 > You can optionally skip this step and use some placeholder values, however some container groups depend on other images and will pull from a registry that you have not defined and result in an error.

classical-ml/README.md

+1 -1

@@ -63,7 +63,7 @@ The images below additionally include [Jupyter Notebook](https://jupyter.org/) s

 ## Build from Source

-To build the images from source, clone the [Intel® AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
+To build the images from source, clone the [AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:

 ```bash
 cd classical-ml

docs/requirements.txt

+1 -1

@@ -1,7 +1,7 @@
 mkdocs-callouts>=1.13.2
 mkdocs-git-authors-plugin>=0.8.0
 mkdocs-git-revision-date-localized-plugin>=1.2.5
-mkdocs-material==9.5.32
+mkdocs-material==9.5.33
 mkdocs-table-reader-plugin>=2.1.0
 mkdocs==1.6.0
 pandas>=2.0.3

mkdocs.yml

+1 -1

@@ -51,7 +51,7 @@ plugins:
     - read_csv
 repo_name: intel/ai-containers
 repo_url: https://github.com/intel/ai-containers
-site_name: Intel® AI Containers
+site_name: AI Containers
 #TODO: Get previous container versions in an easy way
 # https://squidfunk.github.io/mkdocs-material/setup/setting-up-versioning/
 theme:

python/README.md

+1 -1

@@ -15,7 +15,7 @@ The images below include variations for only the core packages in the [Intel® D

 ## Build from Source

-To build the images from source, clone the [Intel® AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
+To build the images from source, clone the [AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:

 ```bash
 cd python

pytorch/Dockerfile

+90 -14

@@ -34,6 +34,7 @@ ARG BASE_IMAGE_TAG
 ARG PACKAGE_OPTION=pip
 ARG PYTHON_VERSION
 ARG PYTHON_BASE=${REGISTRY}/${REPO}:b-${GITHUB_RUN_NUMBER}-${BASE_IMAGE_NAME}-${BASE_IMAGE_TAG}-${PACKAGE_OPTION}-py${PYTHON_VERSION}-base
+ARG TORCHSERVE_BASE=${PYTHON_BASE}
 FROM ${PYTHON_BASE} AS ipex-base-pip

 WORKDIR /

@@ -181,13 +182,17 @@ RUN apt-get update && \
     intel-oneapi-runtime-mkl=${MKL_VER} \
     intel-oneapi-runtime-ccl=${CCL_VER};

+RUN rm -rf /etc/apt/sources.list.d/intel-gpu-jammy.list /etc/apt/sources.list.d/oneAPI.list
+
+ENV LD_LIBRARY_PATH=/opt/intel/oneapi/redist/lib:$LD_LIBRARY_PATH
+
+FROM ipex-xpu-base AS ipex-xpu-base-wheels
+
 WORKDIR /
 COPY xpu-requirements.txt .

 RUN python -m pip install --no-cache-dir -r xpu-requirements.txt && \
-    rm -rf xpu-requirements.txt /etc/apt/sources.list.d/intel-gpu-jammy.list /etc/apt/sources.list.d/oneAPI.list
-
-ENV LD_LIBRARY_PATH=/opt/intel/oneapi/redist/lib:$LD_LIBRARY_PATH
+    rm -rf xpu-requirements.txt

 FROM ipex-xpu-base AS ipex-xpu-jupyter

@@ -205,7 +210,8 @@ EXPOSE 8888

 CMD ["bash", "-c", "source /etc/bash.bashrc && jupyter notebook --notebook-dir=/jupyter --port 8888 --ip 0.0.0.0 --no-browser --allow-root --ServerApp.token= --ServerApp.password= --ServerApp.allow_origin=* --ServerApp.base_url=$NB_PREFIX"]

-FROM ${PYTHON_BASE} as torchserve-base
+
+FROM ${TORCHSERVE_BASE} as torchserve-base

 ENV PYTHONUNBUFFERED=TRUE

@@ -221,8 +227,6 @@ RUN useradd -m -s /bin/bash model-server && \
     mkdir -p /home/model-server/model-store && \
     chown -R model-server /home/model-server/

-FROM torchserve-base AS compile
-
 RUN apt-get update -y && apt-get install -y --no-install-recommends --fix-missing \
     g++ \
     git \

@@ -237,11 +241,6 @@ RUN python3 -m venv /home/venv
 ENV PATH="/home/venv/bin:$PATH"

 WORKDIR /home/model-server
-COPY torchserve-requirements.txt .
-COPY requirements.txt .
-
-RUN python -m pip install --no-cache-dir -r requirements.txt && \
-    python -m pip install --no-cache-dir -r torchserve-requirements.txt

 RUN echo -e "#!/bin/bash \n\
 set -e \n\

@@ -253,13 +252,29 @@ else \n\
 fi \n\
 tail -f /dev/null" >> /usr/local/bin/dockerd-entrypoint.sh

-FROM torchserve-base AS torchserve
+FROM torchserve-base AS compile-cpu
+
+COPY serving/torchserve-requirements.txt .
+COPY requirements.txt .
+
+RUN python -m pip install --no-cache-dir -r requirements.txt && \
+    python -m pip install --no-cache-dir -r torchserve-requirements.txt && \
+    rm -rf requirements.txt torchserve-requirements.txt
+
+FROM torchserve-base AS compile-xpu
+
+COPY serving/torchserve-xpu-requirements.txt .
+
+RUN python -m pip install --no-cache-dir -r torchserve-xpu-requirements.txt && \
+    rm -rf torchserve-xpu-requirements.txt
+
+FROM torchserve-base AS torchserve-cpu

 USER model-server
 WORKDIR /home/model-server

-COPY --chown=model-server --from=compile /home/venv /home/venv
-COPY --chown=model-server --chmod=755 --from=compile /usr/local/bin/dockerd-entrypoint.sh /usr/local/bin/dockerd-entrypoint.sh
+COPY --chown=model-server --from=compile-cpu /home/venv /home/venv
+COPY --chown=model-server --chmod=755 --from=compile-cpu /usr/local/bin/dockerd-entrypoint.sh /usr/local/bin/dockerd-entrypoint.sh
 COPY --chown=model-server serving/config.properties /home/model-server/config.properties

 ENV PATH="/home/venv/bin:$PATH"

@@ -270,3 +285,64 @@ EXPOSE 8080 8081 8082 7070 7071

 ENTRYPOINT ["/usr/local/bin/dockerd-entrypoint.sh"]
 CMD ["serve"]
+
+FROM torchserve-base AS torchserve-xpu
+
+RUN apt-get update && \
+    apt-get install -y --no-install-recommends --fix-missing \
+    gnupg2 \
+    gpg-agent \
+    rsync && \
+    apt-get clean && \
+    rm -rf /var/lib/apt/lists/*
+
+RUN wget -qO - https://repositories.intel.com/gpu/intel-graphics.key | \
+    gpg --dearmor --yes --output /usr/share/keyrings/intel-graphics.gpg
+RUN echo "deb [arch=amd64 signed-by=/usr/share/keyrings/intel-graphics.gpg] https://repositories.intel.com/gpu/ubuntu jammy unified" | \
+    tee /etc/apt/sources.list.d/intel-gpu-jammy.list
+
+RUN apt-get update && \
+    apt-get install -y --no-install-recommends \
+    jq \
+    curl \
+    libnl-genl-3-200 \
+    intel-gsc \
+    libdrm2 \
+    intel-metrics-discovery \
+    intel-metrics-library && \
+    apt-get autoremove -y && \
+    rm -rf /var/lib/apt/lists/*
+
+ARG XPU_SMI_VERSION
+
+ARG API_URL=https://api.github.com/repos/intel/xpumanager/releases/tags/V${XPU_SMI_VERSION}
+
+RUN wget -q --header="Accept: application/vnd.github.v3+json" --header="User-Agent: MyClient/1.0.0" -O - "$API_URL" | tee /tmp/asset_data.txt && \
+    wget -q --no-check-certificate "$(jq -r '.assets[] | select(.name | test("^xpu-smi.*u22\\.04_amd64\\.deb$")) | .browser_download_url' < /tmp/asset_data.txt)" && \
+    ldconfig && dpkg -i --force-all -- *.deb && \
+    rm -rf -- *.deb /etc/apt/sources.list.d/intel-gpu-jammy.list /etc/apt/sources.list.d/oneAPI.list /tmp/asset_data.txt
+
+ARG GID=109
+
+RUN groupadd -g ${GID} render &&\
+    usermod -aG video,render model-server
+
+USER model-server
+
+WORKDIR /home/model-server
+
+RUN wget --progress=dot:giga https://raw.githubusercontent.com/pytorch/serve/master/examples/intel_extension_for_pytorch/intel_gpu_metric_collector.py && \
+    wget --progress=dot:giga https://raw.githubusercontent.com/pytorch/serve/master/examples/intel_extension_for_pytorch/intel_gpu.py
+
+COPY --chown=model-server --from=compile-xpu /home/venv /home/venv
+COPY --chown=model-server --chmod=755 --from=compile-xpu /usr/local/bin/dockerd-entrypoint.sh /usr/local/bin/dockerd-entrypoint.sh
+COPY --chown=model-server serving/config-xpu.properties /home/model-server/config.properties
+
+ENV PATH="/home/venv/bin:$PATH"
+ENV TEMP=/home/model-server/tmp
+
+# 8080/8081/8082 REST and 7070/7071 gRPC
+EXPOSE 8080 8081 8082 7070 7071
+
+ENTRYPOINT ["/usr/local/bin/dockerd-entrypoint.sh"]
+CMD ["serve"]
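
The xpu-smi install step above resolves a release asset by querying the GitHub releases API and filtering with `jq`. The same selection logic can be sketched in Python against a fabricated payload (the function name and the example URLs below are illustrative, not part of the project):

```python
import json
import re


def find_asset_url(release_json, name_pattern):
    """Mirror the Dockerfile's jq filter: return the download URL of the
    first release asset whose name matches name_pattern, else None."""
    release = json.loads(release_json)
    for asset in release.get("assets", []):
        if re.search(name_pattern, asset["name"]):
            return asset["browser_download_url"]
    return None


# Fabricated stand-in for the GitHub API response.
payload = json.dumps({
    "assets": [
        {"name": "xpu-smi_1.2.38_20240101_u22.04_amd64.deb",
         "browser_download_url": "https://example.com/xpu-smi_u22.04_amd64.deb"},
        {"name": "xpumd_1.2.38_el8_x86_64.rpm",
         "browser_download_url": "https://example.com/xpumd.rpm"},
    ]
})

print(find_asset_url(payload, r"^xpu-smi.*u22\.04_amd64\.deb$"))
# -> https://example.com/xpu-smi_u22.04_amd64.deb
```

Anchoring the pattern with `^` and `$`, as the Dockerfile does, avoids accidentally matching a differently packaged asset (e.g. an el8 rpm) when the release contains builds for several distributions.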

pytorch/README.md

+1 -3

@@ -241,8 +241,6 @@ Additionally, if you have a [DeepSpeed* configuration](https://www.deepspeed.ai/

 ---

-#### Hugging Face Generative AI Container
-
 The image below is an extension of the IPEX Multi-Node Container designed to run Hugging Face Generative AI scripts. The container has the typical installations needed to run and fine tune PyTorch generative text models from Hugging Face. It can be used to run multinode jobs using the same instructions from the [IPEX Multi-Node container](#setup-and-run-ipex-multi-node-container).

 | Tag(s) | Pytorch | IPEX | oneCCL | HF Transformers | Dockerfile |

@@ -324,7 +322,7 @@ The images below additionally include [Jupyter Notebook](https://jupyter.org/) s

 ## Build from Source

-To build the images from source, clone the [Intel® AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
+To build the images from source, clone the [AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:

 ```bash
 cd pytorch

pytorch/docker-compose.yaml

+26 -5

@@ -122,7 +122,7 @@ services:
         org.opencontainers.base.name: "intel/python:3.10-core"
         org.opencontainers.image.title: "Intel® Extension for PyTorch XPU Base Image"
         org.opencontainers.image.version: ${IPEX_VERSION:-2.1.40}-xpu-${PACKAGE_OPTION:-pip}-base
-      target: ipex-xpu-base
+      target: ipex-xpu-base-wheels
     command: >
       python -c "import torch;print(torch.device('xpu'));import
       intel_extension_for_pytorch as

@@ -156,7 +156,7 @@ services:
     image: ${REGISTRY}/${REPO}:b-${GITHUB_RUN_NUMBER:-0}-${BASE_IMAGE_NAME:-ubuntu}-${BASE_IMAGE_TAG:-22.04}-${PACKAGE_OPTION:-pip}-py${PYTHON_VERSION:-3.10}-ipex-${IPEX_VERSION:-2.1.40xpu}-xpu-jupyter
     ports:
       - 8888:8888
-  torchserve:
+  torchserve-cpu:
     build:
       args:
         PACKAGE_OPTION: pip

@@ -165,22 +165,43 @@ services:
         dependency.apt.openjdk-17-jdk: true
         dependency.idp: false
         dependency.python.ipex: requirements.txt
-        dependency.python.pip: torchserve-requirements.txt
+        dependency.python.pip: serving/torchserve-requirements.txt
         docs: serving
         org.opencontainers.base.name: "intel/python:3.10-core"
         org.opencontainers.image.title: "Intel® Extension for PyTorch Serving Image"
         org.opencontainers.image.version: ${IPEX_VERSION:-2.4.0}-serving-cpu
-      target: torchserve
+      target: torchserve-cpu
     command: torchserve --version
     entrypoint: ""
     extends: ipex-base
-    image: ${REGISTRY}/${REPO}:b-${GITHUB_RUN_NUMBER:-0}-${BASE_IMAGE_NAME:-ubuntu}-${BASE_IMAGE_TAG:-22.04}-py${PYTHON_VERSION:-3.10}-torchserve
+    image: ${REGISTRY}/${REPO}:b-${GITHUB_RUN_NUMBER:-0}-${BASE_IMAGE_NAME:-ubuntu}-${BASE_IMAGE_TAG:-22.04}-py${PYTHON_VERSION:-3.10}-torchserve-cpu
     ports:
       - 8080:8080
       - 8081:8081
       - 8082:8082
      - 7070:7070
       - 7071:7071
+  torchserve-xpu:
+    build:
+      args:
+        PACKAGE_OPTION: pip
+        XPU_SMI_VERSION: ${XPU_SMI_VERSION:-1.2.38}
+        TORCHSERVE_BASE: ipex-xpu-base
+      labels:
+        dependency.apt.numactl: true
+        dependency.apt.openjdk-17-jdk: true
+        dependency.apt.xpu-smi: ${XPU_SMI_VERSION:-1.2.38}
+        dependency.idp: false
+        dependency.python.pip: serving/torchserve-xpu-requirements.txt
+        docs: serving
+        org.opencontainers.base.name: "intel/python:3.10-core"
+        org.opencontainers.image.title: "Intel® Extension for PyTorch XPU Serving Image"
+        org.opencontainers.image.version: ${IPEX_VERSION:-2.1.40}-serving-xpu
+      target: torchserve-xpu
+    command: torchserve --version
+    entrypoint: ""
+    extends: xpu
+    image: ${REGISTRY}/${REPO}:b-${GITHUB_RUN_NUMBER:-0}-${BASE_IMAGE_NAME:-ubuntu}-${BASE_IMAGE_TAG:-22.04}-py${PYTHON_VERSION:-3.10}-torchserve-xpu
   hf-genai:
     build:
       args:
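
The new torchserve-xpu service reuses the shared torchserve-base Dockerfile stage by overriding the `TORCHSERVE_BASE` build argument (which defaults to `${PYTHON_BASE}` in the Dockerfile). A stripped-down sketch of that pattern, with hypothetical service names and a placeholder context, not the project's actual compose file:

```yaml
# Minimal illustration: selecting a multi-stage base image via a build arg.
services:
  serve-cpu:
    build:
      context: .
      # TORCHSERVE_BASE keeps its Dockerfile default (the CPU Python base).
      target: torchserve-cpu
  serve-xpu:
    build:
      context: .
      args:
        # Retargets the Dockerfile's `FROM ${TORCHSERVE_BASE}` stage
        # at the XPU base instead of the default.
        TORCHSERVE_BASE: ipex-xpu-base
      target: torchserve-xpu
```

This keeps one Dockerfile serving both variants: the CPU and XPU images diverge only in which base stage feeds torchserve-base and which `compile-*` stage supplies the virtual environment.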
