CONTRIBUTING.md (+2 -2)
@@ -1,6 +1,6 @@
# Contributing
-Thank you for considering contributing to Intel® AI Containers! We welcome your help to make this project better. Contributing to an open source project can be a daunting task, but the Intel AI Containers team is here to help you through the process. If at any point in this process you feel out of your depth or confused by our processes, please don't hesitate to reach out to a maintainer or file an [issue](https://github.com/intel/ai-containers/issues).
+Thank you for considering contributing to AI Containers! We welcome your help to make this project better. Contributing to an open source project can be a daunting task, but the Intel AI Containers team is here to help you through the process. If at any point in this process you feel out of your depth or confused by our processes, please don't hesitate to reach out to a maintainer or file an [issue](https://github.com/intel/ai-containers/issues).
## Getting Started
@@ -138,4 +138,4 @@ commit automatically with `git commit -s`.
## License
-Intel® AI Containers is licensed under the terms in [LICENSE](./LICENSE). By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
+AI Containers is licensed under the terms in [LICENSE](./LICENSE). By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.
-The maintainers of Intel® AI Containers use Azure to store containers, but an open source container registry like [harbor](https://github.com/goharbor/harbor) is preferred.
+The maintainers of AI Containers use Azure to store containers, but an open source container registry like [harbor](https://github.com/goharbor/harbor) is preferred.
> [!WARNING]
> You can optionally skip this step and use some placeholder values, however some container groups depend on other images and will pull from a registry that you have not defined and result in an error.
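
As a minimal sketch of what the registry choice above amounts to, assuming the compose files read the registry and repository from environment variables (the variable names `REGISTRY` and `REPO` and the values below are assumptions, not confirmed by this diff):

```bash
# Hypothetical setup -- names and values are placeholders, not taken from the repository.
# Point the build at whichever registry you use (Azure Container Registry, Harbor,
# or placeholder values per the warning above) and authenticate before pushing or pulling.
export REGISTRY=registry.example.com   # e.g. <name>.azurecr.io or a Harbor host
export REPO=ai-tools/ai-containers     # hypothetical repository path within the registry

docker login "${REGISTRY}"             # required before pushing or pulling images
```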
classical-ml/README.md (+1 -1)
@@ -63,7 +63,7 @@ The images below additionally include [Jupyter Notebook](https://jupyter.org/) s
## Build from Source
-To build the images from source, clone the [Intel® AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
+To build the images from source, clone the [AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
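
The exact build command sits outside this diff hunk, so the following is only a hedged sketch of the general flow, assuming the repository's docker compose setup and a compose service named `classical-ml` (the service name is an assumption):

```bash
# Sketch only -- check the compose file in the classical-ml directory for the real
# service name and any required environment files before building.
git clone https://github.com/intel/ai-containers.git
cd ai-containers/classical-ml
docker compose build classical-ml
```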
python/README.md (+1 -1)
@@ -15,7 +15,7 @@ The images below include variations for only the core packages in the [Intel® D
## Build from Source
-To build the images from source, clone the [Intel® AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
+To build the images from source, clone the [AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
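
As with the other image groups, the actual build command lives outside this hunk. A hypothetical post-build sanity check (the `idp-core` tag is a placeholder, not a name from the repository) is to confirm which BLAS backend NumPy inside the image was built against:

```bash
# Placeholder image tag; substitute the tag produced by your build.
docker run --rm "${REGISTRY}/${REPO}:idp-core" \
  python -c "import numpy; numpy.show_config()"
```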
pytorch/README.md (+1 -3)
@@ -241,8 +241,6 @@ Additionally, if you have a [DeepSpeed* configuration](https://www.deepspeed.ai/
---
-#### Hugging Face Generative AI Container
-
The image below is an extension of the IPEX Multi-Node Container designed to run Hugging Face Generative AI scripts. The container has the typical installations needed to run and fine tune PyTorch generative text models from Hugging Face. It can be used to run multinode jobs using the same instructions from the [IPEX Multi-Node container](#setup-and-run-ipex-multi-node-container).
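
As a rough illustration of that workflow, and not a command from the repository, a single-node launch of a Hugging Face fine-tuning script inside such a container might look like the sketch below; the image tag, script name, and flags are all placeholders:

```bash
# Hypothetical sketch -- IMAGE, finetune.py, and the flags are placeholders, not
# names from this repository. Multi-node runs follow the IPEX Multi-Node instructions.
IMAGE="${REGISTRY:?set REGISTRY}/${REPO:?set REPO}:torch-hf-genai"

docker run --rm -it \
  --shm-size 8g \
  -v "${PWD}:/workspace" \
  -w /workspace \
  "${IMAGE}" \
  python finetune.py --model_name_or_path gpt2 --output_dir ./output
```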
@@ -324,7 +322,7 @@ The images below additionally include [Jupyter Notebook](https://jupyter.org/) s
## Build from Source
-To build the images from source, clone the [Intel® AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
+To build the images from source, clone the [AI Containers](https://github.com/intel/ai-containers) repository, follow the main `README.md` file to setup your environment, and run the following command:
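
A quick, hypothetical smoke test after the build (the `ipex-base` tag is a placeholder) is to print the PyTorch and Intel® Extension for PyTorch* versions from inside the image:

```bash
# Placeholder image tag; substitute the tag your build produces.
docker run --rm "${REGISTRY}/${REPO}:ipex-base" \
  python -c "import torch, intel_extension_for_pytorch as ipex; print(torch.__version__, ipex.__version__)"
```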