
Available DLCs on AWS

Below you can find a listing of all the Deep Learning Containers (DLCs) available on AWS.

Containers are provided for each supported combination of use case (training, inference), accelerator type (CPU, GPU, Neuron), and framework (PyTorch, TGI, TEI).

Training

PyTorch Training DLC: For training, our DLCs are available for PyTorch via Transformers. They include support for training on GPUs and AWS AI chips with libraries such as TRL, Sentence Transformers, and Diffusers.

You can also keep track of the latest PyTorch Training DLC releases here.

Container URI Accelerator
763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-training:2.5.1-transformers4.49.0-gpu-py311-cu124-ubuntu22.04 GPU
763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-training-neuronx:2.1.2-transformers4.48.1-neuronx-py310-sdk2.20.0-ubuntu20.04 Neuron
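
As a minimal sketch of how a training DLC is typically used, the snippet below passes the GPU training DLC URI from the table above to the SageMaker HuggingFace estimator. The role ARN, script name, S3 path, and hyperparameters are placeholder assumptions you would replace with your own values.

from sagemaker.huggingface import HuggingFace

# All names below are placeholders: replace the role ARN, script, and S3 path.
huggingface_estimator = HuggingFace(
    entry_point="train.py",          # your training script (hypothetical)
    source_dir="./scripts",          # directory containing the script
    instance_type="ml.g5.2xlarge",
    instance_count=1,
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # hypothetical
    # GPU training DLC from the table above
    image_uri="763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-training:2.5.1-transformers4.49.0-gpu-py311-cu124-ubuntu22.04",
    hyperparameters={"epochs": 3, "per_device_train_batch_size": 8},
)
huggingface_estimator.fit({"train": "s3://my-bucket/train"})  # hypothetical S3 path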

Inference

PyTorch Inference DLC

For inference, we have a general-purpose PyTorch inference DLC for serving models trained with any of the frameworks mentioned above on CPU, GPU, and AWS AI chips.

You can also keep track of the latest PyTorch Inference DLC releases here.

Container URI Accelerator
763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-inference:2.6.0-transformers4.49.0-cpu-py312-ubuntu22.04 CPU
763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-inference:2.6.0-transformers4.49.0-gpu-py312-cu124-ubuntu22.04 GPU
763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-inference-neuronx:2.1.2-transformers4.43.2-neuronx-py310-sdk2.20.0-ubuntu20.04 Neuron
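
As a minimal sketch, the snippet below deploys a Hub model on the GPU inference DLC from the table above. The role ARN, model id, and task are illustrative assumptions.

from sagemaker.huggingface import HuggingFaceModel

# Placeholder role ARN and example model/task; substitute your own values.
model = HuggingFaceModel(
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # hypothetical
    # GPU inference DLC from the table above
    image_uri="763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-inference:2.6.0-transformers4.49.0-gpu-py312-cu124-ubuntu22.04",
    env={
        "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # example model
        "HF_TASK": "text-classification",                                  # example task
    },
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.xlarge")
print(predictor.predict({"inputs": "I love using Hugging Face DLCs on SageMaker!"}))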

LLM TGI

There is also an LLM Text Generation Inference (TGI) DLC for high-performance text generation with LLMs on GPUs and AWS AI chips.

You can also keep track of the latest LLM TGI DLC releases here.

Container URI Accelerator
763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-tgi-inference:2.6.0-tgi3.2.3-gpu-py311-cu124-ubuntu22.04 GPU
763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-tgi-inference:2.1.2-optimum0.0.28-neuronx-py310-ubuntu22.04 Neuron
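
A sketch of deploying the TGI GPU DLC from the table above follows; the role ARN and model id are illustrative assumptions, and SM_NUM_GPUS must match the GPUs on the chosen instance type.

from sagemaker.huggingface import HuggingFaceModel

# Placeholder role ARN and example model id; substitute your own values.
model = HuggingFaceModel(
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # hypothetical
    # TGI GPU DLC from the table above
    image_uri="763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-tgi-inference:2.6.0-tgi3.2.3-gpu-py311-cu124-ubuntu22.04",
    env={
        "HF_MODEL_ID": "HuggingFaceH4/zephyr-7b-beta",  # example model
        "SM_NUM_GPUS": "1",                             # GPUs available on the instance
    },
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.2xlarge")
print(predictor.predict({"inputs": "What is Deep Learning?"}))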

Text Embedding Inference

Finally, there is a Text Embeddings Inference (TEI) DLC for high-performance serving of embedding models on CPU and GPU.

Container URI Accelerator
683313688378.dkr.ecr.us-east-1.amazonaws.com/tei-cpu:2.0.1-tei1.2.3-cpu-py310-ubuntu22.04 CPU
683313688378.dkr.ecr.us-east-1.amazonaws.com/tei:2.0.1-tei1.4.0-gpu-py310-cu122-ubuntu22.04 GPU
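
Deployment works the same way for TEI; below is a minimal sketch using the TEI GPU DLC from the table above, with a placeholder role ARN and an example embedding model.

from sagemaker.huggingface import HuggingFaceModel

# Placeholder role ARN; the embedding model id is an example choice.
model = HuggingFaceModel(
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # hypothetical
    # TEI GPU DLC from the table above
    image_uri="683313688378.dkr.ecr.us-east-1.amazonaws.com/tei:2.0.1-tei1.4.0-gpu-py310-cu122-ubuntu22.04",
    env={"HF_MODEL_ID": "BAAI/bge-base-en-v1.5"},  # example embedding model
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.xlarge")
print(predictor.predict({"inputs": "What is Deep Learning?"}))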

FAQ

How to choose the right inference container for my use case?

[Figure: inference DLC decision tree]

Note: See here for the list of supported tasks in the inference toolkit.

Note: Browse the Hub to see if your model is tagged “text-generation-inference” or “text-embeddings-inference”.
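
You can also check the tags programmatically with huggingface_hub; the model id below is just an example:

from huggingface_hub import model_info

# Example model id; replace with the model you want to deploy.
tags = model_info("openai-community/gpt2").tags
print("text-generation-inference" in tags)   # candidate for the TGI DLC?
print("text-embeddings-inference" in tags)   # candidate for the TEI DLC?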

How to find the URI of my container?

The URI is built from an AWS account ID and an AWS region, and both values depend on your use case. Let’s say you want to use the training DLC for GPUs in us-east-1; you then need the following two values, combined as in the snippet after this list:

  • dlc-aws-account-id: The AWS account ID of the account that owns the ECR repository. You can find the list of account IDs here.
  • region: The AWS region where you want to use it.
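
For example, putting the two values together for the GPU training DLC (values taken from the table above):

# Build the image URI from its parts; the account ID and region must match.
dlc_aws_account_id = "763104351884"
region = "us-east-1"
repository_and_tag = "huggingface-pytorch-training:2.5.1-transformers4.49.0-gpu-py311-cu124-ubuntu22.04"

uri = f"{dlc_aws_account_id}.dkr.ecr.{region}.amazonaws.com/{repository_and_tag}"
print(uri)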

How to find the URI of my container more easily?

The Python SageMaker SDK utility functions are not always up to date, but they are much simpler than reconstructing the image URI yourself.

from sagemaker.huggingface import get_huggingface_llm_image_uri

# Retrieve the default image URI for each backend in the current region
print(f"TGI GPU: {get_huggingface_llm_image_uri('huggingface')}")
print(f"TEI GPU: {get_huggingface_llm_image_uri('huggingface-tei')}")
print(f"TEI CPU: {get_huggingface_llm_image_uri('huggingface-tei-cpu')}")
print(f"TGI Neuron: {get_huggingface_llm_image_uri('huggingface-neuronx')}")

For the PyTorch Training and PyTorch Inference DLCs, there is no such utility.
