---
license: apache-2.0
---
# AWS Neuron optimum model cache

This repository contains cached Neuron compilation artifacts for the most popular models on the Hugging Face Hub.
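Compilation artifacts are looked up transparently by `optimum-neuron` when a model is exported on a Neuron-enabled (Inferentia/Trainium) instance. Below is a minimal sketch, assuming a recent `optimum-neuron` install; the model id and compilation shapes are placeholders and will only produce a cache hit if they match a cached configuration:

```python
from optimum.neuron import NeuronModelForCausalLM

# export=True triggers Neuron compilation; if a matching artifact exists
# in this cache repository, it is fetched instead of being recompiled.
model = NeuronModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",  # placeholder model id
    export=True,
    batch_size=1,          # placeholder compilation shapes: they must
    sequence_length=4096,  # match a cached configuration exactly
    num_cores=2,
    auto_cast_type="fp16",
)
```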
## Inference

### LLM models
For a list of the supported models and configurations, please refer to the inference cache [configuration files](https://huggingface.co/aws-neuron/optimum-neuron-cache/tree/main/inference-cache-config).
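To inspect one of these configuration files programmatically, here is a minimal sketch using `huggingface_hub`; the `llama.json` filename is an assumption for illustration, so browse the `inference-cache-config` folder on the Hub for the actual file names:

```python
import json

from huggingface_hub import hf_hub_download

# Download one of the inference cache configuration files from this repo.
# NOTE: "llama.json" is a hypothetical filename used for illustration.
path = hf_hub_download(
    repo_id="aws-neuron/optimum-neuron-cache",
    filename="inference-cache-config/llama.json",
)
with open(path) as f:
    configs = json.load(f)

# Each entry describes a cached model and its compilation parameters.
print(json.dumps(configs, indent=2))
```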