---
title: README
emoji: 🌍
colorFrom: gray
colorTo: purple
sdk: static
pinned: false
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer">
<img src="https://imgur.com/rVAgqMY.png" alt="PrunaAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</a>
</div>
<!-- header end -->
----
# 🌍 Join the Pruna AI community!
[![Twitter](https://img.shields.io/twitter/follow/PrunaAI?style=social)](https://twitter.com/PrunaAI)
[![GitHub](https://img.shields.io/github/stars/prunaai/pruna)](https://github.com/PrunaAI/pruna)
[![LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue)](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.com/invite/rskEr4BZJx)
[![Reddit](https://img.shields.io/reddit/subreddit-subscribers/PrunaAI?style=social)](https://www.reddit.com/r/PrunaAI/)
----
# πŸ’œ Simply make AI models faster, cheaper, smaller, greener!
[Pruna AI](https://www.pruna.ai/) makes AI models faster, cheaper, smaller, greener with the `pruna` package.
- It supports **various model types, including CV, NLP, audio, and graph models for predictive and generative AI**.
- It supports **various hardware, including GPU, CPU, and edge devices**.
- It supports **various compression algorithms, including quantization, pruning, distillation, caching, recovery, and compilation**, which can be **combined together**.
- You can either **experiment on your own** with smash/compression configurations or **let the smashing/compressing agent** find the optimal configuration **[Pro]**.
- You can **evaluate reliable quality and efficiency metrics** of your base vs. smashed/compressed models.
You can set it up in minutes and compress your first models in a few lines of code!
----
# ⏩ How to get started?
You can smash your own models by installing pruna with pip:
```bash
pip install pruna
```
or directly [from source](https://github.com/PrunaAI/pruna).
You can start with these simple notebooks to experience the efficiency gains:
| Use Case | Free Notebooks |
|------------------------------------------------------------|----------------------------------------------------------------|
| **3x Faster Stable Diffusion Models** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/sd_deepcache.ipynb) |
| **Making your LLMs 4x smaller** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/llms.ipynb) |
| **Smash your model with a CPU only** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/cv_cpu.ipynb) |
| **Transcribe 2 hours of audio in less than 2 minutes with Whisper** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_tutorial.ipynb) |
| **100% faster Whisper Transcription** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_whisper.ipynb) |
| **Run your Flux model without an A100** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/flux_small.ipynb) |
| **x2 smaller Sana in action** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/sana_diffusers_int8.ipynb) |
For more details about installation, free tutorials, and Pruna Pro tutorials, check the [Pruna AI documentation](https://docs.pruna.ai/).
----