Tiny Agents

Recent Activity

Wauplin

Create agent.json
#6 opened 24 days ago by hmnth1
Wauplin

Upload agent.json
#5 opened about 1 month ago by ogundipe72

Create trending/agent.json

#3 opened about 1 month ago by evalstate

Create trending/PROMPT.md

#4 opened about 1 month ago by evalstate
victor
posted an update 30 days ago
Open Source Avengers, Assemble! Ask an expert AI agent team to solve complex problems together 🔥

Consilium brings together multiple agents that debate and use live research (web, arXiv, SEC) to reach a consensus. You set the strategy, they find the answer.

Credit to @azettl for this awesome demo: Agents-MCP-Hackathon/consilium_mcp
celinah
posted an update about 2 months ago
✨ Today we're releasing Tiny Agents in Python: an MCP-powered Agent in ~70 lines of code 🐍

Inspired by Tiny Agents in JS from @julien-c, we ported the idea to Python and integrated it directly into huggingface_hub, with a built-in MCP Client and a Tiny Agents CLI.

TL;DR: With MCP (Model Context Protocol), you can expose tools like web search or image generation and connect them directly to LLMs. It's simple, and surprisingly powerful.

pip install "huggingface_hub[mcp]>=0.32.0"

We wrote a blog post where we show how to run Tiny Agents, and dive deeper into how they work and how to build your own.
👉 https://huggingface.co/blog/python-tiny-agents
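The pull requests above add agent.json files that define each agent. A minimal sketch of such a config: the field names (model, provider, servers) follow the shape described in the blog post, and the concrete values are placeholders, not taken from this repo:

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  ]
}
```

The agent loads the model via an Inference Provider and connects each listed MCP server as a source of tools.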

julien-c
posted an update 3 months ago
BOOOOM: Today I'm dropping TINY AGENTS

the 50-lines-of-code Agent in Javascript 🔥

I spent the last few weeks working on this, so I hope you will like it.

I've been diving into MCP (Model Context Protocol) to understand what the hype was all about.

It is fairly simple, but still quite powerful: MCP is a standard API to expose sets of Tools that can be hooked to LLMs.

But while doing that came my second realization:

Once you have an MCP Client, an Agent is literally just a while loop on top of it. 🤯

➡️ read it exclusively on the official HF blog: https://huggingface.co/blog/tiny-agents
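The "while loop on top of an MCP client" idea can be sketched in a few lines of Python. This is an illustrative toy, not the actual Tiny Agents implementation: the LLM and the MCP client are stubs, but the control flow is the same one the post describes: ask the model, run any tool it requests, feed the result back, stop when it answers without one.

```python
# Toy illustration of "an Agent is just a while loop on top of an MCP client".
# The LLM and the MCP client are stubs; only the control flow mirrors the idea.

def fake_llm(messages):
    """Stand-in model: requests a tool once, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "add", "args": {"a": 2, "b": 3}}}
    return {"content": "The sum is 5."}

class FakeMCPClient:
    """Stand-in MCP client exposing a single 'add' tool."""
    def call_tool(self, name, args):
        tools = {"add": lambda a, b: a + b}
        return tools[name](**args)

def run_agent(prompt):
    client = FakeMCPClient()
    messages = [{"role": "user", "content": prompt}]
    while True:  # the entire agent
        reply = fake_llm(messages)
        if "tool_call" not in reply:  # no tool requested: final answer
            return reply["content"]
        call = reply["tool_call"]
        result = client.call_tool(call["name"], call["args"])
        messages.append({"role": "tool", "content": str(result)})

print(run_agent("What is 2 + 3?"))  # prints: The sum is 5.
```

Swap the stubs for a real chat-completion call and an MCP session and you have the agent from the blog post.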
victor
posted an update 3 months ago
DIA TTS is just amazing - please share your funniest gens (here is mine) 😂
nari-labs/Dia-1.6B
Wauplin
posted an update 3 months ago
‼️ huggingface_hub's v0.30.0 is out with our biggest update of the past two years!

Full release notes: https://github.com/huggingface/huggingface_hub/releases/tag/v0.30.0

🚀 Ready. Xet. Go!

Xet is a groundbreaking new protocol for storing large objects in Git repositories, designed to replace Git LFS. Unlike LFS, which deduplicates whole files, Xet operates at the chunk level, making it a game-changer for AI builders collaborating on massive models and datasets. Our Python integration is powered by [xet-core](https://github.com/huggingface/xet-core), a Rust-based package that handles all the low-level details.
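Chunk-level deduplication can be illustrated with a toy content-addressed store. This sketch uses fixed-size chunks purely for simplicity; it is not how xet-core actually chunks data (xet-core uses content-defined chunking, which also survives insertions and deletions), but it shows why a one-byte edit to a large file re-uploads only one chunk instead of the whole file:

```python
import hashlib

CHUNK = 4  # unrealistically small chunk size, for illustration only

def chunks(data: bytes):
    """Split data into fixed-size chunks."""
    return [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]

def store(data: bytes, cas: dict) -> int:
    """'Upload' data into a content-addressed store, keeping only chunks
    not already present. Returns how many new chunks were transferred."""
    new = 0
    for c in chunks(data):
        h = hashlib.sha256(c).hexdigest()
        if h not in cas:
            cas[h] = c
            new += 1
    return new

cas = {}
v1 = b"weights-v1: AAAA BBBB CCCC"
v2 = b"weights-v2: AAAA BBBB CCCC"  # one byte changed

print(store(v1, cas))  # 7 chunks uploaded
print(store(v2, cas))  # only 1 new chunk: the rest is deduplicated
```

With file-level dedup (the LFS model), v2 would be stored in full; at the chunk level, only the chunk containing the changed byte is new.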

You can start using Xet today by installing the optional dependency:

pip install -U "huggingface_hub[hf_xet]"


With that, you can seamlessly download files from Xet-enabled repositories! And don't worry: everything remains fully backward-compatible if you're not ready to upgrade yet.

Blog post: https://huggingface.co/blog/xet-on-the-hub
Docs: https://huggingface.co/docs/hub/en/storage-backends#xet


⚡ Inference Providers

- We're thrilled to introduce Cerebras and Cohere as official inference providers! This expansion strengthens the Hub as the go-to entry point for running inference on open-weight models.

- Novita is now our 3rd provider to support the text-to-video task, after Fal.ai and Replicate.

- Centralized billing: manage your budget and set team-wide spending limits for Inference Providers! Available to all Enterprise Hub organizations.

from huggingface_hub import InferenceClient

# bill_to charges the request to an organization instead of your user account
client = InferenceClient(provider="fal-ai", bill_to="my-cool-company")
image = client.text_to_image(
    "A majestic lion in a fantasy forest",
    model="black-forest-labs/FLUX.1-schnell",
)
image.save("lion.png")


- No more timeouts when generating videos, thanks to async calls. Available right now for Fal.ai, and we expect more providers to adopt the same structure very soon!
julien-c
posted an update 4 months ago
Important notice 🚨

For Inference Providers who have built support for our Billing API (currently: Fal, Novita, HF-Inference, with more coming soon), we've started enabling Pay-as-you-go (PAYG).

What this means is that you can use those Inference Providers beyond the free included credits, and they're charged to your HF account.

You can see it on this view: any provider that does not have a "Billing disabled" badge is PAYG-compatible.
victor
posted an update 5 months ago
Hey everyone, we've given the https://hf.co/spaces page a fresh update!

Smart Search: Now just type what you want to do, like "make a viral meme" or "generate music", and our search gets it.

New Categories: Check out the cool new filter bar with icons to help you pick a category fast.

Redesigned Space Cards: Reworked a bit to really show off the app descriptions, so you know what each Space does at a glance.

Random Prompt: Need ideas? Hit the dice button for a burst of inspiration.

We'd love to hear what you think, so please drop us some feedback!
victor
posted an update 5 months ago
Finally, an open-source AI that turns your lyrics into full songs is here: meet YuE! Unlike other tools that only create short clips, YuE can make entire songs (up to 5 minutes) with vocals, melody, and instruments all working together. Letsss go!

m-a-p/YuE-s1-7B-anneal-en-cot
celinah
posted an update 7 months ago
🚀 We've just dropped a new release, v0.27.0, of the huggingface_hub Python library!

This release includes:
- 💾 New torch model loading utilities in the serialization module, providing a standardized way to save and load torch models with built-in support for sharding and safe serialization.
- 📦 Tooling for something exciting: if you like single-file formats for models like GGUF, you'll love what we're cooking up 👀 More coming soon!
- 🛠️ Loads of quality-of-life improvements and bug fixes!

Release notes and full details here 👇
Wauplin/huggingface_hub#10

$ pip install -U huggingface_hub