Hugging Face
Jean Louis (JLouisBiz)
65 followers · 113 following
https://www.StartYourOwnGoldMine.com
YourOwnGoldMine
gnusupport
AI & ML interests
- LLM for sales, marketing, promotion
- LLM for Website Revision System
- increasing quality of communication with customers
- helping clients access information faster
- saving people from financial troubles
Recent Activity
Commented on an article · 3 days ago:
An Analysis of Chinese LLM Censorship and Bias with Qwen 2 Instruct
Reacted to merve's post with 👍 · 20 days ago:
Now it's possible to do RAG with any-to-any models 🔥 Learn how to search in a video dataset and generate answers using https://huggingface.co/Tevatron/OmniEmbed-v0.1-multivent (an all-modality retriever) and https://huggingface.co/Qwen/Qwen2.5-Omni-7B (an any-to-any model) in this notebook 🤝 https://huggingface.co/merve/smol-vision/blob/main/Any_to_Any_RAG.ipynb
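The retrieve-then-generate flow the post describes can be sketched generically. The following is a minimal illustration of the RAG pattern only, not the notebook's actual code: the embeddings here are random placeholders standing in for OmniEmbed outputs, and the final generation step (handled by Qwen2.5-Omni in the notebook) is only indicated in a comment.

```python
import numpy as np

def cosine_top_k(query_vec, doc_vecs, k=2):
    """Rank documents by cosine similarity to the query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                      # cosine similarity per document
    return np.argsort(scores)[::-1][:k]  # indices of the k best matches

# Placeholder embeddings: in the notebook these would come from the
# multimodal retriever (OmniEmbed) run over video/text documents.
rng = np.random.default_rng(0)
doc_vecs = rng.normal(size=(5, 8))
# A query embedding deliberately close to document 3.
query_vec = doc_vecs[3] + 0.01 * rng.normal(size=8)

top = cosine_top_k(query_vec, doc_vecs, k=2)
# The retrieved documents would then be passed as context to the
# any-to-any generator (Qwen2.5-Omni) to produce the final answer.
```

The two-stage split is the essence of RAG: a retriever narrows the dataset to the few most relevant items, and only those are handed to the (much more expensive) generative model.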
Reacted to fdaudens's post with 👍 · 26 days ago:
You might not have heard of Moonshot AI — but within 24 hours, their new model Kimi K2 shot to the top of Hugging Face's trending leaderboard. So… who are they, and why does it matter?

Had a lot of fun co-writing this blog post with @xianbao, with key insights translated from Chinese, to unpack how this startup built a model that outperforms GPT-4.1, Claude Opus, and DeepSeek V3 on several major benchmarks. 🧵

A few standout facts:
1. From zero to $3.3B in 18 months: Founded in March 2023, Moonshot is now backed by Alibaba, Tencent, Meituan, and HongShan.
2. A CEO who thinks from the end: Yang Zhilin (31) previously worked at Meta AI, Google Brain, and Carnegie Mellon. His vision? Nothing less than AGI — still a rare ambition among Chinese AI labs.
3. A trillion-parameter model that's surprisingly efficient: Kimi K2 uses a mixture-of-experts architecture (32B active params per inference) and dominates on coding/math benchmarks.
4. The secret weapon, the Muon optimizer: a new training method that doubles efficiency, cuts memory in half, and ran 15.5T tokens with zero failures. Big implications.

Most importantly, their move from closed to open source signals a broader shift in China's AI scene — following Baidu's pivot. But as Yang puts it: "Users are the only real leaderboard."

👇 Check out the full post to explore what Kimi K2 can do, how to try it, and why it matters for the future of open-source LLMs: https://huggingface.co/blog/fdaudens/moonshot-ai-kimi-k2-explained
Organizations

JLouisBiz's Spaces (1)
GNU LLM Integration 🌖 (Running)
Empowering GNU/Linux users with NLP