Haghiri PRO

Muhammadreza

AI & ML interests

None yet

Muhammadreza's activity

replied to their post 1 day ago

Well, first, we're offering our own models (our open models are currently available at https://huggingface.co/mann-e), but for proprietary models, you should pay a visit to our website.
Also, we offer crypto payments, which are a little more niche compared to fiat ones.

posted an update 1 day ago
Mann-E's new platform is up and running.

You can access our platform at https://mann-e.com. We're still working on it and ironing out bugs, and we're also trying to add a guest session that lets you generate images without an account.

What do you think?
  • 3 replies
reacted to AdinaY's post with 🔥 2 days ago
Let’s dive into the exciting releases from the Chinese community last week 🔥🚀
More details 👉 https://huggingface.co/zh-ai-community

Code model:
✨Qwen 2.5 Coder by Alibaba Qwen
Qwen/qwen25-coder-66eaa22e6f99801bf65b0c2f
✨OpenCoder by InflyAI - Fully open code model🙌
infly/opencoder-672cec44bbb86c39910fb55e

Image model:
✨Hunyuan3D-1.0 by Tencent
tencent/Hunyuan3D-1

MLLM:
✨JanusFlow by DeepSeek
deepseek-ai/JanusFlow-1.3B
✨Mono-InternVL-2B by OpenGVLab
OpenGVLab/Mono-InternVL-2B

Video model:
✨CogVideoX 1.5 by ChatGLM
THUDM/CogVideoX1.5-5B-SAT

Audio model:
✨Fish Agent by FishAudio
fishaudio/fish-agent-v0.1-3b

Dataset:
✨OPI dataset by BAAIBeijing
BAAI/OPI
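
All of the IDs above are ordinary Hub repositories, so any of them can be pulled locally the same way. A minimal sketch using huggingface_hub, assuming you only want the files on disk (loading each model still needs its own library such as transformers or diffusers); the JanusFlow repo is just one example picked from the list:

from huggingface_hub import snapshot_download

# download one of the repos listed above into the local cache and print its path
local_dir = snapshot_download(repo_id="deepseek-ai/JanusFlow-1.3B")
print(local_dir)
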
replied to their post 3 days ago

Well, their user account is gone now.

reacted to not-lain's post with 🔥 3 days ago
Ever wondered how you can make an API call to a visual-question-answering model without sending an image URL? 👀

You can do that by converting your local image to base64 and sending it to the API.

Recently I made some changes to my library "loadimg" that make converting images to base64 a breeze.
🔗 https://github.com/not-lain/loadimg

API request example 🛠️:
from loadimg import load_img
from huggingface_hub import InferenceClient

# load a local path, URL, PIL image, or numpy array and convert it to base64
my_b64_img = load_img(imgPath_url_pillow_or_numpy, output_type="base64")

client = InferenceClient(api_key="hf_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx")

messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "Describe this image in one sentence."
            },
            {
                "type": "image_url",
                "image_url": {
                    "url": my_b64_img  # base64 allows using images without uploading them to the web
                }
            }
        ]
    }
]

stream = client.chat.completions.create(
    model="meta-llama/Llama-3.2-11B-Vision-Instruct",
    messages=messages,
    max_tokens=500,
    stream=True
)

for chunk in stream:
    print(chunk.choices[0].delta.content, end="")
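
For reference, the same conversion can also be done with nothing but the standard library; a minimal sketch, where the helper name, file path, and MIME type are illustrative assumptions rather than part of loadimg:

import base64

def to_base64_data_uri(path, mime="image/png"):
    # read the image bytes and wrap them in a data URI, the inline-image format
    # commonly accepted in "image_url" fields instead of a public URL
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")
    return f"data:{mime};base64,{encoded}"

my_b64_img = to_base64_data_uri("cat.png")  # hypothetical local file
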
posted an update 4 days ago
Dear developers and AI enthusiasts, first, I apologize for posting content like this here.
Second, I don't want to start any kind of scandal or drama, since HF should stay a clean and drama-free community.
A few days ago, I made a post asking what grants or support you may have received. You can find it here (https://huggingface.co/posts/Muhammadreza/982596282424877). Then user @wisesniper showed up and linked to their own Space, which is apparently a cloud mining service.
Although I wanted it not to be true, I have to say this person is a scammer. I fell for the scam and spent around $10 (USD) in the form of TRX tokens.
Please be cautious. These days, cryptocurrencies are trending because most markets are in good shape, and a good market attracts scammers.
Regards.
  • 1 reply
replied to their post 7 days ago

Well, dear @wisesniper, honestly two years is a very long time for me. May I ask for a refund? :)

replied to their post 8 days ago

Well, I just purchased a miner. What's next?

posted an update 9 days ago
Dear AI developers, have you managed to get any grants or compute resources for your research or startup?
If yes, from where and how?
  • 8 replies
reacted to abhishek's post with 🔥 9 days ago
INTRODUCING Hugging Face AutoTrain Client 🔥
Fine-tuning models got even easier!!!!
Now you can fine-tune SOTA models on all compatible dataset-model pairs on the Hugging Face Hub using Python, on Hugging Face servers. Choose from a number of GPU flavors, millions of model and dataset pairs, and 10+ tasks 🤗

To try it, install autotrain-advanced using pip. If you want to skip its dependencies, install with --no-deps, but then you'll need to install some of the dependencies by hand.

"pip install autotrain-advanced"

Github repo: https://github.com/huggingface/autotrain-advanced
  • 6 replies
posted an update 14 days ago
posted an update 16 days ago
Hey guys.
This is my first post here on Hugging Face. I'm glad to be a part of this amazing community!
  • 2 replies