I'm excited to announce that huggingface_hub's InferenceClient now supports OpenAI's Python client syntax! For developers integrating AI into their codebases, this means you can switch to open-source models with just three lines of code. Here's a quick example of how easy it is.
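A minimal sketch of that three-line switch (the model name is an illustrative assumption, and the live request is left commented out so the snippet runs without a token):

```python
# Sketch of the three-line switch, assuming huggingface_hub >= 0.24.

# Before, with OpenAI's client:
#   from openai import OpenAI
#   client = OpenAI(api_key="...")

# After, with Hugging Face (the model name below is just an example):
from huggingface_hub import InferenceClient

client = InferenceClient("meta-llama/Meta-Llama-3-8B-Instruct")

# The call syntax itself is unchanged -- uncomment to run a live request:
# response = client.chat.completions.create(
#     messages=[{"role": "user", "content": "Hello!"}],
#     max_tokens=100,
# )
# print(response.choices[0].message.content)
```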
Why use the InferenceClient?
- Seamless transition: keep your existing code structure while leveraging LLMs hosted on the Hugging Face Hub.
- Direct integration: easily launch a model to run inference using our Inference Endpoints service.
- Stay updated: always be in sync with the latest Text Generation Inference (TGI) updates.
More details at https://huggingface.co/docs/huggingface_hub/main/en/guides/inference#openai-compatibility