OVHcloud on Hugging Face Inference Providers 🔥

https://huggingface.co/blog/OVHcloud/inference-providers-ovhcloud (huggingface.co)
OVHcloud is now a supported Inference Provider on the Hugging Face Hub, expanding the options for serverless inference on popular open-weight models. The integration lets users run models such as Llama and Qwen3 directly from their Hub model pages or through the Python and JavaScript client SDKs. OVHcloud AI Endpoints is a fully managed, serverless platform running on European infrastructure with pay-per-token pricing and low latency. API calls can be made directly with an OVHcloud key, or routed through a Hugging Face account so that billing is handled by Hugging Face. The integration also supports structured outputs and function calling, and covers both text and image models.
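
For the Python side, here is a minimal sketch of what a routed call looks like, assuming the standard `huggingface_hub` Inference Providers pattern described in the post; the model ID and prompt are illustrative, and the `HF_TOKEN` environment variable is assumed to hold a Hugging Face token (an OVHcloud key could be supplied instead to bill OVHcloud directly):

```python
import os
from huggingface_hub import InferenceClient

# Route the request to OVHcloud AI Endpoints via Hugging Face Inference Providers.
client = InferenceClient(
    provider="ovhcloud",
    api_key=os.environ["HF_TOKEN"],  # HF token -> billed through the Hugging Face account
)

# Illustrative model ID; any model served by OVHcloud on the Hub should work.
completion = client.chat.completions.create(
    model="Qwen/Qwen3-32B",
    messages=[{"role": "user", "content": "Explain serverless inference in one sentence."}],
)

print(completion.choices[0].message.content)
```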
0 points•by hdt•3 days ago
