Btw, found a temporary solution for OAI-compatible inference server users: <https://github.com/Embedded-Nature/ollama-proxy> This thing translates Ollama API calls to OAI and the other way around, with the OAI side being the model backend.


Timestamp:
2025-03-28T21:44:16.279000+00:00

Attachment:

Discord Message ID:
1355296460102701057
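For context, the kind of translation such a proxy performs can be sketched as below. This is not the actual ollama-proxy code, just a minimal illustration assuming Ollama's `/api/chat` request schema (sampling parameters nested under `options`, e.g. `num_predict` for the token limit) and OpenAI's `/v1/chat/completions` schema (top-level `temperature`, `top_p`, `max_tokens`):

```python
def ollama_to_openai(body: dict) -> dict:
    """Translate an Ollama /api/chat request body into an
    OpenAI /v1/chat/completions request body (minimal sketch,
    not the proxy's real implementation)."""
    options = body.get("options", {})
    out = {
        "model": body["model"],
        "messages": body["messages"],
        "stream": body.get("stream", False),
    }
    # Ollama nests sampling params under "options"; OpenAI keeps them top-level.
    if "temperature" in options:
        out["temperature"] = options["temperature"]
    if "top_p" in options:
        out["top_p"] = options["top_p"]
    # Ollama's num_predict roughly corresponds to OpenAI's max_tokens.
    if "num_predict" in options:
        out["max_tokens"] = options["num_predict"]
    return out

req = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "hi"}],
    "options": {"temperature": 0.2, "num_predict": 64},
}
print(ollama_to_openai(req))
```

A full proxy would also translate the response (and streaming chunks) back in the other direction, which is the part that lets Ollama-only clients talk to an OAI-compatible server.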