Quoted:
Well, LM Studio uses an OAI-style API, so as I pointed out further back, supporting custom OAI-compatible APIs means compatibility with most local inference servers, not just the one app. As far as I searched, only Ollama has a custom API, and it seems they originally just opted to combine the management API with inference instead of using an OAI-compatible API. They are now adding an OAI-compatible endpoint, since that is what most apps support.
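
For context, here's a minimal sketch of why OAI compatibility buys that breadth: the same /v1/chat/completions request works unchanged against any server exposing the endpoint. The base URLs below are the typical local defaults for LM Studio and for Ollama's OpenAI-compatible endpoint; treat them (and the model names) as assumptions and adjust to your setup.

```python
import json
import urllib.request

# Assumed default local base URLs; adjust for your own setup.
SERVERS = {
    "lmstudio": "http://localhost:1234/v1",  # LM Studio's usual default port
    "ollama": "http://localhost:11434/v1",   # Ollama's OpenAI-compatible endpoint
}

def chat(base_url: str, model: str, prompt: str) -> str:
    """Send one OpenAI-style chat completion request and return the reply text."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# The identical call works against any OAI-compatible server:
# chat(SERVERS["lmstudio"], "some-local-model", "Hello")
# chat(SERVERS["ollama"], "llama3", "Hello")
```

Ollama's native API, by contrast, mixes inference routes (POST /api/chat, /api/generate) with management routes (GET /api/tags to list models, POST /api/pull to download them) under one custom scheme, which is the combination the quoted message refers to.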

Message:
I’m curious why only Ollama has a custom API?

Timestamp:
2025-03-19T15:37:20.156000+00:00

Attachment:

Discord Message ID:
1351942626999406593