Quoted:
But even if that is not the case, our current AI service implementation is not written in a way that caters for generic models served by Ollama. Unfortunately, changes are required to cater for the behaviour of different models.
Message:
Quick question: say I have an account on AppFlowy Cloud, but I compile my own binary of AppFlowy with local LLM support. Does using the free tier while having local LLM access break any TOS?
Timestamp:
2025-01-22T13:10:19.281000+00:00
Attachment:
Discord Message ID:
1331611909094641778