3
retoor
24h

I made an Ollama hub where you can share your Ollama resources with others to be used.

Long story, see https://ollama.molodetz.nl for information.

Users of the API can just use their default API clients! For security, people can only call the chat completions API on the shared resources. Content gets validated before being forwarded to your Ollama instance if you're a host.
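As a sketch of what calling the hub with a default client could look like (the `/api/chat` path follows Ollama's standard REST API, and the model name here is a placeholder, not confirmed by this post), a chat completions request might be built like this:

```python
import json

# Hypothetical request against the hub's chat completions endpoint.
# The path (/api/chat) mirrors Ollama's standard API; the model name
# is a placeholder. Check https://ollama.molodetz.nl for actual details.
HUB_URL = "https://ollama.molodetz.nl"

def build_chat_request(model, messages):
    """Build the URL and JSON body for an Ollama-style chat call."""
    return {
        "url": f"{HUB_URL}/api/chat",
        "body": json.dumps({
            "model": model,
            "messages": messages,
            "stream": False,
        }),
    }

# You would POST req["body"] to req["url"] with your usual HTTP client.
req = build_chat_request("llama3", [{"role": "user", "content": "Hello!"}])
print(req["url"])
```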

I hope you guys like this concept. Donate your server!

See model availability here: https://ollama.molodetz.nl/models

Comments
  • 2
    That is actually really cool. I like resource pooling. I may donate some at some point.

    The model list is currently empty though, broken or just offline?

    The only issue security-wise is that the host can read the completion traffic. Especially since Ollama is open source, you can hack your own client to log all the messages to disk if you want. I built my own Ollama client before; it's a well put together project, so anyone can do it. :D But for things you don't care about, it's always a good idea.
  • 4
    @Hazarth Oh, please check again. It's working nicely now. I'm dogfooding.

    Yes, the hosts have complete insight into what is requested. Not much to do about that, I guess. But since you'll never know who the requester is, what does it matter?

    I would be very happy with donations! :) Imagine if a lot of people did this, we could make AI 'free' for everyone :).

    Resource pooling, nice term. Should've used that.
  • 2
    @Hazarth Same problem as all providers TBH (e.g. OpenAI, Claude)

    @retoor Who knows, maybe you'll end up with a sponsor and get a nice server with a nice GPU. I think it would be cool for there to be some form of communal thing like this. Maybe reach out to the Ollama/Hugging Face people as well?