Hi @<1577106212921544704:profile|WickedSquirrel54>
We are self hosting it using Docker Swarm
Nice!
and were wondering if this is something that the community would be interested in.
Always!
what did you have in mind? I have to admit I'm not familiar with the latest in Docker Swarm, but we all love Docker the product and the company
We would "donate" back to the community a docker stack template that can be used to set up the community edition. We have been running this on our Hetzner Docker Swarm for something of a year or so now? Not much in terms of HA for mongodb or Elastic Search, but we get "high enough" availability with this setup.
Also, I wanna say that ClearML is by far the least insane offering in the MLOps space. And that's a good thing. No weird marriage between Kubernetes and the platform just because. I applaud that design decision
We would "donate" back to the community a docker stack template that can be used to set up the community edition.
Perfect, feel free to PR to the clearml-server repository, we can take it from there
🙏 🙏 😍
I just realized that we didn't deploy the fileserver at all in our deployment and instead enforce usage of S3, as we didn't see the need for it. We might have to clarify that in the stack then
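For reference, the client-side part of enforcing S3 is basically just pointing default_output_uri at a bucket in clearml.conf, roughly like this (bucket, region and path are placeholders, and the exact keys are worth verifying against the ClearML config reference):
```
# clearml.conf (client side) -- rough sketch
sdk {
  development {
    # send all task artifacts/models to S3 instead of the fileserver
    default_output_uri: "s3://my-clearml-bucket/projects"
  }
  aws {
    s3 {
      region: "eu-central-1"
      # credentials can also come from the usual AWS env vars / instance profile
      key: ""
      secret: ""
    }
  }
}
```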
yeah, I was just wondering why we did that and whether other people might want to do the same
That really depends on how much data you have there, and on the setup. The upside of the fileserver is that you do not need to worry about credentials; the downside is that storage is more expensive
Hey @<1577106212921544704:profile|WickedSquirrel54> , I would definitely be interested in this. A gist would be cool too
What I could do until we have the time to do this properly:
I could dump the whole thing, but with all our credentials removed. As I said, we removed some things, and there are some newer components we are not using yet