Hi @<1599211868868579328:profile|StalePigeon24> , you do need to have a ClearML server running to use the serving. However, ClearML is super modular, so you don't really need the other components in order to use serving: it's enough to spin the server up and have a model registered to start serving it.
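For reference, the minimal flow looks roughly like this (a sketch based on the clearml-serving README; the `--engine`, `--endpoint`, `--name`, and `--project` values are placeholders, and flags may differ between versions):

```shell
# Install the serving CLI (sketch; see the clearml-serving README)
pip install clearml-serving

# Create a serving service (a control task) on the ClearML server
clearml-serving create --name "serving example"

# Register an already-uploaded model as a serving endpoint
clearml-serving --id <service_id> model add \
    --engine sklearn \
    --endpoint "test_model_sklearn" \
    --name "my model" --project "my project"
```

After that, spinning up the serving docker compose stack pointed at the server is enough to get inference requests flowing.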
What difficulties did you run into with the ClearML server?
Oh, the other issue I had was that the setup used localhost, and I had to change it to my real hostname so that name resolution would work across the different docker compose stacks (core, serving).
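Concretely, that means editing the serving stack's env file along these lines (a sketch; the variable names come from the clearml-serving example env file, and `my-clearml-host` plus the ports are assumptions you'd adapt to your setup):

```shell
# Serving stack env file (sketch): replace localhost with a hostname
# that containers in the serving compose stack can actually resolve,
# e.g. the host machine's real hostname or LAN IP.
CLEARML_WEB_HOST=http://my-clearml-host:8080
CLEARML_API_HOST=http://my-clearml-host:8008
CLEARML_FILES_HOST=http://my-clearml-host:8081
```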
I’d really prefer it was modular enough to use serving with any model registry
Oh that's interesting. To serve a model from MLflow, would you have to copy it over to ClearML first?
The docs are also a little confusing, in that part of them has you set up serving via docker compose, but the tutorial launches serving in a standalone container.
The install was easy. I really like the local dev setup. The only issue was port conflicts; not sure why serving and the core server are both set to use port 8080.
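If both stacks do end up on one host, remapping the published port of the serving container in its compose file avoids the clash (a sketch; the service name is taken from the clearml-serving compose example, and 9090 is an arbitrary free port):

```yaml
# docker-compose override sketch: publish serving on host port 9090
# instead of 8080, so it doesn't collide with the ClearML web server.
services:
  clearml-serving-inference:
    ports:
      - "9090:8080"   # host:container - the container still listens on 8080
```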
The assumption is that the server and serving don't run on the same machine: the ClearML server is just a control plane, whereas the serving solution actually does the computation.