FiercePenguin76 Thanks! That's great input! If you're around tomorrow, feel free to ask us questions in our community talk! We'd be happy to discuss 🙂
We are just entering the research phase for a centralized serving solution. The main reasons against clearml-serving triton are: 1) no support for Kafka, 2) no support for shadow deployments (both are supported by Seldon, which is currently the best-looking option for us).
FiercePenguin76 JitteryCoyote63 are you guys using clearml-serving triton at the moment? If not, we'd be happy to hear what the barriers to usage are 🙂
JitteryCoyote63 Fair point 🙂 I'd be lying if I said we haven't been slow on documenting new features 🙂 That being said, since what you're looking for seems REALLY straightforward (at least to people who know how it works internally 🙂 ), we can probably do something about it rather quickly 🙂
As for your question, yes, our effort was diverted into other avenues and not a lot of public progress has been made.
That said, what is your plan for integrating the two tools? automatically promote models to be served from within clearml?
Hi Jevgeni! September is always a slow month in Israel as it's holiday season 🙂 So progress is slower than usual and we didn't have an update!
Next week we'll hold the next community talk and publish a new version of the roadmap; a separate message will follow.
e.g., replace a model in staging Seldon with this model from ClearML; push this model to prod Seldon, but in shadow mode
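For reference, a minimal sketch of what that semi-manual promotion could look like, assuming the ClearML Python SDK for the model lookup and the Kubernetes Python client for editing the SeldonDeployment; the project, model, deployment, and namespace names below are made up:

```python
# Hypothetical sketch: take the latest published model from ClearML and expose it
# as a shadow predictor on an existing SeldonDeployment (all names are placeholders).
from clearml import Model
from kubernetes import client, config

# 1. Find the newest published version of the model in ClearML.
model = Model.query_models(
    project_name="my-project", model_name="my-model", only_published=True
)[0]

# 2. Fetch the existing SeldonDeployment and add/replace a shadow predictor
#    that points at the ClearML model artifact.
config.load_kube_config()
api = client.CustomObjectsApi()
dep = api.get_namespaced_custom_object(
    group="machinelearning.seldon.io",
    version="v1",
    namespace="prod",
    plural="seldondeployments",
    name="my-seldon-deployment",
)
shadow_predictor = {
    "name": "shadow",
    "shadow": True,  # Seldon mirrors live traffic here without returning its responses
    "replicas": 1,
    "graph": {
        "name": "classifier",
        "implementation": "TRITON_SERVER",
        # In practice this needs to point at a Triton-style model repository,
        # not just the raw weights file from ClearML.
        "modelUri": model.url,
    },
}
dep["spec"]["predictors"] = [
    p for p in dep["spec"]["predictors"] if p["name"] != "shadow"
] + [shadow_predictor]
api.replace_namespaced_custom_object(
    group="machinelearning.seldon.io",
    version="v1",
    namespace="prod",
    plural="seldondeployments",
    name="my-seldon-deployment",
    body=dep,
)
```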
AnxiousSeal95 The main reason for me not to use clearml-serving triton is the lack of documentation, tbh 🙂 I am not sure how to make my PyTorch model run there.
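In case it helps while the docs catch up, here's a rough sketch of the step I believe is needed first: Triton's PyTorch backend loads TorchScript models, so the model has to be traced/scripted and registered in ClearML before any Triton-based serving can pick it up (the model, shapes, and names below are just placeholders):

```python
# Hypothetical sketch: export a PyTorch model to TorchScript and register it in
# ClearML so a Triton-based serving setup can later find it by project/name.
import torch
from clearml import Task, OutputModel

task = Task.init(project_name="my-project", task_name="export model")

# Stand-in for your trained torch.nn.Module.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 224 * 224, 10))
model.eval()

# Triton's PyTorch backend expects TorchScript, not a pickled nn.Module.
example_input = torch.randn(1, 3, 224, 224)  # adjust to your model's input shape
scripted = torch.jit.trace(model, example_input)
torch.jit.save(scripted, "model.pt")

# Register the TorchScript file in the ClearML model registry and publish it.
output_model = OutputModel(task=task, framework="pytorch")
output_model.update_weights(weights_filename="model.pt")
output_model.publish()
```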
I am also interested in the clearml-serving part 🙂
In the far future, automatically. In the near future, more like semi-manually.
Thanks FiercePenguin76
We will update the roadmap and go into detail at the next community talk (a week from now, I think).
Regarding clearml-serving, yes! We are actively working on it internally, but we would love to get some feedback. I think AnxiousSeal95 would appreciate it 🙂
automatically promote models to be served from within clearml
Yes!