Answered
Hey, ClearML Team! When Can We Expect an Updated Roadmap? Last One Is From August

Hey, ClearML team! When can we expect an updated roadmap? The last one is from August: https://docs.google.com/document/d/1QlPiDO2EzDq_HRvuVhYwPh5F_IsjgmmRTlPNdHik10k/edit
I’m specifically interested in the clearml-serving part - the repo has been stale for ~4 months. We are currently evaluating Seldon/KServe (aka KFServing) as a serving engine, and I would be glad to test it together with clearml-serving.

  
  
Posted 3 years ago

Answers 13


automatically promote models to be served from within clearml

Yes!

  
  
Posted 3 years ago

As for your question: yes, our effort was diverted into other avenues, and not a lot of public progress has been made.
That said, what is your plan for integrating the tools? Automatically promote models to be served from within ClearML?
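To make that idea concrete, here is a minimal, hypothetical sketch of what "promote a model from within ClearML" could look like with the ClearML SDK, assuming a tag-based convention. The project, model, and tag names are placeholders, and this is not an existing clearml-serving mechanism:

```
# Hypothetical promotion hook: pick a published model carrying a
# "staging-candidate" tag and hand its weights over to the serving layer.
# Project, model, and tag names are placeholders, not ClearML conventions.
from clearml import Model

candidates = Model.query_models(
    project_name="my-project",       # placeholder project
    model_name="my-classifier",      # placeholder model name
    tags=["staging-candidate"],      # placeholder promotion tag
    only_published=True,
)

if candidates:
    model = candidates[0]            # assume the first match is the one to promote
    print(f"Promoting model {model.id} -> {model.url}")
    # at this point one would update the serving deployment
    # (Seldon, clearml-serving, ...) to point at model.url
    weights_path = model.get_local_copy()
```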

  
  
Posted 3 years ago

That would be amazing!

  
  
Posted 3 years ago

We are just entering the research phase for a centralized serving solution. The main reasons against clearml-serving triton are: 1) no support for Kafka, 2) no support for shadow deployments (both of these are supported by Seldon, which is currently the best-looking option for us)

  
  
Posted 3 years ago

e.g., replace a model in staging Seldon with this model from ClearML; push this model to prod Seldon, but in shadow mode
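For illustration only, a sketch of what the "prod Seldon, but in shadow mode" step could look like: rendering a Seldon Core SeldonDeployment manifest whose shadow predictor points at the model artifact taken from ClearML. The names, server implementation, and modelUri values are assumptions, not an integration that exists today:

```
# Hypothetical: render a SeldonDeployment with a shadow predictor that serves
# the ClearML model; the manifest would then be applied with kubectl or the
# Kubernetes Python client. All names and URIs below are placeholders.
import yaml

def shadow_manifest(model_uri: str, name: str = "my-model") -> str:
    manifest = {
        "apiVersion": "machinelearning.seldon.io/v1",
        "kind": "SeldonDeployment",
        "metadata": {"name": name},
        "spec": {
            "predictors": [
                {   # current production predictor keeps serving live traffic
                    "name": "main",
                    "replicas": 1,
                    "traffic": 100,
                    "graph": {
                        "name": "model",
                        "implementation": "TRITON_SERVER",
                        "modelUri": "gs://bucket/current-prod-model",  # placeholder
                    },
                },
                {   # shadow predictor only receives mirrored traffic
                    "name": "shadow",
                    "replicas": 1,
                    "shadow": True,
                    "graph": {
                        "name": "model",
                        "implementation": "TRITON_SERVER",
                        "modelUri": model_uri,  # e.g. the ClearML model's URL
                    },
                },
            ]
        },
    }
    return yaml.safe_dump(manifest)

print(shadow_manifest("s3://clearml-artifacts/my-project/model.pt"))
```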

  
  
Posted 3 years ago

Thanks FiercePenguin76
We will update the roadmap and go into details at the next community talk (a week from now, I think).
Regarding clearml-serving: yes! We are actively working on it internally, but we would love to get some feedback. I think AnxiousSeal95 would appreciate it 😉

  
  
Posted 3 years ago

In the far future, automatically. In the nearest future, more like semi-manually.

  
  
Posted 3 years ago

I am also interested in the clearml-serving part 😄

  
  
Posted 3 years ago

FiercePenguin76 Thanks! That's great input! If you're around tomorrow, feel free to ask us questions in our community talk! We'd be happy to discuss 😄

  
  
Posted 3 years ago

JitteryCoyote63 Fair point 😅, I'd be lying if I said we haven't been slow on documenting new features 🙂 That being said, since what you're looking for seems REALLY straightforward (at least to people who know how it works internally 😛), we can probably do something about it rather quickly 🙂

  
  
Posted 3 years ago

AnxiousSeal95 The main reason for me not to use clearml-serving triton is the lack of documentation, tbh 😄 I am not sure how to make my PyTorch model run there.
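Not a substitute for the missing docs, but a rough sketch of the usual prerequisite for serving a PyTorch model on Triton: export it to TorchScript and register the artifact in ClearML so the serving side can find it. The network, project, and task names below are placeholders:

```
# Sketch only: Triton's PyTorch backend serves TorchScript, not a raw
# nn.Module, so the model is traced and the weights registered in ClearML.
# Names are placeholders; this is not the official clearml-serving flow.
import torch
from clearml import Task, OutputModel

class MyNet(torch.nn.Module):          # stand-in for the real network
    def forward(self, x):
        return torch.relu(x)

model = MyNet().eval()
example = torch.randn(1, 3, 224, 224)

# Export to TorchScript (what Triton's pytorch_libtorch backend expects)
scripted = torch.jit.trace(model, example)
scripted.save("model.pt")

# Register the weights in ClearML so they are addressable by id/URL
task = Task.init(project_name="my-project", task_name="export-for-serving")
OutputModel(task=task, framework="PyTorch").update_weights(weights_filename="model.pt")
```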

  
  
Posted 3 years ago

FiercePenguin76 JitteryCoyote63 are you guys using clearml-serving triton at the moment? If not, we would be happy to hear what the barriers to usage are 🙂

  
  
Posted 3 years ago

Hi Jevgeni! September is always a slow month in Israel as it's holiday season 🙂 So progress is slower than usual and we didn't have an update!
Next week will be the next community talk and the publishing of the next version of the roadmap; a separate message will follow.

  
  
Posted 3 years ago