Answered
Hey all! I've gone through the doco and not found anything at the moment, but does ClearML have model versioning and staging (similar to MLflow)?

Hey all! I've gone through the doco and not found anything at the moment, but does ClearML have model versioning and staging (similar to MLflow)? GrumpyPenguin23, this came up in convo before; I know I can grab the last task with a given name and extract the model from that, but is there a way of tagging/versioning models that I've missed, or is it all Task based?
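For reference, a minimal sketch of the workflow I mean by "grab the last task and extract the model" (project/task names here are placeholders for illustration):

```python
from clearml import Task

# Hypothetical project/task names, for illustration only.
task = Task.get_task(project_name="my_project", task_name="train_model")

# Each Task keeps references to its input/output models.
latest_model = task.models["output"][-1]

# Download the weights file locally so the framework code can load it.
weights_path = latest_model.get_local_copy()
print(weights_path)
```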

  
  
Posted 3 years ago

Answers 5


Ah fantastic, thanks! Another one for me: is there support for custom Python models at all? For example, dummy models that simply return the output of an equation run over the dataframe after transforming some of the input columns. Something similar to MLflow's custom pyfunc, which gives you a standard way of interfacing with custom models just as you would with Keras/sklearn/PyTorch models.
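To make the kind of "custom model" I mean concrete, here is a purely illustrative sketch (plain Python, no ClearML API involved; all names are made up):

```python
import pandas as pd

class EquationModel:
    """Dummy model: applies a fixed equation to transformed input columns."""

    def __init__(self, coefficient: float = 2.0):
        self.coefficient = coefficient

    def predict(self, df: pd.DataFrame) -> pd.Series:
        # Transform an input column, then evaluate the equation.
        scaled = df["x"].astype(float) / df["x"].abs().max()
        return self.coefficient * scaled + df["offset"]

model = EquationModel(coefficient=3.5)
preds = model.predict(pd.DataFrame({"x": [1, 2, 3], "offset": [0.1, 0.2, 0.3]}))
```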

  
  
Posted 3 years ago

Hi, I think this came up when we discussed the joblib integration, right? We have a model registry, ranging from automatic logging to manual reporting. E.g. https://allegro.ai/clearml/docs/docs/examples/frameworks/pytorch/manual_model_upload.html
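A rough sketch of the manual-reporting side, along the lines of the linked example (project, task and file names are placeholders):

```python
from clearml import Task, OutputModel

task = Task.init(project_name="examples", task_name="manual model upload")

# Register a weights file with the model registry, attached to this task.
# Tags can be passed at creation time and used later to look the model up.
output_model = OutputModel(task=task, framework="PyTorch",
                           name="my_model", tags=["staging"])
output_model.update_weights(weights_filename="model.pt")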

  
  
Posted 3 years ago

Hi LudicrousParrot69
Not sure I follow, is this pyfunc running remotely?
Or are you looking for interfacing with previously executed Tasks?

  
  
Posted 3 years ago

LudicrousParrot69
I "think" I have a better handle on what you wish to do.
Is it a kind of generic "serving" solution?
FYI:
A model artifact is usually a weights/model file; the idea is that later you will be able to access it and serve it. Now the problem is (and I think this is what you are referring to) that there is usually a specific piece of code tied to that model that knows how to use it (a.k.a. the pyfunc).
A few ideas:
1. These days everyone is trying to build their models with a generic interface, so that scikit-learn will be able to serve any model it was storing (TF Serving and PyTorch TorchScript are of a similar nature). If this is the case, the Model's framework field could be used to detect which of these frameworks should be used (this could actually be done at runtime).
2. You could pickle the function itself and store it as a second artifact (basically upload_artifact can auto_pickle it for you). That said, pickling is quite fragile and you have to have all the function's dependencies available in order to unpickle it.
WDYT?
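A small sketch of the second idea, assuming the usual pickle caveats (the function's module and dependencies must be importable on the consumer side); names are illustrative only:

```python
from clearml import Task

def preprocess_and_predict(df, model):
    # Whatever glue code the model needs (the "pyfunc"-style piece).
    return model.predict(df[["x", "offset"]])

task = Task.init(project_name="examples", task_name="store custom predictor")
# upload_artifact pickles non-file objects by default (auto_pickle=True).
task.upload_artifact(name="predict_fn", artifact_object=preprocess_and_predict)

# Later, from a different process/task:
producer = Task.get_task(task_id=task.id)
predict_fn = producer.artifacts["predict_fn"].get()  # unpickles the function
```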

BTW: on a different note, the auto-archive of the HPO will probably be in the next version, due in a few days 😉

  
  
Posted 3 years ago

Yeah, it's about trying to plan down the line for model deployment. Whilst it's easy to save out a Keras SavedModel or similar and have that artifact uploaded into the store, I just wanted to check if there was a more generic solution. I could just create a Python class and serialise that out so that it has a standard interface, but good to check. So for example, some artifact representing an arbitrary math function. For better context, the idea is to make deploying any artifact we upload using ClearML as easy as possible. Back in my last project, which was airflow+mlflow, all models were executable using a standard interface (pyfunc), and making a custom model which was interfaced with in the same way as sklearn/keras, and thus deployed/served the same way, was done by extending PythonModel ( https://www.mlflow.org/docs/latest/models.html#example-creating-a-custom-add-n-model ). I'm trying to get the hang of how to do similar things with ClearML, and have been over the docs in clearml.model.Model, but this doesn't seem to be what I want, which is to be able to get a Task's model and run using it, with bonus points if I don't have to care about what the model itself is.
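Roughly what I'm picturing, combining the framework field with the pickled-function idea from above (all names are placeholders, and this is just a hand-rolled dispatch, not an official ClearML serving API):

```python
from clearml import Task

producer = Task.get_task(project_name="examples", task_name="train_model")

model = producer.models["output"][-1]
weights_path = model.get_local_copy()

# The model's framework field can drive a simple loader dispatch...
if model.framework == "PyTorch":
    import torch
    loaded = torch.jit.load(weights_path)  # assumes a TorchScript export
elif model.framework == "ScikitLearn":
    import joblib
    loaded = joblib.load(weights_path)
else:
    raise ValueError(f"No loader registered for framework {model.framework!r}")

# ...and a pickled predict function (if one was uploaded as an artifact)
# hides whatever pre/post-processing the model needs.
predict_fn = producer.artifacts["predict_fn"].get()
```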

  
  
Posted 3 years ago