Answered

Hey all. Is there a best practice approach to deploying models trained in clearml? Does anyone have a standard workflow that they employ?

  
  
Posted 3 years ago

Answers 5


Is this information stored anywhere or do I need to explicitly log this data somehow?

On the creating Task, alongside all the other reports.
Basically, each model stores a reference to its creating Task (the Task ID); using that Task ID you can query all the metrics reported by the task.
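
A minimal sketch of that lookup in Python, assuming the clearml package is installed; the model ID is a placeholder, and model.task / get_last_scalar_metrics() are the calls I would expect to use here:

from clearml import Model, Task

# Look up a registered model by its ID (placeholder ID, replace with a real one)
model = Model(model_id="<your-model-id>")

# Each model keeps the ID of the task that created it
creating_task = Task.get_task(task_id=model.task)

# Query the scalar metrics reported by that task (e.g. accuracy)
metrics = creating_task.get_last_scalar_metrics()
print(metrics)  # nested dict: {title: {series: {"last": ..., "min": ..., "max": ...}}}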

  
  
Posted 3 years ago

While we're here, how can I return the model accuracy (or any performance metric, for that matter) given a model or models belonging to a particular task? Is this information stored anywhere, or do I need to explicitly log this data somehow?

  
  
Posted 3 years ago

Hi TenseOstrich47
You can check out the new clearml-serving, and the new Python interfaces added to the "Model" class.
https://github.com/allegroai/clearml/blob/22d795f68f0175ba9511cabd444ea4dba464f3cd/clearml/model.py#L444
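
For example, a hedged sketch of querying models through that interface, assuming your clearml version includes Model.query_models (the area of model.py the link points to); the project name and tag below are illustrative:

from clearml import Model

# Query published models in a project, filtered by tag
models = Model.query_models(
    project_name="my_project",   # assumed project name
    tags=["production"],         # assumed tag applied at training time
    only_published=True,
)

for m in models:
    print(m.id, m.name, m.tags)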

  
  
Posted 3 years ago

In particular, I am trying to find a neat way to query all available models and use tags to know the context. As it stands, I log the model accuracies/RMSEs as part of the metadata, alongside the training data filepath. The issue is that there is no neat way to query models across tasks without a lot of laborious manual lifting. Suggestions welcome.
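
One possible pattern (a sketch, not an official recommendation): tag the output model and report performance as scalars at training time, so both can be queried later without digging through per-model metadata. All names below are illustrative.

from clearml import Task, OutputModel

task = Task.init(project_name="my_project", task_name="train_model")

# Report performance as scalars so they are queryable from the creating task
task.get_logger().report_scalar(title="eval", series="rmse", value=0.42, iteration=0)

# Register the trained weights with tags that carry the context
output_model = OutputModel(task=task, tags=["production", "dataset-v2"])
output_model.update_weights(weights_filename="model.pkl")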

  
  
Posted 3 years ago

Thanks maestro. Will give this a go

  
  
Posted 3 years ago