Answered
What Is The Difference Between Model And InputModel?

What is the difference between Model and InputModel?

  
  
Posted one year ago

Answers 9


Hi @<1523704157695905792:profile|VivaciousBadger56> , you can read about the differences here:

  
  
Posted one year ago

@<1523701070390366208:profile|CostlyOstrich36> I read those, but did not understand.

  
  
Posted one year ago

As far as I understand, the workflow is like this. I define some model. Then I register it as an OutputModel. Then I train it. During training I save snapshots (no idea how, though) and then I save the final model when training is finished. This way the Model is a) connected to the task and b) available in the model store of ClearML.

Later, in a different task, I can load an already trained model with InputModel. This InputModel is read-only (regarding the ClearML model store), but I can make a copy on which I work and, e.g., further train. So, as I understand it, I get the model with InputModel from the model store, make a copy, register the copy with my new task as an OutputModel, and then go on as I did in the last paragraph (roughly as in the sketch below).
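
Roughly, in code, I imagine the two halves looking something like this (just a sketch; the project, file, and model names are placeholders I made up, not anything from our setup):

```python
from clearml import Task, OutputModel, InputModel


def train():
    # Training task: register an OutputModel and upload the final weights
    task = Task.init(project_name="PoC", task_name="train")
    output_model = OutputModel(task=task, framework="PyTorch")
    # ... training loop, writing checkpoints to disk ...
    output_model.update_weights(weights_filename="final_model.pt")  # uploads and registers the weights
    task.close()


def fine_tune():
    # Later task: load the trained model as a read-only InputModel
    task = Task.init(project_name="PoC", task_name="fine-tune")
    input_model = InputModel(model_id="<model-id-from-the-ui>")
    task.connect(input_model)  # listed as the task's input model, overridable from the UI
    weights_path = input_model.get_local_copy()  # local copy of the weights to continue training from
    # ... further training would register its own OutputModel, as in train() ...
    task.close()
```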

No idea how the Model comes into play here. Let me compare InputModel with Model:

class Model()
Represent an existing model in the system, search by model ID. The Model will be read-only and can be used to pre initialize a network.

class InputModel()
Load an existing model in the system, search by model ID. The Model will be read-only and can be used to pre initialize a network. We can connect the model to a task as input model, then when running remotely override it with the UI.
Load a model from the Model artifactory, based on model_id (uuid) or a model name/projects/tags combination.

Sounds just the same to me. What is the difference, @<1523701070390366208:profile|CostlyOstrich36> ?

  
  
Posted one year ago

Also, I could not find any larger examples on github about Model, InputModel, or OutputModel. It's kind of difficult to build a PoC this way... 😅

  
  
Posted one year ago

I think that Model is used to do general actions as allowed by the SDK. InputModel is for an easier interface when working with the Task object directly.
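
Something like this, roughly (the model IDs and project names below are placeholders):

```python
from clearml import Model, InputModel, Task

# Model: general, task-agnostic access to an entry in the model registry
generic = Model(model_id="<some-model-id>")
weights = generic.get_local_copy()  # just fetch the weights, no task involved

# InputModel: the same read-only access, but meant to be attached to a task,
# so it shows up as the task's input and can be overridden from the UI
task = Task.init(project_name="PoC", task_name="uses-input-model")
input_model = InputModel(model_id="<some-model-id>")
task.connect(input_model)  # registers it as the task's input model
```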

What is your use case?

  
  
Posted one year ago

@<1523701070390366208:profile|CostlyOstrich36> , I am building a PoC, evaluating whether we should use ClearML for our entire ML team and go for Scale or Enterprise pricing. For that I need to know all/most capabilities and concepts of ClearML to see if ClearML is future-proof.

TL;DR: difficult to narrow it down, but (amongst other things) we need a model store.

  
  
Posted one year ago

@<1523704157695905792:profile|VivaciousBadger56> , ClearML's model repository is exactly for that purpose. You can basically use InputModel and OutputModel for handling models in relation to tasks.
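
For example, a minimal sketch of using it as a model store (the project and tag names are made up):

```python
from clearml import Model

# Query the repository for published models in a project
models = Model.query_models(project_name="PoC", tags=["production"], only_published=True)
for m in models:
    print(m.id, m.name)

# Pull the weights of the first hit for evaluation or serving
weights_path = models[0].get_local_copy()
```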

  
  
Posted one year ago

@<1523701070390366208:profile|CostlyOstrich36> : Thanks, where can I find more information on ClearML's model repository? I can hardly find any in the documentation.

Also, that leaves open the question of what Model is for. I described how I understand the workflow should look, but my question remains open...

  
  
Posted one year ago

@<1523701070390366208:profile|CostlyOstrich36> : After more playing around, it seems that the ClearML Server does not store the models or artifacts itself. These are stored somewhere else (e.g., an AWS S3 bucket) or on my local machine, and the ClearML Server only stores configuration parameters and previews (e.g., when the artifact is a pandas DataFrame). Is that right? Is there a way to save the models completely on the ClearML server?
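
For context, this is roughly how I understand the upload destination is chosen in my experiments (the values are placeholders; please correct me if this is wrong):

```python
from clearml import Task

# output_uri controls where model weights/artifacts are uploaded:
# True -> the ClearML file server, "s3://bucket/path" -> that bucket,
# left unset -> files stay on my local machine and only metadata is registered.
task = Task.init(
    project_name="PoC",
    task_name="train",
    output_uri=True,
)
```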

  
  
Posted one year ago