Currently, I'm Manually Tracking Multiple Output Models at the End of the Training Process

Currently, I'm manually tracking multiple output models at the end of the training process:
from pathlib import Path
from clearml import OutputModel

model_paths = list(Path(checkpoint_dir).absolute().glob('*'))
for model_path in model_paths:
    # Register each checkpoint as its own output model on the task
    output_model = OutputModel(task=task, name=f'{args.task_name}-{model_path.stem}', framework='PyTorch')
    output_model.update_weights(weights_filename=str(model_path), target_filename=model_path.stem)
    # task.update_output_model(model_path=str(model_path),
    #                          model_name=f'{args.task_name}-{model_path.stem}')

The problem with this approach is that when I publish my experiment, only one of the logged models (the first or the last OutputModel, I'm not sure which) is published; the others remain Draft.
Is this a feature of ClearML? If not, how could I fix it? Thanks!

Posted 2 years ago

Answers 5


I see... Well, currently this will only publish the latest model. You can publish the other models programmatically using something like:
for model in task.models.output[:-1]:
    model.publish()
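
For reference, a minimal standalone sketch of the same idea, assuming the experiment is fetched by its task ID (the ID below is a placeholder):

from clearml import Task

# Placeholder: replace with the ID of the published experiment
task = Task.get_task(task_id='<your-task-id>')

# The last output model is published together with the task itself,
# so only the earlier ones (still Draft) need to be published explicitly.
for model in task.models.output[:-1]:
    model.publish()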

Posted 2 years ago

Hi GrittyKangaroo27,
By default, ClearML publishes only the last output model when publishing a task - the reasoning is that this is the latest model and represents the final result.
Do you publish your task programmatically (i.e., by calling task.publish())?
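
For reference, publishing a task from code looks roughly like this (a sketch; the task ID is a placeholder):

from clearml import Task

task = Task.get_task(task_id='<your-task-id>')
task.publish()  # publishes the task; the latest output model is published with it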

Posted 2 years ago

If the use-case is important to you, please open a GitHub issue with a feature request 🙂

Posted 2 years ago

I publish the task through the UI.

Posted 2 years ago

Thanks!

Posted 2 years ago