Hi there, I have a batch prediction Task that loads a model published on ClearML.

Hi there,
I have a batch prediction Task that loads a model published on ClearML:
input_model = InputModel(model_id=model_id)
model_path = input_model.get_local_copy()
The problem is that every time the batch prediction finishes, ClearML uploads the model again (see the image).
This creates a lot of entries in the models page.
How can I avoid this?

Posted one year ago

Answers 2

Hi IrritableGiraffe81
Can you share a code snippet?
Generally I would try
task = Task.init(..., auto_connect_frameworks={'pytorch': False, 'tensorflow': False})
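Putting the suggestion together with the original question, a minimal sketch might look like the following. The project/task names and the model ID placeholder are hypothetical; the key point is that disabling the framework bindings stops ClearML from auto-registering a new output model each time the prediction script loads weights.

```python
from clearml import Task, InputModel

# Disable PyTorch/TensorFlow auto-logging so that loading the model
# for inference does not register a fresh output model on the task.
task = Task.init(
    project_name="batch-prediction",   # hypothetical names
    task_name="predict",
    auto_connect_frameworks={"pytorch": False, "tensorflow": False},
)

# Reuse the already-published model instead of uploading a new one.
input_model = InputModel(model_id="<your-model-id>")
model_path = input_model.get_local_copy()
```

Note that this disables all automatic model logging for those frameworks on this task, so if the task should also record genuinely new models, they would need to be reported explicitly (e.g. via OutputModel).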

Posted one year ago

Thanks Martin, your suggestion solves the problem.

Posted one year ago