Answered
I Have An Inference Task In Clearml Where I Apply A Model (Defined In Input Params) To A Dataset. Clearml Registers The Model As An Input Model, Which Is Nice. But When I Clone The Task And Modify Input Param To Apply Another Model To The Same Dataset, Th

I have an inference task in ClearML where I apply a model (defined in the input params) to a dataset. ClearML registers the model as an input model, which is nice. But when I clone the task and modify the input param to apply another model to the same dataset, everything is done correctly except that the model from the original task, and not the new one, is registered as the input model. Is it a bug or a feature? Is there an easy workaround (e.g. cleaning all of the input models at the start of the task)?

  
  
Posted one year ago

Answers 15


no, I’m providing the id of the task which generated the model as a “hyperparam”

  
  
Posted one year ago

FiercePenguin76 Are you changing the model by pressing the circled button in the first photo? Are you prompted with a menu like in the second photo?

  
  
Posted one year ago

I had a bunch of training tasks, each of which outputted a model. I want to apply each one of them to a specific dataset. I have a ClearML task ( apply_model ) for that, which takes dataset_id and a model-producing task_id as input. The first time, I initiate apply_model by hardcoding the ids and starting the run from my machine (it then goes into the cloud when it reaches execute_remotely ).
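For reference, a minimal sketch of what such an apply_model task could look like; the project, queue and parameter names here are assumptions, not taken from this thread:

from clearml import Dataset, Task

task = Task.init(project_name="inference", task_name="apply_model")

# Hardcoded defaults for the first local run; cloned runs override these in the UI.
params = {"dataset_id": "<dataset-id>", "model_task_id": "<training-task-id>"}
params = task.connect(params)  # registered as hyperparameters (by default under "General")

# Up to here the code runs locally; the task is then enqueued and continues in the cloud.
task.execute_remotely(queue_name="default")

# Resolve the dataset and the model produced by the given training task.
dataset_path = Dataset.get(dataset_id=params["dataset_id"]).get_local_copy()
train_task = Task.get_task(task_id=params["model_task_id"])
model_path = train_task.models["output"][0].get_local_copy()
# ... load the model from model_path and run inference on dataset_path ...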

  
  
Posted one year ago

in cloned tasks, the correct model is being applied, but the original one stays registered as the input model

  
  
Posted one year ago

thanks for all those details. I will try to reproduce it and keep you updated 🙂

  
  
Posted one year ago

all subsequent invocations are done by cloning this task in the UI and changing the model task_id
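For completeness, the same clone-and-override flow can also be sketched programmatically; the parameter key and queue name below are assumptions:

from clearml import Task

template = Task.get_task(project_name="inference", task_name="apply_model")
cloned = Task.clone(source_task=template, name="apply_model (other model)")
cloned.set_parameter("General/model_task_id", "<other-training-task-id>")
Task.enqueue(cloned, queue_name="default")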

  
  
Posted one year ago

MotionlessCoral18 If you provide the model as a hyperparam, then I believe you should query its value by calling https://clear.ml/docs/latest/docs/references/sdk/task/#get_parameters or https://clear.ml/docs/latest/docs/references/sdk/task/#get_parameter
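A minimal sketch of that suggestion, assuming the model task id is stored as a hyperparameter named "General/model_task_id" (the exact key is an assumption):

from clearml import Task

task = Task.current_task()

# Fetch a single value...
model_task_id = task.get_parameter("General/model_task_id")

# ...or fetch all parameters and pick the one you need.
all_params = task.get_parameters()
model_task_id = all_params.get("General/model_task_id")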

  
  
Posted one year ago

clearml==1.5.0
WebApp: 1.5.0-192 Server: 1.5.0-192 API: 2.18

  
  
Posted one year ago

FiercePenguin76 Looks like there is actually a bug when loading models remotely. We will try to fix this asap

  
  
Posted one year ago

I tried this, but it didn’t help:

input_models = current_task.models["input"]
if len(input_models) == 1:
    input_model_as_input = {"name": input_models[0].name, "type": ModelTypeEnum.input}
    response = current_task.send(
        DeleteModelsRequest(task=current_task.task_id, models=[input_model_as_input])
    )

  
  
Posted one year ago

SmugDolphin23 sorry I don’t get how this will help with my problem

  
  
Posted one year ago

also, I don’t see an edit button near input models

  
  
Posted one year ago

I am not registering a model explicitly in apply_model. I guess it is done automatically when I do this:

output_models = train_task_with_model.models["output"]
model_descriptor = output_models[0]
model_filename = model_descriptor.get_local_copy()
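One possible workaround sketch (not confirmed in this thread): explicitly connect the model you are about to use, so the cloned task registers it as its input model. It assumes InputModel can be built from an existing model id, and reuses train_task_with_model from the snippet above:

from clearml import InputModel, Task

current_task = Task.current_task()
model_descriptor = train_task_with_model.models["output"][0]

# Wrap the existing model and register it as this task's input model.
input_model = InputModel(model_id=model_descriptor.id)
current_task.connect(input_model)
model_filename = input_model.get_local_copy()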

  
  
Posted one year ago

I guess you can easily reproduce it by cloning any task which has an input model - logs, hyperparams etc. are being reset, but the input model stays.

  
  
Posted one year ago

hi FiercePenguin76
Can you also send your clearml package versions?
I would like to sum your issue up, so that you can check I got it right:

- you have a task that has a model, which you use to make some inference on a dataset
- you clone the task, and would like to make inferences on the dataset, but with another model
- the problem is that you end up with a cloned task that still has the first model

How have you registered the second model? Also, can you share your logs?

  
  
Posted one year ago