Answered
Hi, I have one doubt related to pipelines. I have one pipeline with, e.g., 3 tasks: preprocess, train and test. Now I want to clone the pipeline and change the hyperparameters of the train task. Is it possible? If so, how?

Hi, I have one doubt related to pipelines.
I have one pipeline with, e.g., 3 tasks: preprocess, train and test.

Now I want to clone the pipeline and change the hyperparameters of the train task. Is it possible? If so, how?

  
  
Posted one year ago

Answers 12


Hi @<1585078763312386048:profile|ArrogantButterfly10>

Now I want to clone the pipeline and change the hyperparameters of the train task. Is it possible? If so, how?

The pipeline arguments are for the pipeline DAG/logic; you need to pass one of them as an argument to the training step/task. Makes sense?
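For reference, a minimal sketch of that pattern (the project, task names and the 'epochs' parameter below are placeholders, not the actual ones from this pipeline): expose a pipeline-level argument with add_parameter and forward it into the training step via parameter_override.

from clearml import PipelineController

# Sketch only: project, task names and the parameter are placeholders.
pipe = PipelineController(name='my_pipeline', project='my_project', version='1.0.0')

# Pipeline-level argument, editable in the UI when the pipeline is cloned
pipe.add_parameter('epochs', 10)

pipe.add_step(
    name='train',
    base_task_project='my_project',
    base_task_name='train task',
    # forward the pipeline argument into the training Task's hyperparameters
    parameter_override={'General/epochs': '${pipeline.epochs}'},
)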

  
  
Posted one year ago

@<1523701205467926528:profile|AgitatedDove14>
ClearML version: 1.12.1

In the pipeline example None, specifically at line 83:

parameter_override={'General/num_boost_round': 250,
                    'General/test_size': 0.5,
                    'General/random_state': random_state}

These are fixed and cannot be changed by the user when the pipeline is cloned, so I am trying to make these parameters dynamic using the .add_parameter method.

So, every task has some hyperparameters,
e.g. task1 takes some int as it will filter out data,
task2 takes some NN-based params like optimizer, activation func.

These params I am passing as a whole, I mean as a dictionary, to my function that sets the parameters.
But apparently when I am cloning the pipeline and giving the inputs, it is not switching to those and keeps running on the original pipeline's inputs.
E.g. in the base task the epoch count was 10; when I cloned the pipeline I set it to 5, but it did not change to 5.

  
  
Posted one year ago

Anyway, this got resolved.

  
  
Posted one year ago

Image 1 shows the original pipeline, image 2 shows the cloned pipeline, and image 3 shows the parameters in the cloned pipeline's run.

  
  
Posted one year ago

@<1585078763312386048:profile|ArrogantButterfly10> could it be that in the "base task" of the pipeline step you do not have any hyperparameters? (I mean the Task that the pipeline clones and is supposed to set new hyperparameters for...)

  
  
Posted one year ago

How are you building your pipeline?

  
  
Posted one year ago

I will check

  
  
Posted one year ago

Hi @<1585078763312386048:profile|ArrogantButterfly10> , can you please try with the new ClearML SDK v1.12.2rc0?

  
  
Posted one year ago

It's resolved.
I was doing two things wrong: defining the params before Task.init, and, while using task.connect(params), giving it a custom name and then trying to set the params using General/param_name.

Thanks for the support.
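For anyone hitting the same issue, a minimal sketch of the corrected base-task pattern described above (the project/task names and parameter values are illustrative): call Task.init first, then connect the dict without a custom section name so the values land under General/ and can be overridden from the pipeline.

from clearml import Task

# Initialize the task first...
task = Task.init(project_name='my_project', task_name='model training')

# ...then define and connect the defaults (no custom name, so they are
# stored under General/ and 'General/epochs' etc. can be overridden).
params = {'epochs': 10, 'learning_rate': 0.001, 'random_state': 2024}
params = task.connect(params)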

  
  
Posted one year ago

Like this... But when I am cloning the pipeline and changing the parameters, it is running on the default parameters given when the pipeline was first run.

Just making sure, you are running the cloned pipeline with an agent, correct?
What ClearML version are you using?
Is this reproducible with the pipeline example?

  
  
Posted one year ago

@<1523701205467926528:profile|AgitatedDove14> I am a bit lost, can you elaborate?

  
  
Posted one year ago

@<1523701205467926528:profile|AgitatedDove14>
I am building the pipeline from tasks and using the pipe.add_parameter method to add the parameters through the UI:

pipe.add_parameter('random_state', 2024)  # model training
pipe.add_parameter('epochs', 10)
pipe.add_parameter('learning_rate', 0.001)

and then overriding the parameters using parameter_override:

pipe.add_step(
    name='model_training',
    parents=['preprocess_data'],
    base_task_project=global_config.PROJECT_NAME,
    base_task_name='model training',
    parameter_override={'General/random_state': '${pipeline.random_state}',
                        'General/epochs': '${pipeline.epochs}',
                        'General/learning_rate': '${pipeline.learning_rate}'}
)

Like this... But when I am cloning the pipeline and changing the parameters, it is running on the default parameters given when the pipeline was first run.
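As a side note on the agent question above, a minimal sketch of how such a controller is typically launched so that a cloned run can later be picked up by an agent (the queue names here are assumptions):

# Steps run on agents listening on the default execution queue,
# and the controller itself is enqueued to the services queue.
pipe.set_default_execution_queue('default')
pipe.start(queue='services')

# For quick local debugging instead:
# pipe.start_locally(run_pipeline_steps_locally=True)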

  
  
Posted one year ago