
I'm following the pipeline controller example... this is the output I get after running the three scripts for step1, step2, and step3, and finally the pipeline_controller.py:

(ml-test) tglema@mvd0000xlrndtl2 pipeline git:(master) ✗ python step1_dataset_artifact.py
ClearML Task: created new task id=a8035abd83f94bf6af1e4e9f9f6e95a4
ClearML results page:

2021-01-27 15:15:16,447 - clearml.Task - INFO - Waiting for repository detection and full package requirement analysis
2021-01-27 15:15:16,477 - clearml.Task - INFO - Finished repository detection and package analysis
2021-01-27 15:15:16,714 - clearml - WARNING - Terminating local execution process
(ml-test) tglema@mvd0000xlrndtl2 pipeline git:(master) ✗ vim step2_data_processing.py
(ml-test) tglema@mvd0000xlrndtl2 pipeline git:(master) ✗ python step2_data_processing.py
ClearML Task: created new task id=6f17d51b00e5434b9e6c5da7983c5de6
ClearML results page:

Arguments: {'dataset_task_id': 'a8035abd83f94bf6af1e4e9f9f6e95a4', 'dataset_url': '', 'random_state': 42, 'test_size': 0.2}
2021-01-27 15:16:17,522 - clearml.Task - INFO - Waiting for repository detection and full package requirement analysis
2021-01-27 15:16:17,535 - clearml.Task - INFO - Finished repository detection and package analysis
2021-01-27 15:16:17,801 - clearml - WARNING - Terminating local execution process
(ml-test) tglema@mvd0000xlrndtl2 pipeline git:(master) ✗ vim step3_train_model.py
(ml-test) tglema@mvd0000xlrndtl2 pipeline git:(master) ✗ python step3_train_model.py
ClearML Task: created new task id=72e5ea40917c4f2d8fb97a741f9ca177
ClearML results page:

2021-01-27 15:17:58,738 - clearml.Task - INFO - Waiting for repository detection and full package requirement analysis
2021-01-27 15:17:58,743 - clearml.Task - INFO - Finished repository detection and package analysis
2021-01-27 15:17:58,980 - clearml - WARNING - Terminating local execution process
(ml-test) tglema@mvd0000xlrndtl2 pipeline git:(master) ✗ python pipeline_controller.py
ClearML Task: created new task id=3a4d1cc089454b0086ab62b79a2374a0
ClearML results page:

Launching step: stage_data
Parameters:
None
ClearML Monitor: Could not detect iteration reporting, falling back to iterations as seconds-from-start
ClearML Monitor: Reporting detected, reverting back to iteration based reporting

and then it just idles there

  
  
Posted 3 years ago

Answers 7


Are those warnings OK?

  
  
Posted 3 years ago

It does! Thanks! I thought I had to modify the scripts, but now I see that it's done with the parameter_override

  
  
Posted 3 years ago

Yep 🙂

  
  
Posted 3 years ago

MagnificentSeaurchin79
Do note that the pipeline controller assumes you have an agent running
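For reference, an agent is typically started with the clearml-agent daemon command; the queue name below is an assumption and should match the controller's default_execution_queue:

clearml-agent daemon --queue default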

  
  
Posted 3 years ago

I missed that part! Sorry.
One question: what do I put in the task_id in the step2 file? Because once it clones the task of step1, the task_id changes, so it no longer points to the actual task that was run.

  
  
Posted 3 years ago

FYI... I am able to run the three tasks by commenting out the task.execute_remotely() lines in each file
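For context, task.execute_remotely() is what produces the "Terminating local execution process" warning in the output above: run locally it registers the task and exits, while under an agent it is a no-op and the script continues. A minimal sketch of a step script, with the project and task names assumed:

from clearml import Task

# Register this script as a ClearML task (environment definition + arguments).
task = Task.init(project_name='examples', task_name='pipeline step 2 data processing')

# Running locally: stores the task definition on the server and terminates
# the local process. Running under an agent: returns immediately, and the
# rest of the script executes as usual.
task.execute_remotely()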

  
  
Posted 3 years ago

Oh, task_id is the Task ID of step 2.
Basically the idea is: you run your code once (let's call that debugging / programming). That run creates a task in the system, and the task stores the environment definition and the arguments used. You can then clone that Task and launch it on another machine using the Agent (which sets up the environment based on the Task definition and runs your code with the new arguments). The Pipeline basically does that for you, i.e. it clones a task, changes its parameters, and enqueues the task for execution.
This means the base_task_id parameter is the Task that the step will clone and enqueue for execution. Make sense?
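For reference, this is roughly what that looks like in the controller script (a minimal sketch in the spirit of the pipeline controller example; the project/task names, the queue, and the overridden parameter name are assumptions):

from clearml.automation.controller import PipelineController

# Each step clones an existing "template" task (created by running the
# step script once) and enqueues the clone for an agent to execute.
pipe = PipelineController(default_execution_queue='default', add_pipeline_tags=False)

pipe.add_step(
    name='stage_data',
    base_task_project='examples',
    base_task_name='pipeline step 1 dataset artifact',
)
pipe.add_step(
    name='stage_process',
    parents=['stage_data'],
    base_task_project='examples',
    base_task_name='pipeline step 2 data processing',
    # Override the clone's arguments so the new step 2 task points at the
    # step 1 task that actually ran in this pipeline, not the template:
    parameter_override={'General/dataset_task_id': '${stage_data.id}'},
)

pipe.start()
pipe.wait()
pipe.stop()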

  
  
Posted 3 years ago