Answered
Hey, I have a pipeline from code, but I have a problem with caching. ClearML didn't cache the already executed steps (I tried to re-run the pipeline from the web UI). Did I miss something?

pipe.add_function_step(
    name='download_data',
    function=download_data_step,
    function_kwargs=dict(
        dataset_version='${pipeline.dataset_version}',
        file_name='${pipeline.file_name}',
    ),
    function_return=['data_path'],
    cache_executed_step=True,
    repo=repo,
    repo_branch=repo_branch,
    working_dir=working_dir
)
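
For context, the ${pipeline.dataset_version} and ${pipeline.file_name} references assume matching pipeline-level parameters. A minimal sketch of how they could be declared (the defaults here are illustrative, not from the original setup):

# Illustrative only: declaring the pipeline parameters referenced above.
pipe.add_parameter(name='dataset_version', default='1.0')
pipe.add_parameter(name='file_name', default='data.csv')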

Also, I have a problem with pipeline execution on my local machine, specifically with imports. It looks like ClearML does not know about the imports inside the tasks (in my example, task1.py imports utils.some_utils_functions.py). If I add the absolute path of utils to sys.path it works, but I do not like this solution. The problem goes away when I execute the pipeline on a remote machine and pass the git repo, branch, and working dir as arguments to add_function_step, but I didn't manage to solve it on the local machine. Here is my directory structure:

tasks/
  utils/
    some_utils_functions.py
  task1.py
  task2.py
pipeline_controller.py

and I start the pipeline locally with:

pipe.start_locally(run_pipeline_steps_locally=True)
  
  
Posted 3 months ago

3 Answers


Hi @<1702492411105644544:profile|YummyGrasshopper29>! To enable caching while using a repo, you also need to specify a commit (as the repo might change, which would invalidate the cache). We will add a warning about this in the near future.
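
A minimal sketch of what that could look like, based on the add_function_step call from the question; the commit value is a placeholder, and repo_commit is assumed to be the argument used to pin it:

pipe.add_function_step(
    name='download_data',
    function=download_data_step,
    function_kwargs=dict(
        dataset_version='${pipeline.dataset_version}',
        file_name='${pipeline.file_name}',
    ),
    function_return=['data_path'],
    cache_executed_step=True,
    repo=repo,
    repo_branch=repo_branch,
    # Pin the exact commit so the step definition is stable and can be
    # matched against the cache (placeholder value).
    repo_commit='<commit-sha>',
    working_dir=working_dir,
)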
Regarding the imports: we are aware that there are some problems when executing the pipeline remotely as described. At the moment, appending to sys.path is one of the only solutions (other than making utils a package on your local machine so it can be imported from anywhere). We will look into this as well ASAP.
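
If it helps, here is a minimal sketch of that sys.path workaround, assuming the directory layout from the question with pipeline_controller.py next to the tasks folder (the exact path to append depends on how the import is written in task1.py):

# At the top of pipeline_controller.py, before building the pipeline:
import os
import sys

# Make the 'tasks' folder importable so 'utils.some_utils_functions'
# resolves when the steps run on the local machine.
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'tasks'))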

  
  
Posted 3 months ago

Thanks a lot! I do not have a problem executing the pipeline remotely; I have a problem executing it locally.

  
  
Posted 3 months ago

@<1702492411105644544:profile|YummyGrasshopper29> you could try adding the directory you start the pipeline from to the Python path. Then you would run the pipeline like this:

 PYTHONPATH="${PYTHONPATH}:/path/to/pipeline_dir" python my_pipeline.py
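
An equivalent form, assuming a POSIX shell, exports the variable once and then starts the controller:

export PYTHONPATH="${PYTHONPATH}:/path/to/pipeline_dir"
python my_pipeline.py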
  
  
Posted 3 months ago