
Hey, I don't really understand why the ClearML worker needs to pull the repository where my pipeline (defined with decorators) is written, since apparently a temporary Python file (containing at least the code and imports for the executed component) seems to be uploaded upon creation of the task.
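For context, a minimal sketch of the kind of decorator-defined pipeline being discussed (the function names, project, and queue below are made up, not from this thread); each decorated function's code is captured into the Task that ClearML creates for that step:

```python
from clearml.automation.controller import PipelineDecorator

# Hypothetical component: the body of this function is what gets captured
# into the step's Task when the pipeline is created.
@PipelineDecorator.component(return_values=['processed'], cache=True)
def preprocess(text):
    return text.upper()

# Hypothetical controller: this is the script the agent executes, and by
# default the repository it lives in is recorded on the controller Task.
@PipelineDecorator.pipeline(name='Example Pipeline', project='Examples', version='0.0.1')
def run_pipeline(text='hello'):
    print(preprocess(text))

if __name__ == '__main__':
    # PipelineDecorator.run_locally()  # uncomment to debug without an agent
    run_pipeline()
```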

  
  
Posted 2 years ago

Answers 10


Okay, the force_store_standalone_script() call works.

  
  
Posted 2 years ago

Well, it's not working; this param seems to be meant to override the repo to pull, since it has a str type annotation anyway. ClearML still attempted to pull the repo.

  
  
Posted 2 years ago

Oh, I see, the pipeline controller itself (not the components) is the one with the repo.
To fix that, add the following at the top of the script:

```
from clearml import Task

Task.force_store_standalone_script()

@PipelineDecorator.pipeline(...)
```

That should do the trick.
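For reference, a minimal self-contained sketch of that setup (the component, names, and arguments below are placeholders, not from this thread): calling Task.force_store_standalone_script() at the top of the script makes ClearML store the controller as a standalone script rather than as a repository reference.

```python
from clearml import Task
from clearml.automation.controller import PipelineDecorator

# Store the controller as a standalone script instead of a repository
# reference; this must run before the pipeline is created.
Task.force_store_standalone_script()

@PipelineDecorator.component(return_values=['value'])
def make_value():
    return 42

@PipelineDecorator.pipeline(name='Standalone Example', project='Examples', version='0.0.1')
def controller():
    print(make_value())

if __name__ == '__main__':
    controller()
```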

  
  
Posted 2 years ago

And I can then override it by specifying a repo on one of the components?

  
  
Posted 2 years ago

Ah, thank you, I'll try that ASAP.

  
  
Posted 2 years ago

Hi FierceHamster54,
Are you saying the pipeline component is a standalone script?
If this is the case then you are correct, it should not need to; I think you can specify it in the decorator.
I think this might work 🤞
@PipelineDecorator.component(..., repo=False)
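If that flag behaves as suggested, a self-contained sketch of a component using it might look like this (the function, its arguments, and the return value are made up for illustration):

```python
from clearml import TaskTypes
from clearml.automation.controller import PipelineDecorator

# Hypothetical data-prep step; repo=False is the suggestion above, intended
# to keep this step from being linked to any repository so the agent only
# runs the code captured with the step's Task.
@PipelineDecorator.component(
    return_values=['dataset_id'],
    cache=True,
    task_type=TaskTypes.data_processing,
    repo=False,
)
def prepare_dataset(source_uri):
    # ... build the dataset from source_uri and return its id ...
    return 'dataset-id-placeholder'
```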

  
  
Posted 2 years ago

Well, aside from the obvious removal of the line PipelineDecorator.run_locally() on both our sides, the decorator arguments seem to be the same:
@PipelineDecorator.component( return_values=['dataset_id'], cache=True, task_type=TaskTypes.data_processing, execution_queue='Quad_VCPU_16GB', repo=False )
And my pipeline controller:
@PipelineDecorator.pipeline( name="VINZ Auto-Retrain", project="VINZ", version="0.0.1", pipeline_execution_queue="Quad_VCPU_16GB" )

  
  
Posted 2 years ago

(if, for instance, I wanna pull a yolov5 repo in the retraining component)
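Purely as an illustration (assuming the component's repo argument accepts a git URL, as its str annotation suggests; the URL and function below are not from this thread), that would look something like:

```python
from clearml.automation.controller import PipelineDecorator

# Hypothetical retraining step: point only this step's Task at an external
# repository (the public yolov5 repo, used here just as an example), while
# the rest of the pipeline stays standalone.
@PipelineDecorator.component(
    return_values=['weights_path'],
    execution_queue='Quad_VCPU_16GB',  # queue name reused from the thread
    repo='https://github.com/ultralytics/yolov5.git',
)
def retrain(dataset_id):
    # ... run the yolov5 training entry point against the dataset ...
    return 'best.pt'
```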

  
  
Posted 2 years ago

Ah nice, thanks ❤

  
  
Posted 2 years ago

This is odd; I was running the example code from:
https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py
It is stored inside a repo, but the steps that are created (i.e. checking the Task that is created) do not have any repo linked to them.
What's the difference?

  
  
Posted 2 years ago