Answered
Hi, When I Try To Execute Pipeline Remotely (In A Docker Container, Triggered From Clearml Ui) I Get An Error '

Hi,
When I try to execute a pipeline remotely (in a Docker container, triggered from the ClearML UI), I get the error ' Node '{}', base_task_id is empty".format(node.name) ' if the pipeline itself and the pipeline component are in different files. This issue does not occur when I run the pipeline locally. Here is my setup:

File with the pipeline:

from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes

@PipelineDecorator.pipeline(name="testagent", project="examples", version="0.0.1")
def executing_pipeline(param_1):
    from clearml import Task 
    from src.testagentcomponent import step_one
    task = Task.current_task()
    Task.execute_remotely(task, queue_name='default')
    b = step_one(param_1)

def main():
    PipelineDecorator.run_locally()
    executing_pipeline(
        param_1="asd"
    )

if __name__ == "__main__":
    main()

File with the component:

from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes

@PipelineDecorator.component(return_values=["n"], cache=True, task_type=TaskTypes.data_processing)
def step_one(param: str):
    from clearml import Task 
    task = Task.current_task()
    Task.execute_remotely(task, queue_name='default')
    print("step_one", param)
    return 123

I have tried passing task.id from the pipeline to the component and, instead of Task.current_task() in the component, used the following variations, but got the same error:

  • Task.init(base_task_id=task_id)
  • Task.get_task(task_id=task_id)

When I move the component into the same file as the pipeline, the issue is resolved. But to keep my code organized I would prefer to have the pipeline and components in different files. Is there a way to do so?

And another question: if I remove Task.execute_remotely() from the component, I get the error 'Node 'step_one' missing execution queue, no default queue defined and no specific node queue defined'. Is there a way to define, from the ClearML UI, the queue that should be used for each pipeline node instead?
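For reference, a minimal sketch of pinning queues in code rather than through the UI; it assumes the execution_queue argument of PipelineDecorator.component and PipelineDecorator.set_default_execution_queue, and is only an illustration, not necessarily an answer to the UI part of the question:

from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes

# per-node queue: this component would run on the "default" queue (illustrative)
@PipelineDecorator.component(return_values=["n"], cache=True,
                             task_type=TaskTypes.data_processing,
                             execution_queue="default")
def step_one(param: str):
    print("step_one", param)
    return 123

def main():
    # fallback queue for any node that does not set its own execution_queue
    PipelineDecorator.set_default_execution_queue("default")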

Thank you in advance!

  
  
Posted 5 months ago

2 Answers


Hi @HarebrainedOstrich43
I think I understand what's going on: for the pipeline logic to be "aware" of the pipeline component, the component needs to be declared in the pipeline logic script file (or scope, if you will).
Try importing from src.testagentcomponent import step_one in the global scope of the pipeline script as well (not just inside the function).
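For illustration, a minimal sketch of the pipeline file with the component imported at module scope (the same file layout as above is assumed; the remote-execution calls from the original are left out for brevity):

from clearml.automation.controller import PipelineDecorator

# module-level import, so the pipeline logic can register the component
from src.testagentcomponent import step_one

@PipelineDecorator.pipeline(name="testagent", project="examples", version="0.0.1")
def executing_pipeline(param_1):
    # each component call is scheduled as its own pipeline node
    b = step_one(param_1)

def main():
    PipelineDecorator.run_locally()
    executing_pipeline(param_1="asd")

if __name__ == "__main__":
    main()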

  
  
Posted 5 months ago

Hi @AgitatedDove14, thank you very much, it solved my issues!

  
  
Posted 5 months ago