Hi,
When I try to execute a pipeline remotely (in a Docker container, triggered from the ClearML UI), I get the error '"Node '{}', base_task_id is empty".format(node.name)' if the pipeline itself and the pipeline component live in different files. The issue does not occur when I run the pipeline locally. Here is my setup:
File with the pipeline:
from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes
@PipelineDecorator.pipeline(name="testagent", project="examples", version="0.0.1")
def executing_pipeline(param_1):
    from clearml import Task 
    from src.testagentcomponent import step_one
    task = Task.current_task()
    task.execute_remotely(queue_name='default')
    b = step_one(param_1)
def main():
    PipelineDecorator.run_locally()
    executing_pipeline(
        param_1="asd"
    )
if __name__ == "__main__":
    main()
File with the component:
from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes
@PipelineDecorator.component(return_values=["n"], cache=True, task_type=TaskTypes.data_processing)
def step_one(param: str):
    from clearml import Task 
    task = Task.current_task()
    task.execute_remotely(queue_name='default')
    print("step_one", param)
    return 123
I have tried passing task.id from the pipeline to the component and, instead of Task.current_task(), used the following variations in the component, but got the same error:
- Task.init(base_task_id=task_id)
- Task.get_task(task_id=task_id)
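For reference, here is roughly what one of these variations looked like inside the component (task_id is a parameter I added myself to pass the pipeline task's id through):

from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes

@PipelineDecorator.component(return_values=["n"], cache=True, task_type=TaskTypes.data_processing)
def step_one(param: str, task_id: str):
    from clearml import Task
    # look up the task by the id passed from the pipeline instead of Task.current_task()
    task = Task.get_task(task_id=task_id)
    task.execute_remotely(queue_name="default")
    print("step_one", param)
    return 123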
When I move the component into the same file as the pipeline, the issue goes away. But to keep my code organized I would prefer to keep the pipeline and its components in separate files. Is there a way to do so?
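For clarity, the layout I am aiming for is roughly this (file names are just how I have it locally):

pipeline.py                  # executing_pipeline + main(), decorated with @PipelineDecorator.pipeline
src/
    testagentcomponent.py    # step_one, decorated with @PipelineDecorator.component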
And another question: if I remove Task.execute_remotely() from the component, I get the error 'Node 'step_one' missing execution queue, no default queue defined and no specific node queue defined'. Is there a way to define, from the ClearML UI, which queue should be used for each pipeline node instead?
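In case it is relevant: as far as I understand from the docs, a queue can also be set in code, either as a default for all steps or per component, roughly like below (a sketch based on my reading, not verified; the execution_queue parameter is an assumption on my side). But I would like to be able to pick the queue from the UI instead.

from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes

# default queue for all pipeline steps
PipelineDecorator.set_default_execution_queue("default")

# or pin a specific queue for a single component
@PipelineDecorator.component(return_values=["n"], cache=True,
                             task_type=TaskTypes.data_processing,
                             execution_queue="default")
def step_one(param: str):
    print("step_one", param)
    return 123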
Thank you in advance!