@<1626028578648887296:profile|FreshFly37> can you also share the logs of the task? It may give an idea.
I have attached a screenshot of the logs earlier
@<1626028578648887296:profile|FreshFly37> how are you running this locally in the first place? If you are running pipeline.py with cwd as ev_xx_detection/clearml, then I would not expect from ev_xx_detection.clearml import constants to work (for example), but import constants directly would work (as constants.py is in the same directory as pipeline.py). The reason your remote run doesn't work is basically this: the cwd is ev_xx_detection/clearml and ev_xx_detection.clearml.constants is imported, but the module that should actually be imported is constants.
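To make that concrete, here is a minimal illustration; the directory names are taken from the truncated paths in this thread and the location of constants.py is assumed:

```python
# Assumed layout (names taken from the truncated paths above):
#
#   ev_xx_detection/
#       clearml/
#           pipeline.py
#           constants.py
#
# Running `python3 pipeline.py` from ev_xx_detection/clearml puts that
# directory (not the repo root) at the front of sys.path, so only the flat
# module name resolves:

import constants                                  # works: constants.py sits next to pipeline.py
# from ev_xx_detection.clearml import constants   # fails: ev_xx_detection is not on sys.path
```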
@<1523701435869433856:profile|SmugDolphin23> Sure, thank you for the suggestion. I'll adjust the imports as you mentioned, execute the pipeline, and check the functionality.
Locally I'm running it using python3 pipeline.py, and I used pipe.start_locally(run_pipeline_steps_locally=True) in the pipeline to initialize it; it's working fine.
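For reference, a minimal sketch of that local setup; the project, pipeline and step names below are placeholders, not taken from the actual code:

```python
from clearml import PipelineController

def create_dataset():
    # Placeholder step body; the real step lives elsewhere in the project.
    return "dataset_id"

pipe = PipelineController(name="ev_detection_pipeline", project="ev_detection")
pipe.add_function_step(name="create_dataset", function=create_dataset)

# Run the controller and every step inside the current process instead of a queue.
pipe.start_locally(run_pipeline_steps_locally=True)
```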
@<1523701435869433856:profile|SmugDolphin23> I tried another approach by placing pipeline.py in the root directory of the code and executing “python3 pipeline.py”, but still faced the same issue.
@<1523701435869433856:profile|SmugDolphin23> I tried the method you suggested and the pipeline still failed, as it couldn't find the modules. Could you please help me here?
I would like to describe the process I was following again:
- I created a queue and assigned 2 workers to it.
- In the pipeline.py file, I used pipe.start(queue="queue_remote") to start the pipeline and pipe.set_default_execution_queue('queue_remote') for the tasks (see the sketch after this list).
- With working_dir = ev_xxxx_xxtion/clearml I executed the code using python3 pipeline.py.
- The pipeline was initiated on queue "queue_remote" on worker 01, and the next tasks were initiated on queue "queue_remote" on worker 02, where they failed because the modules couldn't be found on worker 02.
@<1626028578648887296:profile|FreshFly37> I see that create_dataset doesn't have a repo set. Can you try setting it manually via the repo, repo_branch and repo_commit arguments of the add_function_step method?
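Something along these lines, continuing the placeholder sketch above (same pipe and create_dataset); the repository URL, branch and commit are made up and should point at the repo that actually contains the pipeline code:

```python
pipe.add_function_step(
    name="create_dataset",
    function=create_dataset,
    # Placeholders: use the real repository URL, branch and (optionally) commit
    # so the worker can clone the code before executing the step.
    repo="https://github.com/your-org/ev_xx_detection.git",
    repo_branch="main",
    repo_commit=None,  # or pin a specific commit hash
)
```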
Sure, I'll add those details and check. Thank you
Thank you @<1523701435869433856:profile|SmugDolphin23> It is working now after adding the repo details to each task. It seems we need to specify the repo details in each task so that the worker can pull the code and execute the tasks.