Hi ObedientDolphin41
However, all of the pipeline's tasks are run on the same queue. Could I be missing something?
The pipeline Task itself is running on a dedicated queue (meaning agent/s), usually because the pipeline logic is mostly idling, whereas the components themselves are doing the actual compute.
Specifically, you can control the pipeline logic queue with pipeline_execution_queue
https://github.com/allegroai/clearml/blob/7016138c849a4f8d0b4d296b319e0b23a1b7bd9e/clearml/automation/controller.py#L3593
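For reference, a minimal sketch of how the two settings play together with the decorator interface (the queue names, project and the toy step here are just placeholders, not something from your setup):
```
from clearml.automation.controller import PipelineDecorator


# each component can go to its own queue ("gpu_queue" is a placeholder name)
@PipelineDecorator.component(return_values=["result"], execution_queue="gpu_queue")
def heavy_step(x):
    return x * 2


@PipelineDecorator.pipeline(
    name="queues example",                 # placeholder name / project
    project="examples",
    version="0.0.1",
    default_queue="default",               # fallback queue for components without execution_queue
    pipeline_execution_queue="services",   # queue for the (mostly idling) pipeline logic Task
)
def run_pipeline():
    print(heavy_step(21))


if __name__ == "__main__":
    run_pipeline()
```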
Does that make sense?
Hi AgitatedDove14
My bad, I worded my question wrong, I see; I meant the tasks of the pipeline's components (it shows that I'm a newbie 😅)
This does make perfect sense though! The problem seems to be just that the components themselves are run on the same queue as the pipeline logic, even though I configured it differently
This is run using the UI's 'Run' button without the 'Advanced configuration'
My bad, I worded my question wrong, I see,
LOL no worries 🙂
Any chance you have some "debug" leftover in the Pipeline code:
https://github.com/allegroai/clearml/blob/7016138c849a4f8d0b4d296b319e0b23a1b7bd9e/examples/pipeline/pipeline_from_decorator.py#L113
Maybe we should show a warning when it is being called, or ignore it when running via an agent ...
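Something like this is what I mean by ignoring it when running via an agent, a rough sketch only, the Task.running_locally() guard is just one way to do it and the names here are placeholders:
```
from clearml import Task
from clearml.automation.controller import PipelineDecorator


@PipelineDecorator.component(return_values=["y"], execution_queue="default")
def step(x):
    return x + 1


@PipelineDecorator.pipeline(name="guard example", project="examples", version="0.0.1")
def run_pipeline():
    print(step(1))


if __name__ == "__main__":
    # enable the local debug runner only when the script is launched manually;
    # when the cloned Task is executed by a clearml-agent the guard is False,
    # run_locally() is skipped, and the components are enqueued to their
    # configured queues instead of running as subprocesses on the agent machine
    if Task.running_locally():
        PipelineDecorator.run_locally()

    run_pipeline()
```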
Oh yup, that seems very possible since I run it with the run_locally()
and then clone this task in the UI
This workflow, however, is the only way I have found to easily fix my previous 'Module not found' errors
This workflow, however, is the only way I have found to easily fix my previous 'Module not found' errors
Hmm okay, makes sense.
Did you try to set these?
or even hack the sys.path with something like: import sys, os; sys.path.insert(0, os.path.abspath(os.path.dirname(__file__) + "/../"))
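i.e. something like this at the very top of the pipeline script, before importing your own modules (just a sketch, the "/.." offset assumes the script sits one level below the repository root, adjust to your layout):
```
import os
import sys

# make the repository root importable so the local packages also resolve
# inside the pipeline components (adjust the relative path to your layout)
sys.path.insert(0, os.path.abspath(os.path.dirname(__file__) + "/.."))

# ...and only then import your own modules, e.g. `from my_package import utils`
```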
Not yet, working on running the autoscaler for now, and picking this up again later 🙂