```python
from clearml import PipelineController

# We will use the following function as an independent pipeline component step.
# Notice: all package imports inside the function will be automatically logged as
# required packages for the pipeline execution step.
def step_one(pickle_data_url):
    # make sure we have scikit-learn for this step, we need it to unpickle the object
    import sklearn  # noqa
    import pickle
    import pandas as pd
    from clearml import StorageManager

    pickle_da...
```
ClearML Task: created new task id=bda6736172df497a83290b2927df28a2
ClearML results page:
2022-06-30 12:35:16,119 - clearml.Task - INFO - No repository found, storing script code instead
ClearML pipeline page:
2022-06-30 12:35:25,434 - clearml.automation.controller - INFO - Node "step_two" missing parent reference, adding: {'step_one'}
2022-06-30 12:35:25,436 - clearml.automation.controller - INFO - Node "step_three" missing parent reference, adding: {'step_two'}
2022-06-30 12:35...
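For reference, a minimal sketch of how a function like `step_one` gets wired into a controller, modeled on ClearML's pipeline-from-functions example. The pipeline/project names and the `step_two` stub are assumptions; the "missing parent reference, adding" log lines above are the controller inferring step dependencies from `${step_one...}` references:

```python
def step_one(pickle_data_url):
    # Stub standing in for the real step above.
    return pickle_data_url

def step_two(data_frame):
    # Hypothetical second step.
    return data_frame

def build_pipeline():
    # Sketch only -- actually starting the pipeline needs a ClearML server.
    from clearml import PipelineController

    pipe = PipelineController(name="pipeline demo", project="examples", version="1.0")
    pipe.add_function_step(
        name="step_one",
        function=step_one,
        function_kwargs=dict(pickle_data_url="${pipeline.url}"),
        function_return=["data_frame"],
    )
    # Referencing "${step_one.data_frame}" is what makes step_one an implicit
    # parent of step_two -- hence the "missing parent reference, adding" log line.
    pipe.add_function_step(
        name="step_two",
        function=step_two,
        function_kwargs=dict(data_frame="${step_one.data_frame}"),
        function_return=["processed_data"],
    )
    return pipe
```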
Still missing some info about how to make the worker actually pull the task from the queue.
thank you, guys! I think now it works! Amazing step-by-step support! This is sublime!
for future ref., this is a summary of what I have done:
- create a project on the ClearML web UI
- create a queue on the ClearML web UI
- run an agent: `/homes/yosefhaie/.conda/envs/devops/bin/clearml-agent daemon --create-queue --queue <queue-name>`
- use this test script:

```python
from clearml import PipelineController

# We will use the following function as an independent pipeline component step.
# Notice: all package imports ins...
```
I created a worker that listens to the queue, but the worker doesn't pull the tasks (they are pending)
ran `clearml-agent daemon --queue rudolf --detached`
Indeed locally it does work if I run `pipe.start_locally`
Missing the last piece of the puzzle I believe.
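The missing piece is usually that `pipe.start(...)` only *enqueues* work: tasks sit pending until a `clearml-agent daemon --queue <name>` process pulls and executes them, while `start_locally()` runs everything in the current process. A sketch of the two modes (queue names are assumptions):

```python
def run(pipe, local=True):
    # `pipe` is a configured clearml PipelineController (sketch, not executed here).
    if local:
        # Debug mode: the controller and every step run in this process,
        # so no agent is needed.
        pipe.start_locally(run_pipeline_steps_locally=True)
    else:
        # Remote mode: steps are enqueued on "default" and the controller on
        # "services" (assumed queue names). They stay *pending* until an agent
        # started with `clearml-agent daemon --queue default` pulls them.
        pipe.set_default_execution_queue("default")
        pipe.start(queue="services")
```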
so it's not auto-generated. What's the spec of this conf file?
I see! SuccessfulKoala55 what is the right way to configure it? Via vim, or is there a command-line tool?
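For the record: `clearml.conf` is not auto-generated by the SDK. Running the `clearml-init` CLI prompts for credentials (created in the web UI) and writes the file for you, or you can edit it by hand. A minimal sketch of the `api` section -- the server URLs shown are the hosted defaults, and the keys are placeholders:

```
api {
    web_server: https://app.clear.ml
    api_server: https://api.clear.ml
    files_server: https://files.clear.ml
    credentials {
        "access_key" = "YOUR_ACCESS_KEY"
        "secret_key" = "YOUR_SECRET_KEY"
    }
}
```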
there was an issue with the layout of my simple git example, so if you do the above it should work
TimelyPenguin76
thanks again... I have tried what you suggested. It does inject the dir, but it doesn't clone the whole content.
Running `os.system("tree")` from within the `demo.py` pipeline step function:

```
└── test-demo-clearml.git
    ├── demo.py
    ├── __init__.py
    ├── local_dir
    └── step_one_task.py
```
vs.
from the actual tree (under `~/clearml_using_local_lib`):

```
tree
.
├── demo.py
├── __init__.py
└── local_dir
    ├── hello.py
    ├── __init__.py
    └── __pycache__
        └── __i...
```
Will make it as a local repo. Thank you so much!
would love to hear your thoughts
SweetBadger76 thanks,
The only thing I am not certain about is: what does "agent" mean in the ClearML world? Is it the queue manager or the pipeline?
This is extremely helpful! I decided to go with pipeline from functions.
Everything looks great, but the tasks are pending. Am I missing some executor or something like that?
we have a diff name for this file in the repo
based on what requirements.txt manifest?
or how can I add some libraries I'd like it to pip install in the new environment
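One way (a sketch, not the only option): `add_function_step` takes a `packages` argument listing pip requirements the agent should install for that step, on top of what it detects from the imports. The step and function names below are hypothetical:

```python
def add_training_step(pipe):
    # `pipe` is a clearml PipelineController (sketch, not executed here).
    def train(data):
        # Hypothetical step body; pandas would also be auto-detected
        # because it is imported inside the function.
        import pandas as pd  # noqa
        return data

    pipe.add_function_step(
        name="train",
        function=train,
        function_kwargs=dict(data="${pipeline.data}"),
        # Extra pip requirements for the step's fresh environment:
        packages=["pandas>=1.3", "scikit-learn"],
    )
```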
the only way any computer could figure this out is by running it
I have a simple pipeline that works
nice! Thank you
```python
if a == True:
    import torch
else:
    import tensorflow
```
locally and remotely
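Right -- a conditional import like that is only resolvable at runtime, so a static scan of the script cannot log it. A sketch of one workaround: declare the requirement explicitly with `Task.add_requirements()` before `Task.init()` (the package choice mirrors the snippet above; version pinning omitted):

```python
def declare_backend_requirement(use_torch):
    # Sketch: tell ClearML about a package the static import scan cannot see.
    from clearml import Task

    # Must run *before* Task.init() so it is recorded with the task.
    if use_torch:
        Task.add_requirements("torch")
    else:
        Task.add_requirements("tensorflow")
```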
from clearml import PipelineController
so it looks for requirements.txt?
how is this possible?