from clearml import PipelineController
or how can I add some libraries that I'd like it to pip install in the new environment?
there was an issue with the layout of my simple git example. So if you do the above it should work
I have a simple pipeline that works
now the question is how to make the production setup work
the only way any computer could figure this out is by running it
we have a diff name for this file in the repo
so it's not auto-generated. What's the spec of this conf file?
Indeed locally it does work if I run pipe.start_locally
I created a worker that listens to the queue, but the worker doesn't pull the tasks (they are pending)
ran clearml-agent daemon --queue rudolf --detached
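As an aside, the queue/worker relationship can be modeled with a toy sketch (plain Python, not ClearML code): a task put on a queue stays pending until a worker listening on that exact queue pulls it, which is why the agent's `--queue` name must match the queue the tasks were enqueued to.

```python
import queue
import threading

# Toy model: tasks enqueued to a named queue stay "pending" until a
# worker subscribed to that same queue pulls and runs them.
task_queue = queue.Queue()
results = []

def worker():
    while True:
        task = task_queue.get()
        if task is None:          # sentinel: shut the worker down
            break
        results.append(f"ran {task}")
        task_queue.task_done()

t = threading.Thread(target=worker)
t.start()
for name in ("step_one", "step_two"):
    task_queue.put(name)       # tasks are now "pending"
task_queue.put(None)
t.join()
print(results)  # ['ran step_one', 'ran step_two']
```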
TimelyPenguin76
thanks again... I have tried what you suggested. It does inject the dir, but it doesn't clone the whole content.
Running `os.system("tree")` from within the demo.py pipeline step function prints:

```
└── test-demo-clearml.git
    ├── demo.py
    ├── __init__.py
    ├── local_dir
    └── step_one_task.py
```

vs. the actual tree (under ~/clearml_using_local_lib):

```
.
├── demo.py
├── __init__.py
└── local_dir
    ├── hello.py
    ├── __init__.py
    └── __pycache__
        └── __i...
```
Still missing some info about how to make the worker actually pull the task from the queue.
nice! Thank you
the question is: how does ClearML know to create the env, and which files does it copy into /homes/yossi/.clearml/venvs-builds/3.7/ for the task?
```python
if a == True:
    import torch
else:
    import tensorflow
```
based on what requirements.txt manifest?
how is this possible?
locally and remotely
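To illustrate how both branches can be picked up without executing the script: a static scan of the code's AST sees every import statement regardless of which if/else branch it sits in. This is only a toy sketch of the idea, not ClearML's actual analyzer:

```python
import ast

# Toy sketch: walk the AST of a script and collect every imported
# top-level package, including imports inside if/else branches.
source = """
if a == True:
    import torch
else:
    import tensorflow
"""

def imported_packages(code):
    tree = ast.parse(code)
    packages = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            packages.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            packages.add(node.module.split(".")[0])
    return packages

print(sorted(imported_packages(source)))  # ['tensorflow', 'torch']
```

Note the code is only parsed, never run, so both `torch` and `tensorflow` are recorded even though only one branch would execute.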
This is extremely helpful! I decided to go with pipeline from functions.
Everything looks great, but the tasks are pending. Am I missing some executor or something like that?
would love to hear your thoughts
```python
from clearml import PipelineController

# We will use the following function as an independent pipeline component step;
# notice all package imports inside the function will be automatically logged as
# required packages for the pipeline execution step
def step_one(pickle_data_url):
    # make sure we have scikit-learn for this step, we need it to unpickle the object
    import sklearn  # noqa
    import pickle
    import pandas as pd
    from clearml import StorageManager
    pickle_da...
```
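For the "tasks are pending" issue, the part that usually matters is which queue the steps are sent to. A minimal, hedged sketch of the wiring (the queue name "rudolf" matches the agent command earlier in the thread; the project and pipeline names are illustrative):

```python
from clearml import PipelineController

# Sketch: steps are enqueued on the queue the agent listens to
# ("rudolf" here); if no agent serves that queue, they stay pending.
pipe = PipelineController(name="demo-pipeline", project="demo", version="1.0")
pipe.set_default_execution_queue("rudolf")
# ... pipe.add_function_step(...) calls for each step ...
pipe.start_locally(run_pipeline_steps_locally=False)  # steps go to the queue
```

With `run_pipeline_steps_locally=False`, the controller logic runs locally but each step is enqueued for an agent to pick up.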
thank you, guys! I think now it works! Amazing step-by-step support! This is sublime!
for future ref., this is a summary of what I have done:
1. create a project in the ClearML web UI
2. create a queue in the ClearML web UI
3. run an agent: `/homes/yosefhaie/.conda/envs/devops/bin/clearml-agent daemon --create-queue --queue <queue-name>`
4. use this test script:

```python
from clearml import PipelineController

# We will use the following function as an independent pipeline component step;
# notice all package imports ins...
```
PIP Mode
By default, ClearML Agent works in PIP Mode, in which it uses pip as the package manager. When ClearML runs, it will create a virtual environment (or reuse an existing one). Task dependencies (Python packages) will be installed in the virtual environment.
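What the agent does in PIP Mode can be roughly approximated with the standard library (the paths are illustrative; the real agent caches environments under ~/.clearml/venvs-builds/):

```python
import os
import tempfile
import venv

# Rough model of PIP Mode: build a virtual environment for the task,
# then pip-install the task's recorded requirements into it.
build_dir = os.path.join(tempfile.mkdtemp(), "venvs-builds", "3.7")
venv.EnvBuilder(with_pip=False).create(build_dir)  # with_pip=False just keeps the sketch fast

# the agent would then run something like:
#   <build_dir>/bin/pip install -r <task requirements>
print(os.path.isdir(build_dir))  # True
```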
how does it know what the dependencies of a task are?
if this was possible we wouldn't need pip
I think I am missing something
SweetBadger76 thanks,
The only thing I am not certain about: what does "agent" mean in the ClearML world? Is it the queue manager or the pipeline?
`Task.add_requirements`
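This also answers the earlier question about adding extra libraries for the agent to pip install. A hedged sketch of the usage (the project/task names and pinned version are illustrative, and the call must come before `Task.init`):

```python
from clearml import Task

# Call add_requirements *before* Task.init so the agent picks these up
# when it builds the remote environment for the task.
Task.add_requirements("tensorflow")        # latest available version
Task.add_requirements("torch", "1.13.1")   # pinned version (illustrative)
task = Task.init(project_name="demo", task_name="requirements-demo")
```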