Now the question is how to make the production stuff work.
The question is: how does ClearML know to create the env, and which files does it copy for the task to /homes/yossi/.clearml/venvs-builds/3.7/?
```python
from clearml import PipelineController

# We will use the following function as an independent pipeline component step.
# Notice: all package imports inside the function will be automatically logged as
# required packages for the pipeline execution step.
def step_one(pickle_data_url):
    # make sure we have scikit-learn for this step, we need it to unpickle the object
    import sklearn  # noqa
    import pickle
    import pandas as pd
    from clearml import StorageManager
    pickle_da...
```
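For context, a function like `step_one` is then registered on a `PipelineController` and the controller is started. A minimal sketch under assumptions not in the thread (the project name, pipeline name, and data URL below are placeholders):

```python
from clearml import PipelineController

# Placeholder project/pipeline names for illustration only.
pipe = PipelineController(name="pipeline demo", project="examples", version="0.0.1")

# Register the function as an independent pipeline step; the imports inside it
# (sklearn, pickle, pandas, clearml) are analyzed and logged as the step's
# required packages.
pipe.add_function_step(
    name="step_one",
    function=step_one,
    function_kwargs=dict(pickle_data_url="https://example.com/data.pkl"),  # placeholder URL
    function_return=["data_frame"],
)

# Run everything in the local process for debugging, or enqueue with pipe.start(queue=...).
pipe.start_locally(run_pipeline_steps_locally=True)
```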
Still missing some info about how to make the worker actually pull the task from the queue.
so it looks for requirements.txt?
from clearml import PipelineController
TimelyPenguin76
thanks again... I have tried what you suggested. It does inject the dir, but it doesn't clone the whole content.
os.system("tree") from within demo.py pipeline step function:
```
└── test-demo-clearml.git
    ├── demo.py
    ├── __init__.py
    ├── local_dir
    └── step_one_task.py
```
vs. the actual tree (under ~/clearml_using_local_lib):
```
tree
.
├── demo.py
├── __init__.py
└── local_dir
    ├── hello.py
    ├── __init__.py
    └── __pycache__
        └── __i...
```
based on what requirements.txt manifest?
locally and remotely
```
ClearML Task: created new task id=bda6736172df497a83290b2927df28a2
ClearML results page:
2022-06-30 12:35:16,119 - clearml.Task - INFO - No repository found, storing script code instead
ClearML pipeline page:
2022-06-30 12:35:25,434 - clearml.automation.controller - INFO - Node "step_two" missing parent reference, adding: {'step_one'}
2022-06-30 12:35:25,436 - clearml.automation.controller - INFO - Node "step_three" missing parent reference, adding: {'step_two'}
2022-06-30 12:35...
```
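Those "missing parent reference" lines are the controller inferring step order from the inputs each step consumes. If you want to state the ordering explicitly rather than rely on inference, a sketch using the `parents` argument (assuming the same `pipe` controller object and a `step_two` function, which are not shown in the thread):

```python
# Explicitly declare that step_two runs after step_one, instead of letting the
# controller infer the parent from the step's inputs.
pipe.add_function_step(
    name="step_two",
    function=step_two,                   # assumes a step_two function exists, as in the log above
    parents=["step_one"],
    function_return=["processed_data"],  # placeholder return name
)
```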
the only way any computer could figure this out is by running it
how is this possible?
```python
if a == True:
    import torch
else:
    import tensorflow
```
I have a simple pipeline that works
PIP Mode
By default, ClearML Agent works in PIP Mode, in which it uses pip as the package manager. When ClearML runs, it will create a virtual environment (or reuse an existing one; see the docs on virtual environment reuse). Task dependencies (Python packages) will be installed in the virtual environment.
how does it know what are the dependencies of a task?
if this was possible we wouldn't need pip
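One way to sidestep the import analysis entirely is to record a full pip freeze of the local environment on the task before Task.init, so the agent installs exactly that. A hedged sketch (the project and task names are placeholders):

```python
from clearml import Task

# Record the complete `pip freeze` of the local environment as the task's
# installed packages, instead of only the packages detected from the script's
# imports. Must be called before Task.init().
Task.force_requirements_env_freeze(force=True)

task = Task.init(project_name="examples", task_name="env freeze demo")  # placeholder names
```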
nice! Thank you
there was an issue with the layout of my simple git example. So if you do the above it should work
I think I am missing something
consider this case:
Task.add_requirements
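A minimal sketch of that approach for the conditional-import case above: call Task.add_requirements before Task.init so the packages are added to the task's requirements regardless of what static analysis sees (the version pin and task/project names below are illustrative):

```python
from clearml import Task

# Force-add packages to the task's requirements before Task.init(), so the agent
# installs them even if static import analysis misses them (e.g. imports hidden
# behind an if/else branch).
Task.add_requirements("torch")
Task.add_requirements("tensorflow", "2.11.0")  # version pin is optional; value here is illustrative

task = Task.init(project_name="examples", task_name="conditional imports demo")  # placeholder names
```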
it's an undecidable problem
I see! SuccessfulKoala55 what is the right way to configure it? Via vim, or is there any command-line tool?
would love to hear your thoughts
Indeed locally it does work if I run pipe.start_locally
This is extremely helpful! I decided to go with pipeline from functions.
Everything looks great, but the tasks are pending. Am I missing some executor or something like that?
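Pending tasks usually mean no agent is consuming the queue they were enqueued to. A hedged sketch of the two pieces involved (the queue names are assumptions, not from the thread):

```python
# 1) Enqueue the pipeline controller instead of running it locally.
pipe.start(queue="services")  # controller queue; the steps go to their own execution queue

# 2) On the machine that should execute the work, run a ClearML agent that
#    listens on the relevant queue, e.g.:
#    clearml-agent daemon --queue default
# Until an agent is consuming the queue, the tasks stay in "pending".
```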
so it's not auto-generated. What's the spec of this conf file?
Or how can I add some libraries I'd like it to pip install in the new environment?
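If the goal is to pin extra libraries for a specific step's environment from code rather than from the conf file, recent clearml versions accept a packages list when registering the step. A sketch under that assumption (the package pins are illustrative):

```python
# Explicitly list packages the agent should pip-install for this step, in
# addition to the automatically detected imports.
# (The `packages` argument is available in recent clearml releases.)
pipe.add_function_step(
    name="step_one",
    function=step_one,
    packages=["pandas>=1.3", "scikit-learn"],  # illustrative pins
)
```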