SuccessfulKoala55 It gives the correct name, that is running_best_pth!
Maybe displaying 9 or 10 by default would be enough, plus a clearly visible, thick scrollbar on the right
I think it would be intuitive to have an exact name, or to introduce another parameter that regulates whether name is a regex.
And when it is a regex, it could return all matched models (e.g., as a list) rather than only the last one
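A minimal sketch of what that could look like, using ClearML's Model.query_models as the base; the name_is_regex flag is hypothetical and does not exist today:

```python
from clearml import Model

# `model_name` follows the current interface; the commented-out flag
# is the proposed (hypothetical) parameter from this message.
models = Model.query_models(
    project_name='demo',
    model_name='running_best.*',
    # name_is_regex=True,  # hypothetical: treat model_name as a regex
)
# With the proposed flag, `models` would contain every match,
# not just the last one.
```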
You should add the wrong key to task_overrides, not the wrong parameter to add_step
but at that point it hadn't actually added any steps. Maybe failed pipelines with zero steps count as completed, if the failure happens during the add_step stage for the very first step because task_overrides contains invalid keys
try add_step(..., task_overrides={'project_name': 'my-awesome-project', ...})
But at the same time, it contains some keys that cannot be modified with task_overrides, for example project_name
I specifically set it to empty with export_data['script']['requirements'] = {} in order to avoid overhead during launch. I have everything installed inside the container
It doesn't install anything with pip during launch; I'm assuming it should take everything from the container itself (otherwise there would be a huge overhead). It simply fails trying to import things in the script
File "preprocess.py", line 4, in <module> from easydict import EasyDict as edict ModuleNotFoundError: No module named 'easydict'
I have a base task for each pipeline step. When I initialize a pipeline, for each step I clone the corresponding task, modify it, and add it as a step. Tasks are launched from the pipeline, not the CLI. I'm absolutely sure the docker argument is not empty (I specify it with export_data['container']['image'] = 'registry.gitlab.com/cherrylabs/ml/clearml-demo:clearml', and it shows in the Web UI)
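A minimal sketch of that clone-modify-add flow, with placeholder ids (I'm assuming the Task.clone / export_task / update_task combination described above):

```python
from clearml import Task, PipelineController

pipe = PipelineController(name='demo-pipeline', project='demo', version='1.0.0')

# Clone the base task for this step and edit its definition in place.
step_task = Task.clone(source_task='<base-task-id>', name='preprocess')  # placeholder id
export_data = step_task.export_task()
export_data['script']['requirements'] = {}  # skip pip installs at launch
export_data['container']['image'] = 'registry.gitlab.com/cherrylabs/ml/clearml-demo:clearml'
step_task.update_task(export_data)

# Add the modified clone as a pipeline step.
pipe.add_step(name='preprocess', base_task_id=step_task.id)
```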
When I launch tasks with a pipeline, they keep complaining about missing pip packages. I run it inside a docker container, and I'm sure these packages are present inside it (when I launch the container locally, run python3, and import them, it works like a charm). Any ideas how to fix this?
I don't think so. It is solved by installing openssh-client in the docker image or by adding a deploy token to the cloning URL in the Web UI
for HTTPS cloning, a deploy token is needed
(this is an answer to the previous message)
this is so cursed, it's 10:30 pm
in order to work with SSH cloning, one has to manually install openssh-client in the docker image, it seems
In short, what helped is gitlab+deploy-token in the GitLab URL
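For reference, the shape of a GitLab deploy-token clone URL (the id, token, group, and repo are placeholders):

```
https://gitlab+deploy-token-<id>:<token>@gitlab.com/<group>/<repo>.git
```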
AnxiousSeal95 We can make a nested pipeline, right? Like if the top pipeline calls add_step to create steps from tasks, and then we decompose any single step further and create a sub-pipeline from decorators there. We should be able to do that, because PipelineController is itself a task, right?
Also, is there a way to unfold such a nested pipeline into a flat pipeline? So that only a single pipeline task is created, and it draws a single detailed DAG in the PLOTS tab?
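Roughly what I have in mind, as a sketch (placeholder ids; assuming add_step accepts the sub-pipeline's own task id, since a controller is itself a task):

```python
from clearml import PipelineController

pipe = PipelineController(name='parent-pipeline', project='demo', version='1.0.0')

# Regular step created from an existing task.
pipe.add_step(
    name='train',
    base_task_id='<train-task-id>',  # placeholder
)

# Because a pipeline controller is itself a task, a sub-pipeline that was
# already run once (e.g. built with PipelineDecorator) could be added as
# a step by its own task id:
pipe.add_step(
    name='sub_pipeline',
    base_task_id='<sub-pipeline-task-id>',  # placeholder
    parents=['train'],
)

pipe.start()
```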
Very nice for small pipelines (where every step could be put into a function in a single repository)
There are already some questions in this channel regarding Pipeline v2. Is there a tutorial, changelog, or examples I can refer to?
I launch everything in docker mode, and since it builds an image on every run, it builds the default nvidia/cuda:10.1-cudnn7-runtime-ubuntu18.04 image, which incurs heavy overhead. What if I want to give it my custom lightweight image instead, the same way I do for all individual tasks?
of course, I use custom images all the time; the question was how to do it for a pipeline 😆 setting private attributes directly doesn't look like good practice
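For context, the private-attribute workaround mentioned above looks roughly like this (a sketch, assuming a clearml version where set_base_docker accepts docker_image; the image name is a placeholder):

```python
from clearml import PipelineController

pipe = PipelineController(name='demo-pipeline', project='demo', version='1.0.0')

# Workaround sketch: the controller wraps a regular Task, so its container
# image can be set on that task. Reaching into the private `_task`
# attribute is exactly the "not good practice" part.
pipe._task.set_base_docker(docker_image='my-registry/lightweight:latest')
```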