AgitatedDove14
```
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
error: Could not fetch origin
Repository cloning failed: Command '['git', 'fetch', '--all', '--recurse-submodules']' returned non-zero exit status 1.
clearml_agent: ERROR: Failed cloning repository.
- Make sure you pushed the requested commit:
(repository='git@...', branch='main', commit_id='...', tag='', docker_cmd='registry.gitlab.com/...:...', en...
```
To work with SSH cloning, it looks like one has to manually install openssh-client in the Docker image.
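As an alternative to rebuilding the image, a minimal sketch, assuming a clearml version that supports docker_setup_bash_script and an agent running in Docker mode (the project and task names are hypothetical):

```python
from clearml import Task

task = Task.init(project_name="demo", task_name="ssh-clone-test")  # hypothetical names
# docker_setup_bash_script runs inside the container before the task starts,
# so openssh-client can be installed without baking it into the image
task.set_base_docker(
    docker_image="registry.gitlab.com/cherrylabs/ml/clearml-demo:clearml",
    docker_setup_bash_script=[
        "apt-get update",
        "apt-get install -y openssh-client",
    ],
)
```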
AgitatedDove14 clearml 1.1.1
Yeah, of course it is in draft mode (Task.import_task creates a task in draft mode; it is the lower task in the screenshot)
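For reference, a minimal sketch of how such a draft task comes about (the project and task names are hypothetical):

```python
from clearml import Task

# export an existing task's definition and re-import it;
# import_task creates the new task in draft mode
exported = Task.get_task(project_name="demo", task_name="step-base").export_task()
draft = Task.import_task(exported)  # stays a draft until enqueued
```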
The pipeline launches on the server anyway (it appears in the web UI)
I initialize tasks not as functions, but as scripts from different repositories, with different images
AgitatedDove14 by task you mean the training task or the separate task corresponding to the model itself? The former won't work since I don't want to delete the training task, only the models
You are right, I had [None] as parents in one of the tasks. Now this error is gone
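For anyone hitting the same error, a hedged sketch of the distinction (all names hypothetical): the first step should omit parents entirely rather than pass [None]:

```python
from clearml import PipelineController

pipe = PipelineController(name="demo-pipeline", project="demo", version="0.1")
# first step: omit the parents argument entirely (not parents=[None])
pipe.add_step(name="prepare", base_task_project="demo", base_task_name="prepare-base")
# downstream steps reference their parents by step name
pipe.add_step(name="train", parents=["prepare"],
              base_task_project="demo", base_task_name="train-base")
```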
CostlyOstrich36 hi! Yes, as I expected, it doesn't see any files unless I call add_files first. But add_files has no output_url parameter and tries to upload to the default place. This returns a 413 Request Entity Too Large error because there are too many files, so using the default location is not an option. Could you please help with this?
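For context, a sketch of the flow in question, assuming the storage target can be chosen at upload time via output_url (the project, dataset, path, and bucket names are hypothetical):

```python
from clearml import Dataset

# add_files() only registers local files; nothing is uploaded yet
ds = Dataset.create(dataset_project="demo", dataset_name="my-dataset")
ds.add_files(path="/data/my_files")
# the destination is picked when uploading, e.g. an S3 bucket instead of
# the server's default file storage
ds.upload(output_url="s3://my-bucket/clearml-datasets")
ds.finalize()
```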
The lower task is created during import_task; the upper one during the actual execution of the step, several minutes later
I have a base task for each pipeline step. When I initialize a pipeline, for each step I clone the corresponding task, modify it, and add it as a step. Tasks are launched from a pipeline, not the CLI. I'm absolutely sure the docker argument is not empty (I specify it with export_data['container']['image'] = 'registry.gitlab.com/cherrylabs/ml/clearml-demo:clearml', and it shows in the Web UI)
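A hedged sketch of that setup using add_step with task_overrides, assuming dot-path override keys such as container.image are accepted (project and task names are hypothetical):

```python
from clearml import PipelineController

pipe = PipelineController(name="demo-pipeline", project="demo", version="0.1")
pipe.add_step(
    name="train",
    base_task_project="demo",
    base_task_name="train-base",
    # task_overrides patches fields of the cloned task before it is enqueued
    task_overrides={
        "container.image": "registry.gitlab.com/cherrylabs/ml/clearml-demo:clearml",
    },
)
pipe.start()
```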
CostlyOstrich36 Yes, I'm self-deployed, and the company I want to share it with is also self-deployed
AnxiousSeal95 We can make a nested pipeline, right? Say the top pipeline calls add_step to create steps from tasks, and then we decompose any single step further and create a sub-pipeline from decorators there. We should be able to do that, because PipelineController is itself a task, right?
Also, is there a way to unfold such a nested pipeline into a flat pipeline, so that only a single pipeline task is created and it draws a single detailed DAG in the PLOTS tab?
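A hedged sketch of the nesting idea, relying only on the fact that a controller run is itself a task (all names hypothetical; whether the agent resolves such a step cleanly is exactly the open question):

```python
from clearml import PipelineController, Task

# a previously executed sub-pipeline controller is just a task,
# so it can be cloned into the top pipeline as an ordinary step
sub = Task.get_task(project_name="demo", task_name="sub-pipeline")
top = PipelineController(name="top-pipeline", project="demo", version="0.1")
top.add_step(name="sub_pipeline", base_task_id=sub.id)
top.add_step(name="report", parents=["sub_pipeline"],
             base_task_project="demo", base_task_name="report-base")
top.start()
```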
SmugDolphin23 could you please give me a link to it? I can't find it on github... Here I see only one comment
None
Very nice for small pipelines (where every step can be put into a function in a single repository)
But at the same time, it contains some keys that cannot be modified with task_overrides, for example project_name
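Where an override key isn't accepted, one workaround sketch is to clone the base task and modify it directly before adding it as a step (names hypothetical; assumes Task.clone and move_to_project behave as in recent clearml versions):

```python
from clearml import Task

base = Task.get_task(project_name="demo", task_name="step-base")
cloned = Task.clone(source_task=base, name="step-copy")
# the project is not an override key, but a cloned task can be moved directly
cloned.move_to_project(new_project_name="demo/steps")
```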
SparklingElephant70 in the Web UI, under Execution / SCRIPT PATH
CostlyOstrich36 I don't know, I need to share it to see it. How do I share it?
CostlyOstrich36 thank you for the answer! Maybe I can just delete old models along with the corresponding tasks; that seems easier
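A hedged sketch of that cleanup, assuming Task.delete can remove the task's models along with it (the project name and status filter are hypothetical):

```python
from clearml import Task

# find finished tasks in the project and delete them together with their models
old_tasks = Task.get_tasks(project_name="demo",
                           task_filter={"status": ["completed"]})
for t in old_tasks:
    t.delete(delete_artifacts_and_models=True,
             skip_models_used_by_other_tasks=True)
```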
What exactly do we need to copy? I believe we have already copied everything, but it keeps throwing a "Fetch experiment failed" error
It fails during the add_step stage for the very first step, because task_overrides contains invalid keys
I think it would be intuitive to have an exact name, or to introduce another parameter that regulates whether name is a regex.
And when it is a regex, it could return all matched models (e.g. as a list) rather than only the last one
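In the meantime, a hedged workaround sketch, assuming the lookup treats model_name as a regex (as with Model.query_models; the names are hypothetical): anchoring and escaping the pattern emulates an exact match, and query_models already returns all matches as a list:

```python
import re
from clearml import Model

name = "resnet50-final"  # hypothetical model name
# anchor + escape the pattern so the regex matches the exact name only
models = Model.query_models(project_name="demo",
                            model_name=f"^{re.escape(name)}$")
```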