Is this the correct way to upload an artifact?
checkpoint.split('.')[0] is the name I want assigned to it, and the second argument is the path to the file.
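Roughly what I have, as a sketch (task is the current ClearML Task, and checkpoint is a filename like "best_model.ckpt" from my own code):

```python
def artifact_name(checkpoint: str) -> str:
    # the name I want the artifact assigned: filename minus the extension
    return checkpoint.split('.')[0]

def upload_checkpoint(task, checkpoint: str) -> None:
    # upload_artifact(name, artifact_object=<path>) attaches the file
    # to the Task under that name
    task.upload_artifact(artifact_name(checkpoint), artifact_object=checkpoint)
```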
Thus I wanted to pass the model id from the prior step to the next one,
but I can't seem to figure out a way to do something similar using a task in add_step.
VexedCat68 With add_step it is assumed the Task you are adding is self-contained (i.e. there is no "return object" to serialize). This means you can only pass arguments, or use the artifacts the Task (i.e. step) will recreate, which obviously requires you to know in advance what the step creates. Make sense?
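To illustrate, a minimal sketch (it assumes two pre-existing Tasks named "find latest model" and "train model" in a project "examples", and the parameter name General/producer_task_id is an assumption for this example):

```python
def build_pipeline():
    # deferred import so the sketch reads/imports without a ClearML server
    from clearml import PipelineController

    pipe = PipelineController(name="model-picker", project="examples", version="1.0.0")

    # step 1: a self-contained Task; whatever it produces must be an artifact
    pipe.add_step(
        name="find_model",
        base_task_project="examples",
        base_task_name="find latest model",
    )

    # step 2: there is no return object to wire up, so pass the producing
    # step's task id as a parameter and read the artifact inside the step
    pipe.add_step(
        name="train",
        parents=["find_model"],
        base_task_project="examples",
        base_task_name="train model",
        parameter_override={"General/producer_task_id": "${find_model.id}"},
    )
    return pipe
```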
Can you give a bit more info of how you want the pipeline built and where you want to insert/extract the task id? Also how is the model related? Is it the start of the pipeline?
I got to that conclusion too, I think, yeah. Basically, I can access them as artifacts.
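i.e. something like this inside the consuming step (a sketch; "model_id" is whatever artifact name the previous step actually used):

```python
def fetch_model_id(producer_task_id: str) -> str:
    # deferred import: sketch only, needs a ClearML server to actually run
    from clearml import Task

    producer = Task.get_task(task_id=producer_task_id)
    # task.artifacts is a dict-like of name -> Artifact; .get() deserializes it
    return producer.artifacts["model_id"].get()
```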
It's a simple DAG pipeline.
I have a step, at which I want to run a task which finds the model I need.
Well yeah, you can say that. With add_function_step, I pass in a function which returns something, and since I've declared the name of the returned parameter in add_function_step, I can use it downstream. But I can't seem to figure out a way to do something similar using a task in add_step.
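What works for me today with add_function_step, as a sketch (find_latest_model and train are stand-ins for my own functions):

```python
def find_latest_model() -> str:
    # stand-in: would query for the newest model and return its id
    return "model-id-placeholder"

def train(model_id: str) -> None:
    # stand-in: would load the model by id and continue training
    print("training from", model_id)

def build_pipeline():
    # deferred import: sketch only, needs a ClearML server to actually run
    from clearml import PipelineController

    pipe = PipelineController(name="dag", project="examples", version="1.0.0")
    # the declared return name makes the value addressable downstream
    pipe.add_function_step(
        name="find_model",
        function=find_latest_model,
        function_return=["model_id"],
    )
    pipe.add_function_step(
        name="train",
        function=train,
        function_kwargs={"model_id": "${find_model.model_id}"},
        parents=["find_model"],
    )
    return pipe
```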
VexedCat68 yes 🙂 you can also pass the parent folder and it will zip the entire subfolders into a single artifact
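E.g. something like this (a sketch; "checkpoints" is just an example artifact name):

```python
def upload_folder(task, folder: str) -> None:
    # passing a directory path as artifact_object zips the entire tree
    # (including subfolders) into a single artifact on the Task
    task.upload_artifact("checkpoints", artifact_object=folder)
```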
Hi VexedCat68
One of my steps just finds the latest model to use. I want the task to output the id, and the next step to use it. How would I go about doing this?
When you say "I want the task to output the id", do you mean to pass it to the next step?
Something like this one:
https://github.com/allegroai/clearml/blob/c226a748066daa3c62eddc6e378fa6f5bae879a1/clearml/automation/controller.py#L224