and then also write down my git username and password.
In the case of an API call, given that I have the ID of the task I want to stop, I would make a POST request to [CLEARML_SERVER_URL]:8080/tasks.stop with the request body set up like the one described in the API docs, right?
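Something like this is what I have in mind (a rough sketch; the port and credentials are from my own setup, and TASK_ID is a placeholder):

```python
import requests

# Sketch: stop a task through the ClearML REST API.
# The URL/port are from my deployment; auth uses my API credentials.
resp = requests.post(
    "http://CLEARML_SERVER_URL:8080/tasks.stop",
    json={"task": "TASK_ID"},           # id of the task I want to stop
    auth=("ACCESS_KEY", "SECRET_KEY"),  # placeholder credentials
)
print(resp.status_code, resp.json())
```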
Is this the correct way to upload an artifact? checkpoint.split('.')[0] is the name I want assigned to it, and the second argument is the path to the file.
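For reference, this is roughly the call I mean (checkpoint here is just an example filename):

```python
from clearml import Task

task = Task.current_task()
checkpoint = "model_epoch3.pt"  # example filename, for illustration
# first argument: artifact name (the filename without its extension);
# second argument: path to the file itself
task.upload_artifact(checkpoint.split('.')[0], artifact_object=checkpoint)
```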
I have a lot of anonymous tasks running which I would like to close immediately.
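In case it helps, this is the kind of thing I'd like to do in bulk (a sketch; the status filter is just my guess at how to match these tasks):

```python
from clearml import Task

# Sketch: fetch the running tasks in my project and stop each one.
tasks = Task.get_tasks(
    project_name="my_project",                # placeholder project name
    task_filter={"status": ["in_progress"]},  # assumption: match by status
)
for t in tasks:
    t.mark_stopped()
```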
OK, so an update: it works now. These are the last steps I can remember doing to fix it.
Well yeah, you could say that. In the add-function step, I pass in a function that returns something, and since I've written the name of the returned parameter in add_function_step, I can use it. But I can't seem to figure out a way to do something similar using a task in add_step.
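To make the comparison concrete, my setup is roughly this (names are made up; the last override is the part I haven't figured out for artifacts):

```python
from clearml import PipelineController

pipe = PipelineController(name="my_pipeline", project="my_project", version="0.1")

def find_model():
    # ...locate the model I need...
    return "some-model-id"

# Function step: naming the return value makes it usable downstream
pipe.add_function_step(
    name="step_find_model",
    function=find_model,
    function_return=["model_id"],
)

# Task step: I can reference the step itself, e.g. its id, but I don't
# see how to reference the returned artifact the same way
pipe.add_step(
    name="step_train",
    base_task_id="BASE_TASK_ID",  # placeholder
    parents=["step_find_model"],
    parameter_override={"General/model_id": "${step_find_model.id}"},
)
```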
I came to that conclusion too, I think, yeah. Basically, I can access them as artifacts. That's why I wanted to pass the model ID from the prior step to the next one.
I'm kind of at the point where I don't even know what to search for.
It's a simple DAG pipeline.
I have a step at which I want to run a task that finds the model I need.
I don't think I changed anything.
Honestly, anything. I tried looking it up on YouTube, but there's very little material there, especially anything up to date. That's understandable given that ClearML is still in beta. I can look at courses/docs; I just want to be pointed in the right direction as to what I should look up and study.
As I wrap my head around that: in terms of the example given in the repo, can you tell me what the serving example is in terms of the explanation above, and what the Triton serving engine is in that same context?
I'm assuming the Triton serving engine is running on the serving queue in my case. Is the serving example also running on the serving queue, or is it running on the services queue? And lastly, I don't have a ClearML agent listening to the services queue; does ClearML handle this on its own?
Can you please share the endpoint link?
AnxiousSeal95 I'm trying to access the specific value. I checked the type of task.artifacts and it's a ReadOnlyDict. Given that the return value I'm looking for is called merged_dataset_id, how would I go about doing that?
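Concretely, this is what I'm attempting (a sketch; the task ID is a placeholder):

```python
from clearml import Task

task = Task.get_task(task_id="PRIOR_STEP_TASK_ID")  # placeholder id
# task.artifacts is a ReadOnlyDict keyed by artifact name
merged_dataset_id = task.artifacts["merged_dataset_id"].get()
print(merged_dataset_id)
```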
I checked that the value is being returned, but I'm having issues accessing merged_dataset_id in the pre_execute_callback the way you showed me.
AnxiousSeal95 I just have a question: can you share an example of accessing an artifact of a previous step in the pre_execute_callback?
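Something shaped like this is what I'm imagining ('merge_step' is a made-up step name, and reaching the executed task ID through the DAG node is just my guess at the mechanism):

```python
from clearml import Task

def pre_execute_callback(pipeline, node, param_override):
    # Assumption: the finished parent step's task id is reachable
    # through the pipeline's DAG nodes ('merge_step' is a placeholder).
    prev_node = pipeline._nodes["merge_step"]
    prev_task = Task.get_task(task_id=prev_node.executed)
    merged_dataset_id = prev_task.artifacts["merged_dataset_id"].get()
    print(merged_dataset_id)
    return True  # keep executing this step
```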
I initially wasn't able to get the value this way.
I'm both printing it and writing it to a file
Also, the steps say that I should run the serving process on the default queue, but I've run it on a queue I created called 'serving' and have an agent listening to it.
Thank you, this is a big help. I'll give this a go now.
Collecting idna==3.3
Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting importlib-metadata==4.8.2
Using cached importlib_metadata-4.8.2-py3-none-any.whl (17 kB)
Collecting importlib-resources==5.4.0
Using cached importlib_resources-5.4.0-py3-none-any.whl (28 kB)
ERROR: Could not find a version that satisfies the requirement jsonschema==4.2.1 (from -r /tmp/cached-reqsm1gu3664.txt (line 19)) (from versions: 0.1a0, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8.0, 1.0.0, 1.1.0, 1.2.0, 1.3.0, 2.0...
I'm not using decorators. I have a bunch of function steps followed by a normal task step, for which I've passed a base_task_id.
I want to check the value of one of the function steps, and if it holds true, execute the task step; otherwise I want the pipeline to end there, since the task step is the last one.
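i.e., something along these lines ('check_step' and the artifact name are placeholders, pipe is the pipeline controller, and I'm relying on a False return from pre_execute_callback skipping the step):

```python
from clearml import Task

def run_final_step_only_if(pipeline, node, param_override):
    # Read the value produced by the earlier function step
    # ('check_step' / 'condition' are placeholder names; getting the
    # executed task id off the DAG node is my assumption).
    check_task = Task.get_task(task_id=pipeline._nodes["check_step"].executed)
    condition = check_task.artifacts["condition"].get()
    # Returning False skips this node, and since the task step is the
    # last one in the DAG, the pipeline ends there.
    return bool(condition)

# pipe is the PipelineController instance
pipe.add_step(
    name="final_task_step",
    base_task_id="BASE_TASK_ID",  # placeholder
    parents=["check_step"],
    pre_execute_callback=run_final_step_only_if,
)
```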
I then did what MartinB suggested and got the id of the task from the pipeline DAG, and then it worked.
AnxiousSeal95 Basically it's a function step's return value. If I do artifacts.keys(), there are no keys, even though the step prior to it does return the output.
You could be right. I just had a couple of packages with this issue, so I removed the version requirement for now. Another possibility is that I'm on Ubuntu while some of the packages might've been pinned for Windows, hence those versions not existing.