Any updates on trigger and schedule docs 🙂
What’s the point of saying General?
create_task_from_function
I was looking at options to implement this just today, as part of the same remote debugging I was talking about in this thread
On a related note - is it possible to get things like ${stage_data.artifacts.dataset.url} from within a task, rather than passing params in add_step?
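For context, this is roughly what I do today - a minimal sketch, assuming two existing tasks in a project called "demo" (all names below are made up):

from clearml import PipelineController

pipe = PipelineController(name="demo-pipeline", project="demo", version="0.0.1")
pipe.add_step(name="stage_data", base_task_project="demo", base_task_name="data task")
pipe.add_step(
    name="train",
    parents=["stage_data"],
    base_task_project="demo",
    base_task_name="train task",
    # the controller resolves this reference and injects it as a hyperparameter
    parameter_override={"General/dataset_url": "${stage_data.artifacts.dataset.url}"},
)
pipe.start()

# inside the "train" script the value then arrives as a normal parameter, e.g.
#   params = {"dataset_url": ""}
#   Task.init(...).connect(params)   # params["dataset_url"] now holds the resolved URL

So the resolution happens in the controller, which is why I was asking whether the same lookup can be done from inside the task itself.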
Yeah, mostly. With the k8s glue going, I want to finally look at clearml-session and how people are using it.
So General would have created a General instead of Args?
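Just to make sure I understand the sections - a quick sketch of what I mean (project and parameter names are made up):

from clearml import Task

task = Task.init(project_name="demo", task_name="sections")

# a dict connected without a name shows up under the "General" section
task.connect({"dataset_url": "s3://bucket/data"})

# whereas argparse arguments picked up by Task.init land under "Args", e.g.
#   parser.add_argument("--dataset_url", ...)  ->  Args/dataset_url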
I am doing something like this with a yaml-based pipelines DSL
Example:
name: ml-project
template: nbdev
pipelines_runner: gitlab
pipelines:
  pipeline-1:
    steps:
      - name: "publish-datasets"
        task_script: "mlproject/publish_datasets.py"
      - name: "training"
        task_script: "mlproject/training.py"
        parents: ["publish-datasets"]
      - name: "test"
        task_script: "mlproject/test.py"
        parents: ["training"]
Have a CLI which goes through each of the tasks and creates them
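The CLI is roughly this - a rough sketch only (project name, version and file name are placeholders, error handling stripped):

import yaml
from clearml import Task, PipelineController

def build_pipelines(config_path):
    with open(config_path) as f:
        cfg = yaml.safe_load(f)

    for pipe_name, pipe_cfg in cfg["pipelines"].items():
        pipe = PipelineController(name=pipe_name, project=cfg["name"], version="0.0.1")
        for step in pipe_cfg["steps"]:
            # create a draft task per step from its script
            task = Task.create(
                project_name=cfg["name"],
                task_name=step["name"],
                script=step["task_script"],
            )
            pipe.add_step(
                name=step["name"],
                base_task_id=task.id,
                parents=step.get("parents", []),
            )
        pipe.start()

if __name__ == "__main__":
    build_pipelines("pipelines.yaml")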
As in, clone the task from the UI and run it
Hey SuccessfulKoala55 - this was related to the previous message. Had it clarified with AgitatedDove14
is it known?
Any reason to not have those as two datasets?
Updating to 1.1.0 gives this error:
ERROR: Could not push back task [e55e0f0ea228407a921e004f0d8f7901] to k8s pending queue [c288c73b8c434a6c8c55ebb709684b28], error: Invalid task status (Task already in requested status): current_status=queued, new_status=queued
I guess the question is - I want to use services queue for running services, and I want to do it on k8s
AgitatedDove14 - thoughts on this? I remember it was Draft before, but maybe that's because it was in a notebook, whereas now I am running a script?
What happens if I do blah/dataset_url?
AgitatedDove14 - these instructions are out of date? https://allegro.ai/clearml/docs/docs/deploying_clearml/clearml_server_kubernetes_helm.html
It did pick it up from the task?
If I publish a keras_mnist model and experiment on it, each one gets pushed as a separate Model entity, right? But there's really only one unique model with multiple different versions of it
Do people generally update the same model “entry”? That feels so wrong to me… how do you reproduce an older model version or do a rollback etc.?
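For now the closest thing to a rollback I found is just picking the older Model entity - a sketch, where the project, model name and tag are made up:

from clearml import Model

models = Model.query_models(
    project_name="demo",
    model_name="keras_mnist",
    only_published=True,
)
# every publish is its own entity, so "rolling back" means picking an older one,
# e.g. by tag or by checking its creation time in the UI
previous = next(m for m in models if "v1" in (m.tags or []))
weights_path = previous.get_local_copy()  # local copy of that snapshot's weights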
I have a wrapper SDK over clearml that includes a default conf; other settings are loaded from a secret manager / env vars as needed
Thanks AlertBlackbird30
AgitatedDove14 - any doc yet for the scheduler? Is it essentially just for time-based scheduling?
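This is what I pieced together from the code so far, in case it helps until the docs land - a rough sketch, with the task id and queues as placeholders:

from clearml.automation import TaskScheduler

scheduler = TaskScheduler()
scheduler.add_task(
    schedule_task_id="aabbccdd11223344",  # existing task to clone and enqueue
    queue="default",
    hour=3,
    minute=0,  # i.e. run daily at 03:00
)
# run the scheduler itself as a long-lived service
scheduler.start_remotely(queue="services")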
I then install this wrapper SDK in my containers, notebook instances, etc.
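The wrapper is basically a thin init helper - a sketch of the idea (the endpoints and the secret-manager lookup are placeholders, only the CLEARML_* env var names are real):

import os
from clearml import Task

def fetch_secret(key):
    # placeholder for the real secret-manager lookup
    return os.environ.get(key, "")

def init_task(project, name):
    # baked-in defaults for the server endpoints
    os.environ.setdefault("CLEARML_API_HOST", "https://api.clearml.example.com")
    os.environ.setdefault("CLEARML_WEB_HOST", "https://app.clearml.example.com")
    os.environ.setdefault("CLEARML_FILES_HOST", "https://files.clearml.example.com")
    # credentials pulled from the secret manager unless already set
    if "CLEARML_API_ACCESS_KEY" not in os.environ:
        os.environ["CLEARML_API_ACCESS_KEY"] = fetch_secret("CLEARML_ACCESS_KEY")
        os.environ["CLEARML_API_SECRET_KEY"] = fetch_secret("CLEARML_SECRET_KEY")
    return Task.init(project_name=project, task_name=name)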
Ok couldn’t see it in the docs - https://clear.ml/docs/latest/docs/references/sdk/task
sure, will do AlertBlackbird30