AgitatedDove14 - any doc yet for the scheduler? Is it essentially just for time-based scheduling?
Or a WARNING should be raised: param not found, using General/param, etc.
pipeline code itself is pretty standard
Now, if dataset1 is updated, I want the process to update dataset2
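The dataset1-triggers-dataset2 flow above can be sketched with ClearML's `TriggerScheduler` (a hedged sketch: the project name, task id, and queue names are placeholders, and the exact API may differ between clearml versions):

```python
from clearml.automation import TriggerScheduler

# Check for new triggers every 5 minutes (placeholder frequency)
scheduler = TriggerScheduler(pooling_frequency_minutes=5)

# When a new version of dataset1 is published in its project,
# enqueue the task that rebuilds dataset2.
scheduler.add_dataset_trigger(
    name="rebuild-dataset2",
    schedule_task_id="<process-task-id>",   # placeholder task id
    schedule_queue="default",               # placeholder queue
    trigger_project="dataset1-project",     # placeholder project
)

# Run the scheduler itself as a service task
scheduler.start_remotely(queue="services")
```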
Will try it out. Pretty impressed 🙂
Currently we train from Sagemaker notebooks, push models to S3 and create containers for model serving
To confirm: if I have a fresh venv with no dependencies installed except clearml,
I have a requirements.txt file in the root, and a script at scripts/script1.py
The script1.py does Task.init(), execute_remotely, and then imports a few dependencies
Now I run python scripts/script1.py
And it should pick up the installed packages correctly?
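The setup described above would look roughly like this (a minimal sketch, assuming a standard ClearML setup; the project, task, and queue names are placeholders):

```python
# scripts/script1.py -- sketch of the flow described above
from clearml import Task

# Package detection happens here, from the active (fresh) venv
task = Task.init(project_name="examples", task_name="script1")

# Local execution stops at this call; an agent re-runs the script
# remotely, recreating the environment from the detected packages.
task.execute_remotely(queue_name="default")

# Heavier dependencies imported only after the remote hand-off
import pandas as pd  # noqa: E402
```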
Yeah got it. Was mainly wondering if k8s glue was meant for this as well or not
Essentially: 1. run a task normally, 2. clone it, 3. edit it to have only those two lines.
Question - since this is a task, why is Task.current_task() None?
don’t know what’s happening there
As we can’t create keys in our AWS due to infosec requirements
Thanks AgitatedDove14 . Have removed Task.current_task() usage for this now. Think I can do without it
Good question 🙂
this is what I am seeing in the logs:
```
No tasks in queue 9154efd8a1314550b1c7882981720861
No tasks in Queues, sleeping for 5.0 seconds
No tasks in queue 9154efd8a1314550b1c7882981720861
No tasks in Queues, sleeping for 5.0 seconds
No tasks in queue 9154efd8a1314550b1c7882981720861
No tasks in Queues, sleeping for 5.0 seconds
No tasks in queue 9154efd8a1314550b1c7882981720861
No tasks in Queues, sleeping for 5.0 seconds
K8S Glue pods monitor: Failed parsing kubectl output...
```
If you don’t mind, can you point me at the code where this happens?
Hey TimelyPenguin76 - I am just using the Helm chart and haven't done any setup on top of that. The agentservices is running as-is from the Helm chart
Yeah. Curious - are a lot of clearml use cases not geared toward notebooks?
What happens if I do blah/dataset_url?
So General would have created a General section instead of Args?
SageMaker will make that easy, especially if I have SageMaker as the long-tail choice. Granted, at a higher cost
The minerva one is my custom package AgitatedDove14
I am essentially creating an EphemeralDataset abstraction with a controlled lifecycle, such that the data is removed after a day in experiments. Additionally, and optionally, data created during a step in a pipeline can be cleared once the pipeline completes
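The EphemeralDataset idea can be sketched in plain Python (a stdlib-only mock: the class name and one-day TTL come from the message above, everything else is an assumption):

```python
import shutil
import tempfile
import time
from pathlib import Path


class EphemeralDataset:
    """A dataset whose local files are removed once its TTL expires
    or when a pipeline step explicitly cleans it up."""

    def __init__(self, name, ttl_seconds=24 * 3600):
        self.name = name
        self.created_at = time.time()
        self.ttl_seconds = ttl_seconds
        # Back the dataset with a throwaway directory
        self.root = Path(tempfile.mkdtemp(prefix=f"{name}-"))

    def expired(self, now=None):
        now = time.time() if now is None else now
        return now - self.created_at > self.ttl_seconds

    def cleanup(self):
        # Explicit removal, e.g. called when the pipeline completes
        shutil.rmtree(self.root, ignore_errors=True)


ds = EphemeralDataset("dataset2", ttl_seconds=86400)
(ds.root / "part-0.csv").write_text("a,b\n1,2\n")
assert not ds.expired()
# Simulate the clock one day and one second later: now expired
assert ds.expired(now=ds.created_at + 86401)
ds.cleanup()
assert not ds.root.exists()
```

A real implementation would register the cleanup with whatever runs the pipeline, but the TTL-plus-explicit-cleanup shape is the same.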
I only see published ones getting preference, not a way to filter to only published
I get other things from the project like the dataset
As the `verify` param was deprecated and has now been removed
AgitatedDove14 - either, based on the scenario