Ah ok. Kind of getting it, will have to try the glue mode
I would like to create a notebook instance and start using it without having to do anything on a dev box
AlertBlackbird30:
--remote-gateway [REMOTE_GATEWAY] Advanced: Specify gateway ip/address to be passed to interactive session (for use with k8s ingestion / ELB)
I see this in clearml-session - what’s the intent here?
But ok, I guess the summary is that it doesn’t work in a k8s env
If there’s a post-task script, I can add a way to zip and upload the pip cache etc. to S3 - as in, do any caching I want without needing first-class support in clearml
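For context, a rough sketch of what such a post-task script could look like, assuming boto3 and credentials from the IAM role; the hook itself, the bucket name and the cache path are placeholders, not anything ClearML provides:

```python
import shutil
from pathlib import Path

import boto3

# Default pip cache location on Linux and a placeholder bucket name.
PIP_CACHE = Path.home() / ".cache" / "pip"
BUCKET = "my-ml-cache"


def upload_pip_cache() -> None:
    # Archive the cache directory and push it to S3; credentials are
    # expected to come from the instance/pod IAM role.
    archive = shutil.make_archive("/tmp/pip-cache", "gztar", root_dir=PIP_CACHE)
    boto3.client("s3").upload_file(archive, BUCKET, "caches/pip-cache.tar.gz")


if __name__ == "__main__":
    upload_pip_cache()
```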
Initially I thought use_current_task in Dataset.create does it, but that seems to set the DataprocessingTask itself to the current task
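For clarity, a minimal sketch of the call I mean (project/dataset names are placeholders; this is what I tried, not a recommendation):

```python
from clearml import Dataset, Task

task = Task.init(project_name="example", task_name="build-dataset")  # placeholders

# With use_current_task=True the running task itself becomes the
# dataset's data-processing task, rather than the dataset getting
# its own task that is merely linked to this one.
ds = Dataset.create(
    dataset_name="my-dataset",
    dataset_project="example/datasets",
    use_current_task=True,
)
ds.add_files("data/")
ds.upload()
ds.finalize()
```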
I would prefer controlled behavior to some available version being used. Here we triggered a bunch of jobs that all went fine, even the evaluations were fine, and then when we triggered an inference deploy it failed
Ah thanks for the pointer AgitatedDove14
Sounds great!
Are nested projects in the release? I see them in the community server, but there’s no mention in the blog or the release notes
Yes, for datasets where we need GDPR compliance
I only see published versions getting preference, not a way to filter to only published ones
Sorry if it was confusing. I was asking if people have set up pipelines that are automatically triggered on updates to datasets
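To make the question concrete, I had something like clearml.automation.TriggerScheduler in mind; a rough sketch, assuming the dataset trigger works roughly like this (task IDs, queue and project names are placeholders):

```python
from clearml.automation import TriggerScheduler

# Poll for new dataset versions and enqueue a copy of an existing
# pipeline controller task when one appears. All IDs/names below
# are placeholders.
scheduler = TriggerScheduler(pooling_frequency_minutes=5)
scheduler.add_dataset_trigger(
    name="retrain-on-new-data",
    schedule_task_id="<pipeline-controller-task-id>",
    schedule_queue="default",
    trigger_project="datasets/my-project",
)
scheduler.start_remotely(queue="services")
```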
As in, if there are jobs, the first level is new pods, and the second level is new nodes in the cluster.
AgitatedDove14 - the actual replication failed. When we run a task by cloning and enqueueing it, there is a current task even if I haven’t called Task.init yet, right?
Can you point me at the relevant code?
Looks like Task.current_task() is indeed None in this case. Bit of the log below, where I print(Task.current_task()) as the first step in the script:
Environment setup completed successfully
Starting Task Execution:
None
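For reference, roughly what the top of the script looks like, plus the workaround I’m considering (project/task names are placeholders):

```python
from clearml import Task

# First statement in the script: when the cloned task is executed by
# the agent, this prints None (see the log above).
task = Task.current_task()
print(task)

# Possible workaround: call Task.init() explicitly; when run by the
# agent this should pick up the existing (cloned) task.
if task is None:
    task = Task.init(project_name="my-project", task_name="replication-check")
```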
The description says this though:
A section name associated with the connected object. Default: 'General'
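i.e. something like this, where name would put the parameters under their own section instead of 'General' (values are placeholders):

```python
from clearml import Task

task = Task.init(project_name="example", task_name="connect-demo")  # placeholders

config = {"lr": 0.001, "batch_size": 32}
# `name` should act as the section name for these parameters in the UI,
# instead of the default 'General'.
task.connect(config, name="Training")
```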
But that itself is running in a task, right?
I don’t want to though. Will run it as part of a pipeline
Which kind of access specifically? I handle permissions with IAM roles