@<1657556312684236800:profile|ManiacalSeaturtle63> yep... that indeed updates the clearml.conf in the remote agent pod in the cluster 😃
clearmlConfig: |-
  sdk {
    aws {
      s3 {
        # default, used for any bucket not specified below
        key: ""
        secret: ""
        region: ""
        bucket: "clearml"
        credentials: [
          {
            # This will apply to all buckets in this host (unless key/value...
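For context, a hedged sketch of what a fuller per-bucket `credentials` entry in that config usually looks like; the host, bucket, and key values below are placeholders, not taken from this thread:

```
sdk {
  aws {
    s3 {
      credentials: [
        {
          # applies to all buckets on this host (placeholder endpoint)
          host: "my-minio.example.com:9000"
          bucket: "my-bucket"
          key: "ACCESS_KEY"
          secret: "SECRET_KEY"
          multipart: false
          secure: false
        }
      ]
    }
  }
}
```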
Hey @<1523701205467926528:profile|AgitatedDove14> ,
sorry, I am quite new to Slack... forgot to submit my changes to the answer...
When you are saying parallel what do you mean? from multiple machines ?
yes, or (since I deployed ClearML using Helm on Kubernetes) from the same machine, but from multiple pods (tasks).
Once a dataset is finalized, the only way to add files is to create another version that inherits from the previous one (i.e. the finalized version becomes the parent of the ...
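The versioning step described above can be sketched with the ClearML SDK roughly as follows; the dataset name and project are hypothetical, and this assumes a reachable ClearML server:

```python
def add_files_to_finalized_dataset(parent_id: str, files_path: str) -> str:
    """Create a child version of a finalized ClearML dataset and add files.

    `parent_id` is the ID of the finalized dataset, which becomes the
    parent of the new version. Returns the new version's dataset ID.
    """
    # imported lazily so the sketch stays importable without a server
    from clearml import Dataset

    child = Dataset.create(
        dataset_name="my-dataset",      # hypothetical name
        dataset_project="my-project",   # hypothetical project
        parent_datasets=[parent_id],    # finalized version becomes the parent
    )
    child.add_files(path=files_path)
    child.upload()
    child.finalize()
    return child.id
```

Files already present in the parent are inherited, so only the newly added files are uploaded for the child version.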
@<1523701205467926528:profile|AgitatedDove14>
When you are saying parallel what do you mean? from multiple machines ?
I think I need to use git credentials in the agent Helm chart. Do I really need to specify a remote git repo, and if so, how can I do that? Is that also something that needs to be configured in the Helm chart, or in the code of the pipeline?
@<1523701087100473344:profile|SuccessfulKoala55> thanks a lot!
@<1657556312684236800:profile|ManiacalSeaturtle63> o.O wow... thanks for the awesome hint! How blind I was not to see that in the values file! THANKS a lot... will try that
actually, that solved my question 😃
@<1523701087100473344:profile|SuccessfulKoala55>
to question 1:
passing Dataset artifacts between tasks does not seem to be possible; I get the following error message:
TypeError: cannot pickle '_thread.lock' object.
So I guess it's not possible to upload files to the dataset from different tasks in parallel before finalizing it.
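The error above is a generic Python limitation rather than something ClearML-specific: a `Dataset` object holds live handles (sessions, locks), and objects containing locks cannot be pickled, which is what artifact passing requires. A minimal reproduction with a bare lock, plus the usual workaround of passing the dataset ID (a plain string) instead:

```python
import pickle
import threading

# Reproduce the failure: any object graph containing a lock is unpicklable.
try:
    pickle.dumps(threading.Lock())
except TypeError as err:
    print(err)  # cannot pickle '_thread.lock' object

# Workaround sketch: pass the dataset's ID between tasks and re-fetch it
# on the receiving side, e.g. Dataset.get(dataset_id=the_id), instead of
# pickling the Dataset object itself.
```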