ConvolutedChicken69
basically clearml-data needs to store an immutable copy of the delta changes per version; if it only referenced files already uploaded elsewhere, there is a good chance they could be modified later...
So in order to make sure you have a clean immutable copy, it will always upload the data (notice it also packages everything into a single zip file, so it is easy to manage).
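For reference, a rough sketch of that flow with the Python SDK (the dataset name, project name, and local path below are just placeholders):

```python
from clearml import Dataset

# Create a new dataset version; clearml-data stores each version's delta
# as an immutable copy on the files server (placeholder names below)
ds = Dataset.create(dataset_name="my_dataset", dataset_project="my_project")

# Register the local files that make up this version's delta
ds.add_files(path="./data")

# Upload packages the delta into a single zip and stores it immutably
ds.upload()

# Finalize the version so it can no longer be modified
ds.finalize()
```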
Assuming you define a user/password for the agent, you can either use the username/password as the key/secret, or log into the UI "as the agent" and create a set of credentials using the profile page
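Roughly, the agent-side clearml.conf would then carry those credentials in its api section, something like this (server addresses assume a self-hosted server on the default ports, and the key/secret values are placeholders you'd replace with the ones created for the agent):

```
api {
    web_server: http://your-clearml-server:8080
    api_server: http://your-clearml-server:8008
    files_server: http://your-clearml-server:8081
    credentials {
        "access_key" = "AGENT_ACCESS_KEY"
        "secret_key" = "AGENT_SECRET_KEY"
    }
}
```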
Hi ConvolutedChicken69,
You can give it whatever credentials you'd like 🙂 - are you using the default UI login mode, or did you set up users and passwords on the server?
users and passwords for the web login
And another question... how can I get support over the phone? I feel like I have a lot of questions about setting this up, and that would be easier.
Well, you can contact the ClearML sales team for that 🙂
Hi ConvolutedChicken69, the Dataset.upload() call will upload the data as an artifact to the task and will allow others to use the dataset (ClearML agents, for example, running and using the data with Dataset.get()).
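i.e. something along these lines on the consuming side (the dataset/project names are placeholders matching the upload example above):

```python
from clearml import Dataset

# Fetch the dataset version that was uploaded as a task artifact
ds = Dataset.get(dataset_name="my_dataset", dataset_project="my_project")

# Download (and cache) a local read-only copy of the data
local_path = ds.get_local_copy()
print(local_path)
```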
So you can either create a set of credentials from your own user and give that to the agent, or be nice, and simply define a user/password for the agent 🙂
Hi, I have another question: when setting up the agent on the worker machine, what credentials do I give it? A user's?
Ok yeah, I thought maybe I'd create a user for the agent.