I tested ~/clearml.conf and CLEARML_DEFAULT_OUTPUT_URI; they are both ignored.
For more context: I'm using https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch-lightning/pytorch_lightning_example.py to test, and the only way I can get the model checkpoints uploaded to the server is by setting output_uri in Task.init.
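For reference, a minimal sketch of that workaround, assuming the clearml package is installed; the output_uri value is a hypothetical destination, not a real server:

```python
# Sketch only: passing output_uri explicitly to Task.init forces checkpoint
# upload even when the conf/env defaults are not picked up.
init_kwargs = dict(
    project_name="examples",
    task_name="pytorch_lightning_example",
    output_uri="s3://my-bucket/clearml-models",  # hypothetical destination
)

try:
    from clearml import Task
    task = Task.init(**init_kwargs)
except ImportError:  # clearml not installed in this environment
    task = None
```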
I base the assumption that I shouldn't have to do that on the following comment from ~/clearml.conf:
# Default Task output_uri. if output_uri is not provided to Task.init, default_output_uri will be used instead.
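For reference, a minimal sketch of what that section of ~/clearml.conf might look like (assuming the key lives under sdk.development, as in the shipped config; the URL is a hypothetical file-server address):

```
sdk {
    development {
        # Default Task output_uri. if output_uri is not provided to Task.init,
        # default_output_uri will be used instead.
        default_output_uri: "https://files.example.com"
    }
}
```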
Ah ok, thanks. I was hoping to be able to set the default server-wide and not have to tell all users to do it themselves in the code.
Actually, when you say client side, it means it should work in ~/clearml.conf, no?
exactly. it's saved in a lightning_logs folder where i started the script instead.
Apologies, the error was on my side. It's working. Thanks a lot!
it's saved in a lightning_logs folder where i started the script instead.
It should be saved there + it should upload it to your file server
Can you send the Task log? (this is odd)
MistakenDragonfly51 just making sure I understand: on your machine (the one running the pytorch example), you have set CLEARML_DEFAULT_OUTPUT_URI / configured the clearml.conf file with default_output_uri, yet the model checkpoint was not uploaded?
Hi MistakenDragonfly51
I'm trying to set default_output_uri in
This should be set either on your client side, or on the worker machine (running the clearml-agent).
Make sense?
it means it should work in ~/clearml.conf, no?
Yes exactly
I was hoping to be able to set the default server-wide
I think this type of server-wide default is not supported in the open-source version.
But in most cases, setting it up on the clearml-agents is probably the important thing. btw: you can also set it via the OS environment variable CLEARML_DEFAULT_OUTPUT_URI
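As a sketch, the environment variable just needs to be present before Task.init runs; the bucket URI below is a hypothetical destination for illustration:

```python
import os

# Set the default output destination before Task.init is called, so any
# checkpoints are uploaded there without passing output_uri in code.
# "s3://my-bucket/models" is a hypothetical destination, not a real bucket.
os.environ["CLEARML_DEFAULT_OUTPUT_URI"] = "s3://my-bucket/models"

print(os.environ["CLEARML_DEFAULT_OUTPUT_URI"])
```

On a clearml-agent machine the same variable can be exported in the shell or service environment that launches the agent.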