Perfect! Thanks SuccessfulKoala55, that would be an acceptable workaround until setup_upload also supports Azure 🙂 🙌
Hi GiganticMole91,
I see that the storage settings are also available through environment variables, but I'm worried that those variables have already been parsed by the time our script could set them.
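(If the concern is parse order, one common way around it is to export the variables at the very top of the script, before clearml is imported. A minimal sketch, assuming the AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_KEY variables that the SDK reads for default Azure credentials; values are placeholders:)

```python
import os

# Export Azure credentials before importing clearml, so they are already
# set when the SDK parses its configuration. Variable names assumed from
# the ClearML source; values are placeholders.
os.environ["AZURE_STORAGE_ACCOUNT"] = "<account_name>"
os.environ["AZURE_STORAGE_KEY"] = "<account_key>"

from clearml import Task  # imported after the variables are set, on purpose
```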
I'm not sure I understand. Can you elaborate? How do you run it remotely? Do you raise an instance each time, or are your instances persistent?
GiganticMole91 if you want to hack it, this is how:
```python
from clearml.storage.helper import StorageHelper
from clearml.backend_config.bucket_config import AzureContainerConfig

# Register the Azure container with ClearML's (private) storage configuration;
# run this before any upload happens — private attributes may change between versions
StorageHelper._azure_configurations._container_configs.append(
    AzureContainerConfig(account_name="<account_name>", account_key="<account_key>", container_name="<container_name>")
)
```
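(As a hedged follow-up: once the container is registered this way, uploads can presumably be routed to it via output_uri on Task.init. A sketch, with the URI layout taken from the ClearML docs and all names as placeholders:)

```python
from clearml import Task

# Route artifact/model uploads to the Azure container registered above
task = Task.init(
    project_name="examples",
    task_name="azure-upload",
    output_uri="azure://<account_name>.blob.core.windows.net/<container_name>",
)
```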
GiganticMole91 for S3, I think you're looking for task.setup_upload()
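(For reference, a minimal sketch of that call, with parameter names as they appear in the clearml v1.5.0 source and the bucket/credentials as placeholders:)

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="s3-upload")

# Configure S3 upload credentials at runtime instead of via clearml.conf
# (per the docstring, only S3 is supported at this point)
task.setup_upload(
    bucket_name="<bucket_name>",
    access_key="<access_key>",
    secret_key="<secret_key>",
)
```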
Hi CostlyOstrich36, thanks for answering. We are using compute instances through the Machine Learning Studio in Azure. They basically work by spinning up an instance, loading a Docker image, and executing a specific script from a folder that you upload along with the image. Nothing is persisted between runs, and there is no clear notion of a "user" (at least as far as ~/.clearml.conf is concerned).
SuccessfulKoala55 yeah, sorry, should have mentioned that our storage is also Azure (blob storage). I couldn't find the documentation for task.setup_upload() online, but the current version of the source code states that
Setup upload options (currently only S3 is supported)
as you mentioned. I'm using v1.5.0.