Answered
Hi Guys, Is There A Way, Analogous To Using

Hi guys,

Is there a way, analogous to using Task.set_credentials(...), to set credentials for storage programmatically? Something like Task.setup_storage(...)?

I'm setting up an environment where a Task can either be run locally on some on-prem machine or remotely on Azure. When running on Azure, I use the set_credentials function to initialize the connection to the server, but I also need a way to give it credentials for our storage. I see that the storage settings are also available through environment variables, but I'm worried that the environment variables have already been parsed at that time.
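
For reference, the server side currently looks roughly like the sketch below (hosts and key/secret are placeholders for our real values); what I'm missing is the storage equivalent:

```
from clearml import Task

# Server credentials can be set programmatically before Task.init();
# the hosts and key/secret below are placeholders for our real values.
Task.set_credentials(
    api_host="https://api.clear.ml",
    web_host="https://app.clear.ml",
    files_host="https://files.clear.ml",
    key="<api_key>",
    secret="<api_secret>",
)

task = Task.init(project_name="<project>", task_name="<task>")
```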

Any tips? :-)

  
  
Posted 2 years ago

6 Answers


GiganticMole91 if you want to hack it, this is how:
```
from clearml.storage.helper import StorageHelper
from clearml.backend_config.bucket_config import AzureContainerConfig

# Append an Azure container configuration directly to the storage helper's
# (private) in-memory config list, so later Azure blob access can resolve
# the credentials without a clearml.conf file.
StorageHelper._azure_configurations._container_configs.append(
    AzureContainerConfig(
        account_name="<account_name>",
        account_key="<account_key>",
        container_name="<container_name>",
    )
)
```
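
With that appended, uploads through the regular storage entry points should resolve the Azure credentials from the in-memory list; a quick sketch (the azure:// URI is a placeholder for your account and container):

```
from clearml import StorageManager

# Placeholder destination URI; with the config appended above, the Azure
# credentials are looked up from the in-memory container configs.
StorageManager.upload_file(
    local_file="model.pkl",
    remote_url="azure://<account_name>.blob.core.windows.net/<container_name>/models/model.pkl",
)
```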

  
  
Posted 2 years ago

The container name is optional 🙂
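
i.e. roughly this (a sketch, assuming AzureContainerConfig's container_name defaults to None):

```
# Same hack as above, without pinning a container; the account-level
# credentials then apply to any container under the account.
StorageHelper._azure_configurations._container_configs.append(
    AzureContainerConfig(account_name="<account_name>", account_key="<account_key>")
)
```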

  
  
Posted 2 years ago

Perfect! Thanks SuccessfulKoala55, that would be an acceptable workaround until setup_upload also supports Azure 🙂 🙌

  
  
Posted 2 years ago

GiganticMole91 for S3, I think you're looking for task.setup_upload()
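
Something like this (a sketch; parameter names follow the S3-only options in the source, and the bucket and keys are placeholders):

```
from clearml import Task

task = Task.init(project_name="<project>", task_name="<task>")

# Sketch of the S3-only upload setup; bucket and keys are placeholders.
task.setup_upload(
    bucket_name="<bucket_name>",
    access_key="<access_key>",
    secret_key="<secret_key>",
)
```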

  
  
Posted 2 years ago

Hi GiganticMole91 ,

I see that the storage settings are also available through environment variables, but I'm worried that the environment variables have already been parsed at that time.

I'm not sure I understand. Can you elaborate? How do you run it remotely? Do you spin up a new instance each time, or are your instances persistent?

  
  
Posted 2 years ago

Hi CostlyOstrich36, thanks for answering. We are using compute instances through the Machine Learning Studio in Azure. They basically work by spinning up an instance, loading a Docker image, and executing a specific script from a folder that you upload along with the Docker image. Nothing is persisted between runs, and there is no clear notion of a "user" (at least when thinking of ~/.clearml.conf).
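
Given that, one thing that might make the environment-variable route workable (an assumption on my part; AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_KEY are the names clearml's bucket config appears to read) is exporting them at the very top of the entry script, before anything imports clearml, so they exist by the time the config is parsed:

```
import os

# Assumption: clearml reads AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_KEY when it
# first parses its configuration, so set them before clearml is imported.
os.environ["AZURE_STORAGE_ACCOUNT"] = "<account_name>"
os.environ["AZURE_STORAGE_KEY"] = "<account_key>"

from clearml import Task  # imported after the env vars on purpose

task = Task.init(project_name="<project>", task_name="<task>")
```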

SuccessfulKoala55 yeah, sorry, I should have mentioned that our storage is also Azure (Blob Storage). I couldn't find the documentation for task.setup_upload() online, but the current version of the source code states that

Setup upload options (currently only S3 is supported)

as you mentioned. I'm using v1.5.0.

  
  
Posted 2 years ago