Answered
I Have Setup A

I have set up a clearml-server running on an Azure VM instance and have used the default parameters for specifying storage locations for data and artefacts. I have managed to create a versioned dataset of the image dataset I am using, by using clearml.Dataset to push it to the clearml-server.
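As a rough sketch of what this looks like in code (the storage account name "mystorageaccount", container "clearml-data", and local path are placeholders, and the clearml calls are shown commented out since they require a running server):

```python
# clearml addresses Azure Blob Storage with an azure:// scheme URI of the
# form azure://<account>.blob.core.windows.net/<container>[/path]
def azure_output_uri(account: str, container: str) -> str:
    return f"azure://{account}.blob.core.windows.net/{container}"

uri = azure_output_uri("mystorageaccount", "clearml-data")
print(uri)  # azure://mystorageaccount.blob.core.windows.net/clearml-data

# With the server reachable and Azure credentials configured in clearml.conf,
# the dataset upload could then be pointed at the blob store, e.g.:
# from clearml import Dataset
# ds = Dataset.create(dataset_name="images", dataset_project="my_project")
# ds.add_files("/path/to/images")
# ds.upload(output_url=uri)
# ds.finalize()
```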

I have now set up an Azure Blob Storage container for use with the server, such that all artefacts (model weights etc.) and datasets are stored on this blob store.

Q. Would someone mind outlining the steps to configure the default storage locations, such that any artefacts or data pushed to the server are stored by default on the Azure Blob Store?

  
  
Posted 3 years ago

Answers 5


AgitatedDove14 Thanks for that.
I suppose the same would need to be done for any client PC running clearml from which you are submitting dataset upload jobs?

That is, if the dataset is local to my laptop, or on a development VM that is not part of the clearml system, and from there I want to submit a copy of the dataset, then I would need to configure the storage section in the same way as well?

I assume the account name and key refer to the storage account credentials that you can get from Azure Storage Explorer?
It gives you two access keys, primary and secondary, on the Properties tab of the specified storage account.

  
  
Posted 3 years ago

Q. Would someone mind outlining the steps to configure the default storage locations, such that any artefacts or data pushed to the server are stored by default on the Azure Blob Store?

Hi VivaciousPenguin66
See my reply here on configuring the default output uri on the agent: https://clearml.slack.com/archives/CTK20V944/p1621603564139700?thread_ts=1621600028.135500&cid=CTK20V944
Regarding permission setup:
You need to make sure you have the Azure blob credentials on the agent's machine, so it can actually access the blob storage. Make sense?
https://github.com/allegroai/clearml-agent/blob/e93384b99bdfd72a54cf2b68b3991b145b504b79/docs/clearml.conf#L276
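For reference, the relevant agent-side clearml.conf sections might look roughly like the following (the account name, key, and URI here are placeholders; the exact layout is in the linked clearml.conf template):

```
sdk {
    azure.storage {
        containers: [
            {
                account_name: "mystorageaccount"
                account_key: "<storage-account-access-key>"
            }
        ]
    }
    development {
        # uploads would then default to the blob store
        default_output_uri: "azure://mystorageaccount.blob.core.windows.net/clearml-data"
    }
}
```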

  
  
Posted 3 years ago

I suppose the same would need to be done for any client PC running clearml from which you are submitting dataset upload jobs?

Correct

That is, if the dataset is local to my laptop, or on a development VM that is not part of the clearml system, and from there I want to submit a copy of the dataset, then I would need to configure the storage section in the same way as well?

Correct

  
  
Posted 3 years ago

I assume the account name and key refer to the storage account credentials that you can get from Azure Storage Explorer?

Correct

  
  
Posted 3 years ago

I am a bit confused because I can see configuration sections for Azure storage in the clearml.conf files, but these are on the client PC and the clearml-agent compute nodes.

So do these parameters have to be set on the clients and compute nodes individually, or is this something that can be set on the server?

  
  
Posted 3 years ago