Answered
Another Quick Question About Fileservers And Clearml-Agent: Clearml-Agent Seems To Ignore The Output Destination Set In The Task Config

Another quick question about fileservers and clearml-agent:
clearml-agent seems to ignore the output destination set in the task config sdk.development.default_output_uri and instead always uses the api.file_server from the credentials. Is this expected behavior? Can I not set the files_server on a per-task basis?
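
(For context, a minimal sketch of the two settings being discussed, as they would appear in clearml.conf; the URIs are placeholders, and the exact key names should be checked against the clearml.conf reference for your ClearML version.)

api {
    # where debug samples (and anything else the Logger uploads) are sent
    files_server: "azure://<account>/<container>"
}
sdk {
    development {
        # default destination for artifacts and models
        default_output_uri: "azure://<account>/<container>"
    }
}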

  
  
Posted one year ago

Answers 26


Makes sense, but does this mean that we are not able to tell clearml-agent where to save on a per-task basis? I see the output_destination set correctly in the ClearML web interface, but, as you say, clearml-agent always uses its api.files_server?

  
  
Posted one year ago

@<1576381444509405184:profile|ManiacalLizard2> Just so I understand correctly:
You are saying that in your local, user-specific clearml.conf you set the api.files_server, but in your remote, clearml-agent clearml.conf you left it empty?

  
  
Posted one year ago

The config that I mention above is the clearml.conf for each agent.

  
  
Posted one year ago

nope, we are self-hosted in Azure

  
  
Posted one year ago

So in your case, the clearml-agent conf contains multiple credentials, each for a different cloud storage that you potentially use?

  
  
Posted one year ago

The debug samples? or the artifacts/models?

Both.

Yes, change the Task's output destination in the UI (or programmatically)

This has no effect. I am not able to change the files_server, e.g. I cannot change from None to None.
If my files_server is None, it will always look there no matter what I set as the output destination.

  
  
Posted one year ago

No. I set api.file_server to None in both the remote agent clearml.conf and my local clearml.conf.
In that case, whether the code is run locally or remotely, metrics are stored in the cloud storage.

  
  
Posted one year ago

@<1576381444509405184:profile|ManiacalLizard2> Yeah, that makes sense. However, my problem is that I do not want to set it on the remote clearml-agent, since every user may have a different storage, e.g. one user pushes to Azure while another pushes to S3.

  
  
Posted one year ago

@<1576381444509405184:profile|ManiacalLizard2> Thank you, but afaik this only works locally and not if you run your task on a clearml-agent!

  
  
Posted one year ago

@<1523701868901961728:profile|ReassuredTiger98>
Manually set both:
None
None
To where you want your files to be uploaded
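
(The two links above did not survive the export; assuming they refer to the Task's output destination and the Logger's upload destination, here is a minimal sketch of setting both in code. The URIs are placeholders, and the exact property/method names are worth verifying against your ClearML SDK version.)

from clearml import Task

task = Task.init(project_name="examples", task_name="per-task destinations")

# Artifacts and models: set the Task's output destination
task.output_uri = "azure://<account>/<container>/artifacts"

# Debug samples (and anything else the Logger uploads): set the Logger destination
task.get_logger().set_default_upload_destination("azure://<account>/<container>/debug")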

  
  
Posted one year ago

Or maybe a different question: What is not

Artifacts and Models. debug samples (or anything else the Logger class creates)

?

Also, is it not possible to use multiple file servers? E.g. log tasks to different S3 buckets without changing clearml.conf?

  
  
Posted one year ago

If you are using multiple storage places, I don't see any other choice than putting multiple credentials in the conf file ... Free or Paid ClearML Server ...

  
  
Posted one year ago

I think in the paid version there is this configuration vault, so that the user can pass their own credentials securely to the agent.

  
  
Posted one year ago

@<1576381444509405184:profile|ManiacalLizard2> Yes, exactly. I just didn't know how, but now it is all working 🙂
And yes, I have multiple credentials in the clearml.conf of the agents. It's not a good solution, but since I am currently limited to the free version of ClearML, it is the best I could do.
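
(For anyone reading along, a rough sketch of what multiple storage credentials can look like in an agent's clearml.conf; the account names, keys and buckets are placeholders, and the exact key names should be double-checked against the clearml.conf reference.)

sdk {
    aws {
        s3 {
            credentials: [
                { bucket: "team-a-bucket", key: "<aws-access-key>", secret: "<aws-secret-key>" }
            ]
        }
    }
    azure.storage {
        containers: [
            { account_name: "<azure-account>", account_key: "<azure-key>", container_name: "team-b-container" }
        ]
    }
}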

  
  
Posted one year ago

oh ..... did not know about that ...

  
  
Posted one year ago

Thanks a lot, now I think I understand.

Debug samples can only be controlled via api.file_server (or programmatically)

Could you guide me on how to approach this programmatically? Can I implement my own storage adapter for debug samples with the ClearML interfaces, or am I on my own?

  
  
Posted one year ago

@<1523701868901961728:profile|ReassuredTiger98> I found that you can set the file_server in your local clearml.conf to your own cloud storage. In our case, we use something like this in our clearml.conf:

api {
   file_server: "azure://<account>..../container"
}

All non-artifact/model uploads are then stored in our Azure storage. In our self-hosted ClearML setup, we don't even have a file server running at all.

  
  
Posted one year ago

@<1523701205467926528:profile|AgitatedDove14> Thank you very much for your guidance. Setting these manually works for me!

  
  
Posted one year ago

Right, in which case you want to change it dynamically in your code, not via the config file. This is where Logger.set_default_output_upload comes in.
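
(A small sketch of the dynamic, in-code approach; note that in the SDK versions I am aware of the Logger method is called set_default_upload_destination rather than set_default_output_upload, so please verify the exact name against the ClearML docs. The environment variable and URIs below are made up for illustration.)

import os
from clearml import Task

task = Task.init(project_name="examples", task_name="per-user debug destination")

# pick the destination at runtime, e.g. from an environment variable set per user
debug_uri = os.environ.get("MY_DEBUG_URI", "azure://<account>/<container>/debug")
task.get_logger().set_default_upload_destination(debug_uri)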

  
  
Posted one year ago

Makes sense, but does this mean that we are not able to tell clearml-agent where to save on a per-task basis?

The debug samples? or the artifacts/models?

Also, is it not possible to use multiple file servers? E.g. log tasks to different S3 buckets without changing clearml.conf?

Yes, change the Task's output destination in the UI (or programmatically)

  
  
Posted one year ago

@<1576381444509405184:profile|ManiacalLizard2> I'll check again 🙂 thanks

  
  
Posted one year ago

Debug samples can only be controlled via api.file_server (or programmatically)
Model/Artifacts: see above

This has no effect. I am not able to change the files_server, e.g. I cannot change from

You are not changing the files_server, just where your Task uploads Models/Artifacts; these are two different things (and again, this only applies to Artifacts/Models).

  
  
Posted one year ago

Hi @<1523701868901961728:profile|ReassuredTiger98>
The sdk.development.default_output_uri is used for Artifacts and Models; debug samples (or anything else the Logger class creates) will use the api.file_server.
On the Task itself, you have the "output destination" (in the Execution tab), which would override the "output_uri" on a Task level.
Does that make sense ?
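
(To illustrate the Task-level override, a minimal sketch of setting the output destination when the task is created; the URI is a placeholder, and as far as I understand this is the same field that appears as "output destination" in the Execution tab.)

from clearml import Task

# artifacts and models from this task go to the given destination,
# regardless of sdk.development.default_output_uri in clearml.conf
task = Task.init(
    project_name="examples",
    task_name="task-level output destination",
    output_uri="s3://<my-bucket>/clearml-output",
)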

  
  
Posted one year ago

but afaik this only works locally and not if you run your task on a clearml-agent!

Isn't the agent using the same clearml.conf?
We have our agent running tasks and uploading everything to the cloud. As I said, we don't even have a file server running.

  
  
Posted one year ago

@<1576381444509405184:profile|ManiacalLizard2> Maybe you are using the enterprise version with the vault? I suppose the enterprise version runs differently, but I don't have experience with it.
For the open-source version, each clearml-agent is using its own clearml.conf

  
  
Posted one year ago

None

  
  
Posted one year ago