Answered
Hi, I'm Trying To Use

Hi, I'm trying to use clearml-agent in Docker mode and I want to set some environment variables that include sensitive information, such as passwords, when running clearml-task.
I added the keys containing the sensitive info to hide_docker_command_env_vars, and it seems to work from what I can see in the Console view in the ClearML app.
But the problem is that the credentials are still shown in plain text under Execution -> Container -> Arguments; they aren't masked out as they are in the Console view.
Can you please advise on how to fix this?
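
For illustration, this is roughly the kind of invocation being described; a minimal sketch, assuming the secrets are passed as docker environment variables via --docker_args (the project, script, image, and variable names below are hypothetical placeholders):

    clearml-task \
        --project my-project --name my-task \
        --script train.py \
        --docker my-image:latest \
        --docker_args "-e DB_PASSWORD=super-secret -e API_TOKEN=abc123"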

  
  
Posted one year ago

Answers 19


Hey SoggyBeetle95,
You're right, that's an error on our part 🙂
Could you please open an issue in https://github.com/allegroai/clearml-server/issues so we can track it?
We'll update there once a fix for that issue is released! 😄

  
  
Posted one year ago

Ohh, if this is the case then it kind of makes sense to store it on the Task itself. Which means the Task object will have to store it, and then the UI will display it :(
I think the actual solution is a per-user vault, which would allow users to keep their credentials on the server, and the agent would pass those to the Task when it spins it up, based on the user. Unfortunately the vault feature is only available on the paid/enterprise version (with RBAC etc.).
Does that make sense?

  
  
Posted one year ago

SoggyBeetle95 you can configure the credentials in the clearml.conf running on the agent machines:
https://github.com/allegroai/clearml-agent/blob/a5a797ec5e5e3e90b115213c0411a516cab60e83/docs/clearml.conf#L320
(I'm assuming these are storage credentials)
If you need general purpose env variables, you can add them here:
https://github.com/allegroai/clearml-agent/blob/a5a797ec5e5e3e90b115213c0411a516cab60e83/docs/clearml.conf#L149
with ["-e", "MY_VAR=MY_VALUE"]

  
  
Posted one year ago

Hi SoggyBeetle95,
I reproduced the issue. Could you confirm that this is what you're seeing?
Here is what I did:
I declared some secret env vars in the agent section of clearml.conf and used extra_keys to have them hidden in the console. They are indeed hidden there, but in the Execution -> Container section they appear in clear text.
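
For reference, a sketch of the agent-side configuration used for a reproduction like this (the variable name and value are placeholders); extra_keys lists the additional env var names that should be masked in the console output:

    agent {
        # pass the secret value into the container
        extra_docker_arguments: ["-e", "MY_SECRET=super-secret"]

        # mask it in the printed docker command / console log
        hide_docker_command_env_vars {
            enabled: true
            extra_keys: ["MY_SECRET"]
        }
    }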

  
  
Posted one year ago

We would prefer it to be a per-Task secret so team members can use their own creds and passwords

  
  
Posted one year ago

Hi David, I will open an issue. Do you have an estimate of when this could be fixed?

  
  
Posted one year ago

Alrighty...let me check

  
  
Posted one year ago

SoggyBeetle95 the question is where does ClearML store these arguments, and the answer is: on the Task object (from there the agent will take them and apply them to the docker execution). Now, since all users see all the Tasks, they also see these arguments. Wdyt?

  
  
Posted one year ago

SoggyBeetle95 is this secret a per-Task secret, or is it for the agent itself (i.e. for all Tasks the agent will spin up)?

  
  
Posted one year ago

Will these general env variables show in the ClearML app?

  
  
Posted one year ago

Yes, I just define the env variables when using clearml-task instead of defining them in the conf file

  
  
Posted one year ago

Okay, I think I get what you're saying. How can I set up a clearml-agent with access-all creds with custom entries?

  
  
Posted one year ago

I think they should not 🙂

  
  
Posted one year ago

This is the server version:
WebApp: 1.5.0-186 • Server: 1.5.0-186 • API: 2.18
This is the version of clearml-task:
ClearML launch - launch any codebase on remote machine running clearml-agent Version 1.1.6

  
  
Posted one year ago

(two screenshots attached)

  
  
Posted one year ago

SoggyBeetle95 maybe it makes sense to configure the agent with access-all credentials? Wdyt?

  
  
Posted one year ago

AgitatedDove14 Maybe I'm missing something, but clearml-task has a --docker_args argument which enables passing any env variable to the docker container that will be running the code.

My idea is that each team member adds their creds to --docker_args, which would pass them to the docker container so the creds would be available to the task.

It seems to me that the only problem with passing credentials via --docker_args is that they are currently shown in plain form in the app.

Am I missing something obvious?

  
  
Posted one year ago

Hmm, that indeed looks weird 😄 Let me reproduce. Just making sure, what version of the server / SDK are you using?

  
  
Posted one year ago