Answered
Hey guys, do you have any plans to add functionality to export the training config with all hyperparameters to different formats, such as a command-line training command, YAML, etc.?


  
  
Posted 4 years ago

11 Answers


DilapidatedDucks58 if you have so many parameters, why don't you use
task.connect_configuration(dict)
It will put it in the artifacts as an editable JSON-like string.
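A minimal sketch of that approach, assuming the trains SDK (the project/task names and config values here are placeholders):

from trains import Task

task = Task.init(project_name='examples', task_name='config-export')

# a large hyperparameter dict (placeholder values)
config = {'learning_rate': 0.001, 'batch_size': 64, 'epochs': 100}

# connect_configuration stores the dict with the task as an editable configuration;
# the returned dict reflects any values edited in the UI before a remote run
config = task.connect_configuration(config)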

  
  
Posted 4 years ago

You can always access the entire experiment data from Python:
Task.get_task(task_id).data
It should all be there.
What's the exact use case you had in mind?
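For example, a minimal sketch assuming the trains SDK (the experiment ID is a placeholder):

from trains import Task

# fetch an existing experiment by its ID
task = Task.get_task(task_id='<experiment_id>')

# .data holds the raw experiment object as stored by the server
print(task.data)

# get_parameters() returns the hyperparameters as a flat dict
print(task.get_parameters())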

  
  
Posted 4 years ago

yeah, I am aware of trains-agent, we are planning to start using it soon, but still, copying the original training command would be useful

  
  
Posted 4 years ago

It's dead simple to install:
pip install trains-agent
then you can simply do:
trains-agent execute --id myexperimentid

  
  
Posted 4 years ago

Hmmm, that actually connects with something we were thinking about: introducing sections to the hyperparameters. This way we could easily differentiate between the command-line arguments and other types of parameters. DilapidatedDucks58 what do you think?

  
  
Posted 4 years ago

The idea is that it is not necessary: using the trains-agent you can not only launch the experiment on a remote machine, you can also override the parameters, not just cmd-line arguments but any dictionary you connected with the Task, or its configuration...
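A minimal sketch of that flow from the Python side, assuming the trains SDK (the experiment ID, queue name, and parameter key are placeholders):

from trains import Task

# clone an existing experiment to get an editable copy
template = Task.get_task(task_id='<experiment_id>')
cloned = Task.clone(source_task=template, name='clone with overrides')

# override any connected parameter, not just argparse-style cmd-line arguments
cloned.set_parameters({'learning_rate': 0.01})

# enqueue the clone for a trains-agent worker to pick up and run
Task.enqueue(cloned, queue_name='default')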

  
  
Posted 4 years ago

BTW copying the cmd line assumes that you are running it on the same machine...

  
  
Posted 4 years ago

not necessarily, the command usually stays the same irrespective of the machine

  
  
Posted 4 years ago

copy-pasting the entire training command into the command line 😃

  
  
Posted 4 years ago

this would definitely be a nice addition. The number of hyperparameters in our models often goes up to 100

  
  
Posted 4 years ago

Is this the case?

  
  
Posted 4 years ago