Is There An Option To Separate The Storage From The Server? E.g. Deploying My Trains Server On Some Light Machine, And Configuring The Storage To Be AWS S3 Or Something Similar


  
  
Posted 3 years ago

Answers 4


The fileserver will store the debug samples (if you have any).

You'll have cache too.

  
  
Posted 3 years ago

Hi WackyRabbit7

When calling Task.init(), you can provide the output_uri parameter. This allows you to specify the location in which model snapshots will be stored.
Allegro-Trains supports shared folders, S3 buckets, Google Cloud Storage and Azure Storage.

For example (with S3):

from trains import Task

task = Task.init(project_name="My project", task_name="S3 storage", output_uri="s3://bucket/folder")
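The same call works for the other supported backends; only the output_uri value changes. As a rough illustration (the paths, bucket and account names below are placeholders, not taken from this thread):

# Shared folder mounted on the training machine
output_uri = "/mnt/shared/folder"
# Google Cloud Storage bucket
output_uri = "gs://bucket/folder"
# Azure Blob Storage container
output_uri = "azure://<account>.blob.core.windows.net/container/folder"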
You will need to add your storage credentials to the ~/trains.conf file (for S3, add your AWS credentials in this part: https://github.com/allegroai/trains/blob/master/docs/trains.conf#L69 ).
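For reference, the S3 credentials live in the sdk.aws.s3 section of that file; a rough sketch (the field names follow the sample trains.conf linked above, the values are placeholders):

sdk {
    aws {
        s3 {
            # Default credentials, used for any bucket without a specific entry
            key: "MY_ACCESS_KEY"
            secret: "MY_SECRET_KEY"
            region: "us-east-1"
        }
    }
}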

  
  
Posted 3 years ago

Cool - so that means the fileserver which comes with the host will stay empty? Or is there anything else being stored there?

  
  
Posted 3 years ago

WackyRabbit7

Cool - so that means the fileserver which comes with the host will stay empty? Or is there anything else being stored there?

Debug images and artifacts will be automatically stored on the file server.
If you want your models to be automagically uploaded as well, add the following:
task = Task.init('example', 'experiment', output_uri=' ')
(You can obviously point it to any other http/S3/GS/Azure storage.)
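A concrete sketch of that call, assuming a default self-hosted trains-server deployment where the fileserver listens on port 8081 (replace the URL with your own fileserver address or bucket):

from trains import Task

# Upload model snapshots to the trains fileserver
# (http://localhost:8081 is the default fileserver port of a local
#  trains-server deployment - replace with your own address or bucket)
task = Task.init(project_name='example', task_name='experiment',
                 output_uri='http://localhost:8081')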

  
  
Posted 3 years ago