WackyRabbit7
Cool - so that means the fileserver which comes with the host will stay empty? Or is there anything else being stored there?
Debug images and artifacts will be automatically stored on the file server.
If you want your models to be automagically uploaded as well, add the following: task = Task.init('example', 'experiment', output_uri='<your files server URL>')
(You can obviously point it to any other http/S3/GS/Azure storage)
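For instance, pointing it at other backends would look roughly like this (a sketch; the bucket/container names are placeholders and the exact Azure URI format is worth double-checking against the trains docs):

from trains import Task

# Placeholder destinations - swap in your own bucket/container
task = Task.init('example', 'experiment', output_uri='s3://my-bucket/models')    # S3
# task = Task.init('example', 'experiment', output_uri='gs://my-bucket/models')  # Google Cloud Storage
# task = Task.init('example', 'experiment', output_uri='azure://myaccount.blob.core.windows.net/models')  # Azure Blob (format may vary)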
The fileserver will store the debug samples (if you have any).
You'll have a cache there too.
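For reference, a debug sample that ends up on the fileserver would be reported roughly like this (a sketch; 'sample.png' is a hypothetical local file):

from trains import Task

task = Task.init(project_name='example', task_name='experiment')
logger = task.get_logger()
# Debug images reported through the logger are uploaded to the fileserver by default
logger.report_image(title='debug', series='sample', iteration=0, local_path='sample.png')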
Hi WackyRabbit7
When calling Task.init(), you can provide the output_uri parameter. This allows you to specify the location in which model snapshots will be stored.
Allegro-Trains supports shared folders, S3 buckets, Google Cloud Storage and Azure Storage.
For example (with S3):
Task.init(project_name="My project", task_name="S3 storage", output_uri="s3://bucket/folder")
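Once output_uri is set, a regular framework save should be picked up and uploaded automatically. A rough sketch assuming PyTorch and trains' automatic framework binding (the model and file name are placeholders):

from trains import Task
import torch

task = Task.init(project_name="My project", task_name="S3 storage", output_uri="s3://bucket/folder")
model = torch.nn.Linear(4, 2)
# trains hooks torch.save, so this snapshot is uploaded to s3://bucket/folder
torch.save(model.state_dict(), "model.pt")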
You will need to add storage credentials in the ~/trains.conf file (you will need to add your AWS credentials in this part: https://github.com/allegroai/trains/blob/master/docs/trains.conf#L69).
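That part of ~/trains.conf looks roughly like this (a sketch with placeholder credentials; see the linked file for the exact layout):

sdk {
    aws {
        s3 {
            # Default credentials used by the SDK for S3 access
            key: "MY_ACCESS_KEY"
            secret: "MY_SECRET_KEY"
            region: "us-east-1"
        }
    }
}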