Hi SteadyFox10
I promised to mention it here once we started working on the Ignite integration; you can check it out here:
https://github.com/jkhenning/ignite/tree/trains-integration
Feel free to provide insights / requests 🙂
As for the model upload: the default behavior is that torch.save() calls are only logged, nothing more. But if you pass the output_uri field to Task.init, all your models will be uploaded automatically. For example:
task = Task.init('examples', 'model upload test', output_uri='s3://bucket/stuff/')
will cause any model stored locally to be uploaded (in the background) to sub-folders (project/experiment) under bucket/stuff on your S3 account.
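To make that concrete, here is a minimal sketch (the bucket path and the tiny model are placeholders; the point is that a plain torch.save() call is enough once output_uri is set):

from trains import Task
import torch

# with output_uri set, checkpoints written by torch.save() are uploaded
# in the background to <output_uri>/<project>/<experiment> sub-folders
task = Task.init('examples', 'model upload test', output_uri='s3://bucket/stuff/')

model = torch.nn.Linear(4, 2)
torch.save(model.state_dict(), 'model.pt')  # logged and uploaded automatically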
The really cool thing is that even if your code does not include the output_uri argument and you are running your experiment with trains-agent, you can still set it: in the Web UI, under the "Execution" tab, look for the "Output : Destination" field. Anything you write there behaves as if you had added it to the Task.init output_uri. So all you have to do is write there, for example, "http://trains-server-ip:8081/", and all the models will be uploaded to the trains-server (obviously you can also write s3://, gs://, or azure://). Then, under the experiment's models in the artifacts, you'll see the link to the model itself. Also notice that you can get back the models when you continue training from the previous model weights, all from code; see the sketch below, and here is a full example: https://github.com/allegroai/trains/issues/50#issuecomment-607541112
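Roughly, the resuming side looks like this (a hedged sketch: Task.get_task is how you locate the previous run, but the exact call for pulling its stored weights back is the one shown in the linked issue, not reproduced here):

from trains import Task

# new run, storing its own models to the same destination
task = Task.init('examples', 'continue training', output_uri='s3://bucket/stuff/')

# locate the previous experiment; its uploaded model links can then be fetched
# and used as the starting weights (see the linked issue for the exact calls)
prev_task = Task.get_task(project_name='examples', task_name='model upload test')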
Hi SteadyFox10 , the TrainsLogger PR was just merged (see https://github.com/pytorch/ignite/pull/1020 )
Wow, that's really nice. I wrote almost the same code: TrainsLogger, TrainsSaver, and all the OutputHandler classes. I'll use your version instead and comment if I find something.
Hi SteadyFox10
"I'll use your version instead and comment if I find something."
Feel free to join the discussion 🙂 https://github.com/pytorch/ignite/issues/892
Thanks for the output_uri , can I put it in the ~/trains.conf file?
Sure you can 🙂
https://github.com/allegroai/trains/blob/master/docs/trains.conf#L152
You can add it in the trains-agent machine's conf file, and/or on your development machine. Notice that once you run an experiment with "default_output_uri" set (or with output_uri in Task.init), the Web UI will reflect the value actually used in "Output : Destination", so you have better visibility.
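For reference, this is roughly what that looks like in ~/trains.conf (assuming the key sits under the sdk.development section, as in the linked file; the bucket path is just a placeholder):

sdk {
    development {
        # used whenever a Task does not pass output_uri explicitly
        default_output_uri: "s3://bucket/stuff/"
    }
}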
Pretty good job. I'll try to use it as soon as possible.