Check here:
https://github.com/allegroai/trains/blob/master/docs/trains.conf#L78
You can configure credentials based on the bucket name. Should work for Azure as well
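As a rough illustration of what per-bucket credentials look like in trains.conf (the exact field names should be checked against the linked file; the values below are placeholders):

```
sdk {
    aws {
        s3 {
            # credentials applied per bucket; the first matching bucket entry wins
            credentials: [
                {
                    bucket: "my-first-bucket"
                    key: "access-key-for-first-bucket"
                    secret: "secret-for-first-bucket"
                },
                {
                    bucket: "my-second-bucket"
                    key: "access-key-for-second-bucket"
                    secret: "secret-for-second-bucket"
                }
            ]
        }
    }
}
```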
I just installed trains[azure], since all my data is on Azure. I don't know about StorageManager.
I see that _AzureBlobServiceStorageDriver needs to be updated. Anything else?
trains[azure] gives you the ability to do the following:
from trains import StorageManager
my_local_cached_file = StorageManager.get_local_copy('azure://bucket/folder/file.bin')
This means you do not have to manually download files and maintain the local cache yourself; the StorageManager will do that for you.
If you do not need that ability, there is no need to install trains[azure];
you can just install trains
Unfortunately, we haven't had the time to upgrade to the Azure storage v2 package (they changed the entire interface), which is why the requirements collide with your installed packages. This is still on the to-do list 🙂
The azure section:
https://github.com/allegroai/trains/blob/master/docs/trains.conf#L117
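For reference, a sketch of what multiple Azure storage accounts/containers might look like in that section (field names based on the linked sample file; account names and keys are placeholders):

```
sdk {
    azure.storage {
        # one entry per storage account/container combination
        containers: [
            {
                account_name: "my-first-account"
                account_key: "key-for-first-account"
                container_name: "container-a"
            },
            {
                account_name: "my-second-account"
                account_key: "key-for-second-account"
                container_name: "container-b"
            }
        ]
    }
}
```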
Also, each task might need its own configuration. Data is usually stored in multiple containers, so rather than a single global configuration, there should be a way to configure credentials per task.
There already seems to be support for multiple containers in the code.
Is there an example to configure multiple storage accounts?
Exactly 🙂
If you feel like PR-ing a fix, it will be greatly appreciated 🙂
LazyLeopard18 are you using the StorageManager to access azure:// links?