I really like how you made all this decoupled!! 🎉
In summary (rough command sketch below):
Spin down the local server
Back up the data folder
In the cloud, extract the data backup
Spin up the cloud server
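In case it's useful, here is roughly what those steps look like with the default docker-compose deployment; the paths and the `cloud-host` name are assumptions, so adjust them to your install:

```bash
# Rough sketch, assuming a default install under /opt/clearml (adjust paths to your setup)

# 1. Spin down the local server
sudo docker-compose -f /opt/clearml/docker-compose.yml down

# 2. Back up the data folder (Mongo, Elastic, Redis and fileserver data all live under it by default)
sudo tar czf ~/clearml_data_backup.tgz -C /opt/clearml/data .

# 3. Copy the archive to the cloud machine ("cloud-host" is a placeholder) ...
scp ~/clearml_data_backup.tgz cloud-host:~/
# ... then, on the cloud machine, extract it into the same path
sudo mkdir -p /opt/clearml/data
sudo tar xzf ~/clearml_data_backup.tgz -C /opt/clearml/data

# 4. Spin up the cloud server
sudo docker-compose -f /opt/clearml/docker-compose.yml up -d
```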
Never mind, all the database files are in the data folder
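For reference, this is roughly what the data folder looks like in a default deployment (subfolder names can vary a bit between versions), so one archive of it covers both databases:

```bash
# Typical layout of /opt/clearml/data in a default deployment (names vary slightly by version)
ls /opt/clearml/data
# elastic_7/  fileserver/  mongo/  redis/
```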
Are you talking about this: None
It doesn't seem to do anything about the database data ...
Follow the backup procedure; it is basically the same process
Oh, I was assuming you were passing the entire DB backups to the cloud.
Yes, that is what I want to do.
So I need to migrate both the MongoDB database and the Elasticsearch database from my local Docker instance to the equivalent in the cloud?
Are you saying you just want the file server in the cloud? If that's the case, I would just use S3
But when I spin up a new server in the cloud, that server will have its own MongoDB, and that will be empty, no?
Basically the links to the file server are saved in both Mongo and Elastic, so as long as these are hostname-based (host:port rather than IP:port), at least in theory it should work
I am more curious about how to migrate all the information stored in the local ClearML server to the ClearML server in the cloud
I understand that, from the agent's point of view, I just need to update the conf file to use the new credentials and the new server address.
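For reference, a minimal sketch of that agent-side change; `clearml.example.com` and the keys below are placeholders for the new cloud server and credentials, not real values:

```bash
# Minimal sketch of the agent-side change; only the "api" section of ~/clearml.conf is shown,
# and the hostname/keys are placeholders for your new cloud server and credentials.
# (This writes a fresh example file; if your existing clearml.conf also has agent/sdk sections,
#  edit the api section in place instead.)
cat > ~/clearml.conf <<'EOF'
api {
    web_server: http://clearml.example.com:8080
    api_server: http://clearml.example.com:8008
    files_server: http://clearml.example.com:8081
    credentials {
        "access_key" = "NEW_ACCESS_KEY"
        "secret_key" = "NEW_SECRET_KEY"
    }
}
EOF
```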
So if I spin up a new ClearML server in the cloud and use the same file server mount point, will I see all the tasks and experiments that I had on the on-prem server in the cloud server?
What about migrating existing experiments on the on-prem server?
Hi @ManiacalLizard2
If you make sure all server access is via a host name (i.e., instead of IP:port, use host_address:port), you should be able to replace it with the cloud host on the same port
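To illustrate with a hypothetical hostname: if everything was registered under a name like `clearml.example.com`, repointing that name from the on-prem IP to the cloud IP should be enough, and the links already stored in Mongo/Elastic keep resolving:

```bash
# Illustration only; clearml.example.com and the IPs are made-up values.
# Stored artifact/debug-sample links stay valid as long as the name keeps resolving,
# regardless of which machine sits behind it.
#
# Before migration:  clearml.example.com -> 192.168.1.10   (on-prem)
# After migration:   clearml.example.com -> 203.0.113.25   (cloud, via DNS or /etc/hosts)

getent hosts clearml.example.com                        # what does the name resolve to right now?
curl -sI http://clearml.example.com:8081 | head -n 1    # fileserver should answer on the same port as before
```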