You can disable the auto-update feature if you'd like to keep your own custom docker-compose.yml file.
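For what it's worth, here is a hedged sketch of how you might locate and switch off such an update job on the AMI; the timer/cron names below are assumptions, not the actual Trains mechanism, so check the server docs for the exact option:

```bash
# Hypothetical sketch -- the job names below are assumptions, not the real Trains AMI setup.
# Look for a scheduled job that refreshes docker-compose.yml:
sudo systemctl list-timers --all            # any update-related systemd timer?
sudo crontab -l && sudo ls /etc/cron.d/     # or a cron entry

# If it turns out to be a systemd timer (the name here is made up):
sudo systemctl disable --now trains-update.timer
# If it is a cron entry, comment it out via: sudo crontab -e
```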
Increased it to 20, let's see how long it lasts 🙂
I get this
```
[ec2-user@ip-10-0-0-95 ~]$ docker-compose down
WARNING: The TRAINS_HOST_IP variable is not set. Defaulting to a blank string.
WARNING: The TRAINS_AGENT_GIT_USER variable is not set. Defaulting to a blank string.
WARNING: The TRAINS_AGENT_GIT_PASS variable is not set. Defaulting to a blank string.
ERROR: Couldn't connect to Docker daemon at http+docker://localhost - is it running?
If it's at a non-standard location, specify the URL with the DOCKER_HOST environment variable.
```
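That ERROR line usually means the Docker daemon itself is down or your user can't reach its socket, not a compose problem. A quick way to check on the EC2 box (standard systemd/Docker commands, nothing Trains-specific):

```bash
# Is the daemon running at all?
sudo systemctl status docker

# If not, start it (and check the journal if it immediately dies, e.g. from a full disk):
sudo systemctl start docker
sudo journalctl -u docker --no-pager | tail -50

# If the daemon is fine but the socket is permission-denied, add your user to the docker
# group and re-login (or just run docker-compose with sudo):
sudo usermod -aG docker ec2-user
```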
Depends on the state of your hard drive
When spinning up the AMI I just went with the Trains recommended settings
SuccessfulKoala55 AppetizingMouse58
```
[ec2-user@ip-10-0-0-95 ~]$ df -h
Filesystem      Size  Used Avail Use% Mounted on
devtmpfs        3.9G     0  3.9G   0% /dev
tmpfs           3.9G     0  3.9G   0% /dev/shm
tmpfs           3.9G  880K  3.9G   1% /run
tmpfs           3.9G     0  3.9G   0% /sys/fs/cgroup
/dev/nvme0n1p1  8.0G  6.5G  1.5G  82% /
tmpfs           790M     0  790M   0% /run/user/1000
```
Hi Elior, chances are that you do not have enough space for Elasticsearch on your storage. Please check the ES logs and increase the available disk space.
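A hedged way to pull those logs, assuming the default trains-server docker-compose service/container names (adjust the name if `docker ps -a` shows something different):

```bash
# Find the Elasticsearch container (the name may differ on your install):
docker ps -a | grep -i elastic

# Tail its logs, either directly or through compose:
docker logs --tail 200 trains-elastic          # container name is an assumption
docker-compose logs --tail=200 elasticsearch   # run from the directory holding docker-compose.yml
```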
And depends on what takes the most space
I know, but we really provide the bare minimum since people usually want to try it out and I assume most are price-conscious... I guess we can explain that in the documentation 🙂
but I can't seem to run docker-compose down
It would be useful to create a disk-usage tree of the /opt/trains folder, just so you'll get a feel for what takes the most space (uploaded files, experiment statistics, etc.)
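For example, a depth-limited du gives a quick tree-like view (assuming the data lives under /opt/trains as on the default install):

```bash
# Per-folder usage, two levels deep, largest first:
sudo du -h --max-depth=2 /opt/trains 2>/dev/null | sort -rh | head -n 25
```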
Well, you can inspect the ES logs to find out why ES still locks up when there's 1.5GB free even though the watermark is set to 0.5GB. However, 8GB of storage for the machine is really the absolute minimum, so I suggest increasing it. The current price on AWS is $0.08 per GB, so personally I think 50GB is a very reasonable number.
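If it helps, the watermark values ES is actually using can be read back from ES itself; this assumes ES is reachable on localhost:9200, as it is with the default trains-server compose file:

```bash
# Show the disk watermark settings currently in effect (defaults plus any overrides):
curl -s "localhost:9200/_cluster/settings?include_defaults=true&flat_settings=true&pretty" \
  | grep -i watermark
```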
This error just keeps coming back... I already set the watermarks to around 0.5GB
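One thing worth ruling out, depending on the ES version bundled with your server: if ES ever hit the flood-stage watermark, it marks indices read-only and they stay that way even after space is freed, which looks exactly like the same error coming back. Clearing the block is a standard ES call (again assuming ES on localhost:9200):

```bash
# Remove the read-only block that flood-stage protection leaves behind:
curl -X PUT "localhost:9200/_all/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index.blocks.read_only_allow_delete": null}'
```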
what should I paste here to diagnose it?
Well, you can use a Linux command that lists the largest folders/files and see what's taking up the most disk space
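For example (any of the usual du/sort variants will do; /opt/trains is the default data location):

```bash
# 20 largest files/folders under the Trains data directory:
sudo du -ah /opt/trains 2>/dev/null | sort -rh | head -n 20
```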