Answered

Hi, I'm using ClearML over AWS and I can see that my ec2 instance disk space is filling up really fast.
I first started with 100GB disk space and once it filled up I increased the size to 150GB which is soon to be filled up as well.
I wanted to check if that is something to expect or is there anything to do about it?

  
  
Posted 2 years ago

Answers 16


DeliciousStarfish67, are you running your ClearML server on the AWS instance?

  
  
Posted 2 years ago

You can always delete the data. Each folder in /opt/clearml/data/fileserver/ represents the stored outputs of an experiment. If you no longer need the files, you can delete them.
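For example, a quick way to see which experiment folders take the most space and remove one you no longer need (the /opt/clearml path assumes the default docker-compose deployment; the `<project>/<task-id>` folder name is a placeholder, not a real path):

```shell
# List per-experiment output folders, largest last
# (redirect errors in case the path differs on your machine)
du -sh /opt/clearml/data/fileserver/* 2>/dev/null | sort -h

# Delete the stored outputs of one experiment you no longer need
# ("<project>/<task-id>" is a hypothetical placeholder)
rm -rf "/opt/clearml/data/fileserver/<project>/<task-id>"
```

Deleting a folder here removes only the uploaded files (debug images, artifacts, models); it does not remove the experiment entry itself from the ClearML UI.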

  
  
Posted 2 years ago

Can you connect directly to the instance? If so, please check how large /opt/clearml is on the machine and then look at how the space is distributed between its subfolders.
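For reference, one way to get that breakdown is with `du` (the /opt/clearml path assumes the default docker-compose deployment; run with sudo if permissions require it):

```shell
# Total size of the ClearML data directory
du -sh /opt/clearml 2>/dev/null

# Per-subfolder breakdown, two levels deep, largest last
du -h --max-depth=2 /opt/clearml/data 2>/dev/null | sort -h
```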

  
  
Posted 2 years ago

Where is most of the data concentrated?

  
  
Posted 2 years ago

It is 112GB

  
  
Posted 2 years ago

Thank you guys!

  
  
Posted 2 years ago

I see. I'm guessing you have pretty extensive usage in the form of artifacts/debug samples. You can lower the storage usage by deleting some experiments/models through the UI. That should free up some space 🙂

  
  
Posted 2 years ago

So you're saying it's expected, and if I can't delete this data the only option is to keep increasing the volume size?

  
  
Posted 2 years ago

Especially /opt/clearml/data/fileserver, which is taking 102GB

  
  
Posted 2 years ago

Thank you guys.
SuccessfulKoala55 is there any way to configure clearml server to save debug images and artifacts to s3?

  
  
Posted 2 years ago

Certainly - you do that directly on the clients (SDK, agents)

  
  
Posted 2 years ago

Yes

  
  
Posted 2 years ago

/opt/clearml/data is taking 112GB

  
  
Posted 2 years ago

Here:
https://clear.ml/docs/latest/docs/configs/clearml_conf#agent-section
What you're looking for is this:
sdk.development.default_output_uri

Also configure your api.files_server in ~/clearml.conf to point to your S3 bucket as well 🙂
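Putting both settings together, a minimal ~/clearml.conf fragment might look like this (the bucket name "my-clearml-bucket" is a hypothetical placeholder):

```
api {
    # New uploads from the clients go to S3 instead of the server's fileserver
    files_server: "s3://my-clearml-bucket"
}
sdk {
    development {
        # Default destination for models/artifacts uploaded by the SDK
        default_output_uri: "s3://my-clearml-bucket"
    }
}
```

Note this only affects new uploads from the clients; data already stored under /opt/clearml/data/fileserver stays where it is. S3 credentials for the clients typically go in the aws section of the same file.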

  
  
Posted 2 years ago

DeliciousStarfish67 the math is simple - if you want the experiments' outputs (in this case specifically, the debug images, uploaded artifacts and models), they simply take up storage space (as png/jpg images and whatever files you uploaded as artifacts or models). If you only want the metrics for each experiment, those are stored in a different location and so will not be affected if you delete the fileserver data.

  
  
Posted 2 years ago

Any docs you can direct me to?

  
  
Posted 2 years ago