A Question About Local Cache: Storing a Pretty Large Dataset (Over 20 Million Files and 3 TB of Data)

Hello guys,

I have a question about the local cache.
Right now I'm trying to cache a pretty large dataset (over 20 million files and 3 TB of data).
I use the dataset.get_local_copy() method and found out an interesting thing:

When the downloaded data gets close to 1 TB, the download fails because ClearML tries to open zip archives that were downloaded some time ago, but those archives have already been deleted.
Additionally, it somehow deletes all the other cached data.

So, I'm curious: can I configure the file limit and the size limit of my cache?
In the documentation I saw StorageManager.set_cache_file_limit, but I'm not sure whether that is the right way to do it.

Thanks for your help!
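
For context, a minimal sketch of the call being discussed; the dataset project and name below are placeholders, not taken from this thread:

    from clearml import Dataset

    # Fetch a registered dataset and materialize it through the local cache.
    # get_local_copy() downloads the dataset archives into the ClearML cache
    # folder (~/.clearml/cache by default) and returns the local path.
    dataset = Dataset.get(
        dataset_project="my_project",       # placeholder
        dataset_name="my_large_dataset",    # placeholder
    )
    local_path = dataset.get_local_copy()
    print(local_path)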

  
  
Posted one year ago

Answers 8


So, if I understand it right, I should uncomment the max_used_bytes = -1 line?
Or put some big number there?

  
  
Posted one year ago

Wait, my config looks a bit different. What clearml package version are you using?

  
  
Posted one year ago

In the documentation I only see these parameters,

and there are no other options.
Can you guide me to the other cache parameters?
@<1537605940121964544:profile|EnthusiasticShrimp49>
[image attachment]

  
  
Posted one year ago

    storage {
        cache {
            # Defaults to system temp folder / cache
            default_base_dir: "~/.clearml/cache"
            # default_cache_manager_size: 100
        }
    }

I see.

So, if I change the default_cache_manager_size variable to some utterly big number (like 1 trillion files), everything should be fine?

And does it have a default value?

  
  
Posted one year ago

It was the config generated by the clearml init command.

I generated another one with the clearml-agent init command.

This is how it looks:

    storage {
        cache {
            # Defaults to system temp folder / cache
            default_base_dir: "~/.clearml/cache"
            size {
                # max_used_bytes = -1
                min_free_bytes = 10GB
                # cleanup_margin_percent = 5%
            }
        }
    }

Btw:

$ pip freeze | grep clearml                                                                               
clearml==1.11.0
clearml-agent==1.5.2
  
  
Posted one year ago

Thanks for pointing this out, we will need to update our documentation. Still, if you manually inspect the ~/clearml.conf file, you will see the available configurations.

  
  
Posted one year ago

Hey @<1577468626967990272:profile|PerplexedDolphin99>, yes, this method call will help you limit the number of files you have in your cache, but not the total size of your cache. To be able to control the size, I'd recommend checking the ~/clearml.conf file in the sdk.storage.cache section.
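
For reference, a minimal sketch of the call mentioned above; the limit value is only an illustrative number, not something suggested in this thread:

    from clearml import StorageManager

    # Cap the number of files kept in the local ClearML cache.
    # This only limits the file count; the total cache size is governed by
    # the sdk.storage.cache settings in ~/clearml.conf.
    StorageManager.set_cache_file_limit(500000)  # example value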

  
  
Posted one year ago

Yes, that is correct. Btw, now it looks more like my clearml.conf.

  
  
Posted one year ago