Is setting up the ClearML server on the same machine that is supposed to be our GPU worker a good idea?
I wanted to have every dataset stored locally using the clearml-data CLI. How would the GPU worker handle data fetching? Would it go straight from disk?
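
For reference, this is roughly the workflow I have in mind. As far as I understand, the clearml-data CLI maps onto the Dataset SDK, so the sketch below uses the Python calls directly; the project name, dataset name and local path are just placeholders.

from clearml import Dataset

# Register a local folder as a dataset version (names and path are placeholders)
ds = Dataset.create(dataset_name="my_dataset", dataset_project="my_project")
ds.add_files(path="/data/my_dataset")  # index the files sitting on the local disk
ds.upload()                            # push them to the configured files server / storage
ds.finalize()                          # lock the version so workers can fetch it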

Posted 3 years ago

Answers 3


So ideally I should set up one machine with storage for the datasets running clearml-server, and one GPU machine running clearml-agent with enough storage to fetch the data?

Posted 3 years ago

Hi ObnoxiousStork61,
In general, I think setting up the server on the GPU machine running the experiment is not a good idea - the server is supposed to run in a stable environment, whereas the GPU machine is more dynamic in nature.
Regarding data, since the data is stored in network storage, and assuming that storage is local (i.e. inside your own network), I don't think data fetching will be an issue.
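
For example, a task running on the agent would fetch the data roughly like below (project and dataset names are placeholders). The first call downloads from the files server / network storage, and get_local_copy() keeps a cached copy on the agent's local disk, so repeated runs read straight from disk:

from clearml import Dataset

# Fetch the dataset that was registered with clearml-data (placeholder names)
ds = Dataset.get(dataset_project="my_project", dataset_name="my_dataset")
local_path = ds.get_local_copy()  # cached, read-only copy on the worker's local disk
print(local_path)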

Posted 3 years ago

Yes, one machine for the server and storage, and a GPU machine with local storage for data fetching and caching :)

Posted 3 years ago