Answered
Is There An Auto Scaling Solution For GCP Users?

Is there an auto scaling solution for GCP users?

Posted 4 years ago

Answers 6


Hi WackyRabbit7

Currently we don’t have a GCP autoscaler. We’re more than happy to get contributions for GCP and other platforms.

The AWS autoscaler code is pretty generic, and to use it for GCP you would need to implement a GCPAutoScaler similar to trains.automation.aws_auto_scaler.AWSAutoScaler, which basically has spin_up_worker() and spin_down_worker() methods...
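To make that concrete, here is a rough, hypothetical sketch of what such a GCPAutoScaler could look like on top of the google-cloud-compute client. The class name, constructor arguments, resource fields and startup script below are illustrative assumptions, not existing trains/ClearML code; a real implementation would subclass trains.automation.auto_scaler.AutoScaler the same way AWSAutoScaler does.

    # Hypothetical sketch only: shows the spin_up_worker()/spin_down_worker() shape
    # mentioned above, implemented against Google Compute Engine.
    from google.cloud import compute_v1


    class GCPAutoScaler:
        def __init__(self, project, zone="us-central1-a"):
            self.project = project
            self.zone = zone
            self._instances = compute_v1.InstancesClient()

        def spin_up_worker(self, resource, worker_id_prefix, queue_name):
            """Launch a GCE instance that runs a trains-agent listening on `queue_name`."""
            # GCE instance names must be lowercase with hyphens
            name = "{}-{}".format(worker_id_prefix, queue_name).lower().replace("_", "-")
            instance = compute_v1.Instance(
                name=name,
                machine_type="zones/{}/machineTypes/{}".format(
                    self.zone, resource["machine_type"]
                ),
                disks=[
                    compute_v1.AttachedDisk(
                        boot=True,
                        auto_delete=True,
                        initialize_params=compute_v1.AttachedDiskInitializeParams(
                            source_image=resource["source_image"],
                        ),
                    )
                ],
                network_interfaces=[
                    compute_v1.NetworkInterface(network="global/networks/default")
                ],
                metadata=compute_v1.Metadata(
                    items=[
                        compute_v1.Items(
                            key="startup-script",
                            # install trains-agent on boot and attach it to the queue
                            value="#!/bin/bash\npip install trains-agent\n"
                                  "trains-agent daemon --queue {}".format(queue_name),
                        )
                    ]
                ),
            )
            self._instances.insert(
                project=self.project, zone=self.zone, instance_resource=instance
            )
            return name

        def spin_down_worker(self, instance_id):
            """Terminate the GCE instance backing a worker."""
            self._instances.delete(
                project=self.project, zone=self.zone, instance=instance_id
            )

The idea is simply that spin_up_worker() boots an instance whose startup script attaches an agent to the given queue, and spin_down_worker() deletes that instance again.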

Posted 4 years ago

This is a nice issue to open in https://github.com/allegroai/trains :)

Posted 4 years ago

Ah ok! I think the link between ClearML agent queues and the autoscaler is also important for resource monitoring etc. Sorry, I'm quite new to ClearML, so I'm still trying to understand the architecture 🙈. Although AWS Batch has its own queue system too, which it uses to allocate jobs to training instances, it's not customizable.

Posted 3 years ago

Hi BrightElephant64, can you add an example? Also, the ClearML AWS autoscaler knows how to work with ClearML-agent queues.
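For context, the queue-to-resource link looks roughly like this in the AWS autoscaler's configuration; the field names below follow the aws_autoscaler example, but treat the exact schema as an illustration rather than documentation.

    # Illustrative only: each ClearML-agent queue is served by one or more named
    # resource configurations, each capped at a maximum number of concurrent instances.
    configurations = {
        "resource_configurations": {
            "gpu_machine": {
                "instance_type": "g4dn.xlarge",
                "availability_zone": "us-east-1b",
                "ami_id": "ami-0123456789abcdef0",  # placeholder AMI
                "is_spot": False,
            },
        },
        "queues": {
            # tasks pushed to "gpu_queue" can spin up to 2 "gpu_machine" instances
            "gpu_queue": [("gpu_machine", 2)],
        },
    }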

Posted 3 years ago

Hi TimelyPenguin76! Quick question - I was curious whether at any point ClearML had considered using AWS Batch for the autoscaling part? Submitting "training jobs" to AWS Batch would create/terminate EC2 instances automatically too, instead of ClearML writing its own logic to spin instances up/down.

Posted 3 years ago

I'd be lying if I said I had time for that 🙂

Posted 4 years ago