I Recently Upgraded My Clearml Version And Here Are Some Questions That I Have:

I recently upgraded my clearml version and here are some questions that I have:

  • Is there an autoscaler available for Azure?
  • I'm interested in the clearml-serving functionalities; would that be suitable for real-time inference on arm64 devices?
  
  
Posted 7 months ago

Answers 7


Ah, okay, that makes sense. I'll look more into the difference between Pro and Enterprise. Thanks for the info!

  
  
Posted 7 months ago

Then it's the community server, which is not an Enterprise version. In the Pro version, only AWS/GCP autoscalers are available.

  
  
Posted 7 months ago

I mean I'm hosting it myself, it's on app.clear.ml

  
  
Posted 7 months ago

What is the address of your server?

  
  
Posted 7 months ago

Also, I see that clearml-serving supports PyTorch; is there any chance of support for TensorRT?

  
  
Posted 7 months ago

Thanks for the reply! I am using the enterprise version; do you have a link to some docs for the autoscaler? On the Orchestration tab I can see AWS and GCP, but not Azure. (Also, I was previously able to see ClearML GPUs, but it looks like they're not available anymore?)

  
  
Posted 7 months ago

Hi @<1644147961996775424:profile|HurtStarfish47> , to answer your questions:

  • Is there an autoscaler available for Azure?
    I'm afraid not in the self-hosted version. An Azure autoscaler is available only in the Scale/Enterprise licenses.
  • I'm interested in the clearml-serving functionalities; would that be suitable for real-time inference on arm64 devices?
    Yes 🙂
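For the serving question, a minimal sketch of getting a clearml-serving endpoint running (the service name, project, endpoint, and model are placeholders, and this assumes the standard clearml-serving CLI against an already-configured ClearML server):

```shell
# Install the serving CLI (placeholder versions/names throughout)
pip install clearml-serving

# Create a serving service controller; this prints a service ID to reuse below
clearml-serving create --name "serving example"

# Register a model endpoint on that service; engine, endpoint, and project
# here are example values, not the only supported options
clearml-serving --id <service_id> model add \
    --engine sklearn \
    --endpoint "my_model" \
    --name "my model" \
    --project "serving examples"
```

The serving containers themselves are standard Docker images, which is what makes arm64 deployment feasible; check the image tags available for your target architecture.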
  
  
Posted 7 months ago