Answered
Hello,
We have an existing EKS cluster, so I'm wondering if I should deploy ClearML on the EKS cluster or on EC2 using the AMI. Is there an advantage of one over the other? We have a pipeline that needs to run once a month to train a model. Is there a scheduler option we can configure to enqueue the pipeline once a month? (It looks like the Pro plan has task scheduling and pipeline triggers. If I run the self-hosted version, is there some way to configure this?) I see ClearML now has an AWS Autoscaler, but it's only available in the Pro plan? So if I self-host the ClearML server, it won't work? Thank you

  
  
Posted 2 years ago

Answers 6


Thank you, John and Jake. As I understand it, the deployment only deploys the ClearML server and not the agents, so I'm a bit unclear on why k8s is more scalable. Does the ClearML server itself need to scale up and down? Or do you mean a k8s deployment will have an easier time spinning up agent instances to run the tasks? SuccessfulKoala55

  
  
Posted 2 years ago

GrittyCormorant73 , I agree with CostlyOstrich36 . Additionally, k8s is also more scalable looking forward, if you ever want to move the different ClearML server components (like ES/MongoDB etc.) to their own clusters - which is certainly not needed at first.

  
  
Posted 2 years ago

Hi PunyWoodpecker71 ,

Regarding your questions:
We have an existing EKS cluster. So I'm wondering if I should deploy ClearML on the EKS cluster, or deploy on EC2 using AMI. Is there an advantage of one over the other?
I think it's a matter of personal preference. Maybe SuccessfulKoala55 can add some information.
We have a pipeline that needs to run once a month to train a model. Is there a scheduler option we can configure to enqueue the pipeline once a month? (It looks like the Pro plan has task scheduling and pipeline triggers. If I run the self-hosted version, is there some way to configure this?)
In the examples folder of the clearml repository you have an example for a scheduler:
https://github.com/allegroai/clearml/tree/master/examples/scheduler
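To make the "once a month" part concrete, here is a minimal sketch of the scheduling logic in plain Python. The ClearML-specific calls at the bottom are commented out and are my reading of the linked examples/scheduler folder - treat the exact class and parameter names as assumptions and verify them against the repo:

```python
from datetime import datetime

def next_monthly_run(now: datetime, day: int = 1, hour: int = 3) -> datetime:
    """Next occurrence of `day`-of-month at `hour`:00, strictly after `now`."""
    candidate = now.replace(day=day, hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= now:
        # this month's slot already passed: roll into next month (and year if Dec)
        year = now.year + (1 if now.month == 12 else 0)
        month = now.month % 12 + 1
        candidate = candidate.replace(year=year, month=month)
    return candidate

# With the ClearML SDK the same schedule is declarative (names assumed from
# the examples/scheduler folder -- verify before relying on them):
#   from clearml.automation import TaskScheduler
#   scheduler = TaskScheduler()
#   scheduler.add_task(schedule_task_id='<pipeline-controller-task-id>',
#                      queue='default', day=1, hour=3, minute=0)  # monthly
#   scheduler.start_remotely(queue='services')
```

The scheduler itself runs as a long-lived service task, which is why the open-source docs suggest enqueuing it on a dedicated `services` queue.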
I see ClearML now has an AWS Autoscaler, but it's only available in the Pro plan? So if I self-host the ClearML server, it won't work?
In the PRO plan the Autoscaler has a UI. In the open source version you also have an example for that:
https://github.com/allegroai/clearml/tree/master/examples/services/aws-autoscaler
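In the open-source route the autoscaler is a script you run yourself. The invocation below is an assumption based on the example's location in the repo - check the script's own help output for the actual flags:

```shell
# Assumed invocation of the open-source autoscaler example -- verify the
# flag names with `python aws_autoscaler.py --help` after cloning.
git clone https://github.com/allegroai/clearml.git
cd clearml/examples/services/aws-autoscaler
pip install clearml boto3
python aws_autoscaler.py --run   # wizard prompts for AWS credentials, instance types, queues
```

Like the scheduler, it can then be enqueued as a service task so it keeps running on the server side.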

  
  
Posted 2 years ago

GrittyCormorant73 , a k8s deployment will have an easier time spinning up agent instances to run the tasks 🙂

  
  
Posted 2 years ago

SuccessfulKoala55 Thank you, I forgot that I can also install the ClearML agent with a helm chart. But after I run helm install first-agent allegroai/clearml-agent, it doesn't ask me for any config. Does it know to connect to the ClearML server in the same cluster? Do you know of any tutorial I can look at for setting up the ClearML server and agent on k8s? Thank you

  
  
Posted 2 years ago

PunyWoodpecker71 ,
We have an existing EKS cluster. So I'm wondering if I should deploy ClearML on the EKS cluster, or deploy on EC2 using AMI. Is there an advantage of one over the other?
This is really a matter of preference. You have AMI, docker-compose and helm chart deployments available for ClearML. While the first two are basically the same (the AMI is just preinstalled, and the installation itself is not very complicated), the helm chart (k8s) is obviously much more scalable - however, it does require k8s knowledge and proficiency. I personally would start from the AMI/docker-compose and move to the helm chart/k8s over time (moving the actual data is fairly easy in the long run).
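For reference, the helm route looks roughly like this. The repo URL and chart names are assumptions based on the allegroai helm charts - verify them with `helm search repo allegroai`, and check the chart's values.yaml for the real configuration keys:

```shell
# Assumed repo/chart names -- verify with `helm search repo allegroai`.
helm repo add allegroai https://allegroai.github.io/clearml-helm-charts
helm repo update
helm install clearml allegroai/clearml            # server: API/web/file servers + DBs
helm install first-agent allegroai/clearml-agent \
  --set clearml.agentk8sglueKey=<ACCESS_KEY> \
  --set clearml.agentk8sglueSecret=<SECRET_KEY>   # hypothetical value keys; see values.yaml
```

Run `helm show values allegroai/clearml-agent` to see which credentials and server URLs the agent chart actually expects before installing it.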

  
  
Posted 2 years ago
1K Views
6 Answers