Answered

Hey, what is the recommended approach to speed up the spin-up of a Task in a GCP autoscaled instance? It takes 20 minutes to build the venv environment needed by the clearml-agent to run it; would providing a VM image with preinstalled pip packages on it help?

  
  
Posted 2 years ago

Answers 13


It takes 20mins to build the venv environment needed by the clearml-agent

You are joking?! 😭
It does apt-get install python3-pip and pip install clearml-agent, how is that 20 minutes?
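For context, that spin-up step essentially boils down to commands like these (a rough sketch of the two steps mentioned above; the actual startup script generated by the autoscaler may differ):

# hypothetical startup steps on the fresh GCP instance
apt-get update
apt-get install -y python3-pip
python3 -m pip install clearml-agent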

  
  
Posted 2 years ago

It takes about 30 seconds here for that step

  
  
Posted 2 years ago

Hi FierceHamster54

Could you provide an example and we will try to reproduce it?

  
  
Posted 2 years ago

Well, I think most of the time is taken by the venv setup installing the packages defined in the imports of the pipeline component, which is normal, and some of those packages have a wheel that takes a long time to build. But most of those packages were already included in the Docker image I provided, and I get this message in my logs:

:: Python virtual environment cache is disabled. To accelerate spin-up time set agent.venvs_cache.path=~/.clearml/venvs-cache ::
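For reference, enabling the cache the way this log line suggests would look roughly like the following in the agent's clearml.conf (a minimal sketch based only on the message above; the exact section layout may differ in your configuration):

agent {
    venvs_cache {
        # cache fully built virtual environments so identical package sets are reused
        path: ~/.clearml/venvs-cache
    }
}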

  
  
Posted 2 years ago

Most of the time is taken by building wheels for numpy and pandas, which are apparently deps of clearml-agent if I read the log correctly

  
  
Posted 2 years ago

Does it take the same amount of time if you build it locally?

  
  
Posted 2 years ago

or in another environment?

  
  
Posted 2 years ago

Well, we're having a network incident at HQ so this doesn't help... but I'll keep you updated with the tests I run tomorrow

  
  
Posted 2 years ago

Sure
I hope that network incident is not too big 🙂

  
  
Posted 2 years ago

FierceHamster54 are you saying that inside the container it took 20 min to run, or that spinning up the GCP instance until it registered as an agent took 20 min?

Most of the time is taken by building wheels for numpy and pandas ...

BTW: this happens if there is a version mismatch and pip decides it needs to build numpy from source. Can you send the full logs of that? Maybe we can somehow avoid it?
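As an aside, a generic way to see whether pip is falling back to a source build (not something from this thread, just a standard pip flag) is to require wheels only; the install then fails instead of silently compiling:

# run inside the same container/Python version the agent uses
python3 -m pip install --only-binary=:all: numpy pandas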

  
  
Posted 2 years ago

AgitatedDove14 Here you go, I think it's inside the container since it's after the worker pulls the image

  
  
Posted 2 years ago

I think it's inside the container since it's after the worker pulls the image

Oh, that makes more sense. I mean it should not build from source, but that makes sense.
To avoid the build from source:
Add the following line to the "Additional ClearML Configuration" section:
agent.package_manager.pip_version: "<21"
You can also turn on venv caching:
Add the following line to the "Additional ClearML Configuration" section:
agent.venvs_cache.path: ~/.clearml/venvs-cache
I will make sure we bump the minimum pip version by default so this does not happen
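Putting both suggestions together, the "Additional ClearML Configuration" box of the autoscaler would contain something like the following (a sketch combining the two lines above; adjust the pip pin if your packages need a newer pip):

# prefer prebuilt wheels instead of building numpy/pandas from source
agent.package_manager.pip_version: "<21"
# reuse cached virtual environments between runs to cut spin-up time
agent.venvs_cache.path: ~/.clearml/venvs-cache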

  
  
Posted 2 years ago

Thanks for your responsiveness 🎉

  
  
Posted 2 years ago