Answered
How Can I Ensure Tasks In A Pipeline Have The Same Environment As The Pipeline Itself? It Seems A Bit Counter-Intuitive That The Pipeline (Executed Remotely) Captures The Local Environment, But The Tasks (Executed Remotely) Do Not Use That Same Environment?

How can I ensure tasks in a pipeline have the same environment as the pipeline itself? It seems a bit counter-intuitive that the pipeline (executed remotely) captures the local environment, but the tasks (executed remotely) do not use that same environment?

  
  
Posted one year ago

Answers 42


Hey @<1523701205467926528:profile|AgitatedDove14> , thanks for the reply!

We would like to avoid dockerizing all our repositories. For the time being we have not used the decorators, though we could; the pipeline is instead built dynamically at the moment.

The issue is that the components do not have their dependencies. For example:

def step_one(...):
    from internal.repo import private
    # do stuff

When step_one is added as a component to the pipeline, it does not include internal.repo as a package dependency, so it crashes.
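
For reference, a rough sketch of the manual workaround when building the pipeline dynamically (the package spec and git URL below are placeholders for our internal repo, not the actual names):

from clearml import PipelineController

pipe = PipelineController(name="example", project="examples", version="0.0.1")

# Hypothetical explicit workaround: list the internal package ourselves,
# since auto-detection misses it for the step's remote environment.
pipe.add_function_step(
    name="step_one",
    function=step_one,
    packages=["internal-repo @ git+https://example.com/org/internal-repo.git"],
)
pipe.start(queue="default")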

  
  
Posted one year ago

Hi @<1523701083040387072:profile|UnevenDolphin73>

How can I ensure tasks in a pipeline have the same environment as the pipeline itself?
...
but the tasks (executed remotely) do not use that same environment?

Just verifying, we are talking about pipeline decorators?

We also wanted this, we preferred to create a docker image with all we need, and let the pipeline steps use that docker image

You can specify the docker image on the decorator itself.
Regarding capturing the packages: if you import them inside the decorated function, they will be captured based on what is installed in the local (i.e. initial) environment. The idea is that the components are not the same as the logic; the logic of the pipeline should not have any real package requirements, only the components (the parts actually doing something) should. What am I missing?
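
As a rough sketch, assuming the decorator's docker argument (the image name and the do_stuff call are illustrative placeholders):

from clearml import PipelineDecorator

# sketch: pin the step to a docker image that already contains the
# internal dependencies (image name is an example)
@PipelineDecorator.component(return_values=["result"], docker="internal-base:latest")
def step_one():
    from internal.repo import private  # imported inside, so auto-detection sees it
    return private.do_stuff()  # do_stuff is a placeholder

@PipelineDecorator.pipeline(name="example", project="examples", version="0.0.1")
def pipeline_logic():
    # the pipeline logic itself carries no real package requirements
    result = step_one()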

  
  
Posted one year ago

Still waiting on this; anyone? 🥹 @<1523701070390366208:profile|CostlyOstrich36> @<1523701205467926528:profile|AgitatedDove14>

  
  
Posted one year ago

Pinging about this again, still unresolved 🤔

ClearML does not capture our internal libraries and so our functions (pipeline steps) crash with missing modules.

  
  
Posted one year ago

We’d be happy if ClearML captured that (since it uses e.g. pip, we would then have the git URL + commit hash for reproducibility), as it claims it would 😅

Any thoughts, CostlyOstrich36?

  
  
Posted one year ago

Not sure about this; we really like being in control of reproducibility and not depending on the invoking machine… maybe that’s not what you intend?

  
  
Posted one year ago

It’s just that for the packages argument, ClearML says:

If not provided, packages are automatically added based on the imports used inside the wrapped function.

So… 🤔
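
What we expected to be able to do, then, is roughly the following sketch, overriding the auto-detection explicitly (the git URL is a placeholder for our internal repo):

from clearml import PipelineDecorator

# sketch: pass packages explicitly instead of relying on auto-detection
# (the git URL is a placeholder for the internal repo)
@PipelineDecorator.component(
    return_values=["result"],
    packages=["git+https://example.com/org/internal-repo.git"],
)
def step_one():
    from internal.repo import private
    # do stuff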

  
  
Posted one year ago

PricklyRaven28 That would be my fallback; it would make development much slower (having to rebuild containers with every small change).

  
  
Posted one year ago

We also wanted this, we preferred to create a docker image with all we need, and let the pipeline steps use that docker image

That way you don’t rely on ClearML capturing the local env, and you can control what exists in the env.

  
  
Posted one year ago

We have an internal mono-repo and some of its packages are required. They’re all available correctly for the controller, and only some are required for the individual tasks, but the “magic” doesn’t happen 😞
That is, the controller does not identify them as requirements, so they’re not installed in the tasks’ environment.
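
The closest thing to a manual workaround we can think of is forcing the requirement onto each task before it is created, roughly like this sketch (package name and version are placeholders, assuming Task.add_requirements behaves as documented):

from clearml import Task

# sketch: force the internal package into the recorded requirements
# before the task is created (name and version are placeholders)
Task.add_requirements("internal-repo", package_version=">=1.0")
task = Task.init(project_name="examples", task_name="step_one")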

  
  
Posted one year ago

Well, the individual tasks do not seem to have the expected environment.

  
  
Posted one year ago

Hi UnevenDolphin73, when you say “pipeline itself”, do you mean the controller? The controller is only in charge of handling the components. Let’s say you have a pipeline with many parts: if you have a global environment, it will force a lot of redundant installations throughout the pipeline. What is your use case?

  
  
Posted one year ago