Hi, I am using PipelineDecorator to create tasks. Is there a way to force it to use the entire git repo it is created from on the PYTHONPATH? vs. just the decorated function and perhaps the helper_functions=[some_function]?

Hi, I am using PipelineDecorator to create tasks.
Is there a way to force it to use the entire git repo it is created from on the PYTHONPATH?
vs. just the decorated function and perhaps the helper_functions=[some_function]?

  
  
Posted one year ago

Answers 8


sure CostlyOstrich36
I have something like the following:

@PipelineDecorator.component(....)
def my_task(...):
    from my_module1 import my_func1
    from my_module2 import ....

my_module1 and my_module2 are modules that are part of the same project source; they don’t come as a separate package.

Now when I run this in clearml, these imports don’t work.

These functions may require transitive imports of course, so the following doesn’t work:
PipelineDecorator.component(helper_functions=[my_func1])
Even when I add the repo:

@PipelineDecorator.component(repo="..")

the imports are not recognized - they are not on the PYTHONPATH of the task that the agent starts.
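To make it concrete, a minimal sketch of the failing setup (the repo URL and module names are placeholders for my real ones):

from clearml import PipelineDecorator

@PipelineDecorator.component(
    repo="https://github.com/me/my_repo.git",  # placeholder URL
    repo_branch="main",  # placeholder branch
)
def my_task(data_path):
    # these modules live next to this function in the repo source, not on pip;
    # this is where the remote run fails to resolve them
    from my_module1 import my_func1
    from my_module2 import my_func2
    return my_func1(my_func2(data_path))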

My current workaround is to run it in Docker and add an init script that does something like this:

export PYTHONPATH=${PYTHONPATH}:/root/.clearml/venvs-builds/task_repository/my_repo.git/src
exec "$@"
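(Presumably the same export could be attached per component via the docker_bash_setup_script argument instead of baking it into the image - a sketch, untested:)

from clearml import PipelineDecorator

SETUP = "export PYTHONPATH=${PYTHONPATH}:/root/.clearml/venvs-builds/task_repository/my_repo.git/src"

@PipelineDecorator.component(
    docker="python:3.9",  # placeholder base image
    docker_bash_setup_script=SETUP,  # runs inside the container before the task starts
)
def my_task(x):
    from my_module1 import my_func1  # resolvable once src/ is on PYTHONPATH
    return my_func1(x)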

  
  
Posted one year ago

@PipelineDecorator.component(repo="..")

The imports are not recognized - they are not on the PYTHONPATH of the task that the agent starts.

RoughTiger69 add the imports inside the function itself; you can also specify them on the component:
@PipelineDecorator.component(..., packages=["package", "package==1.2.3"])

or

@PipelineDecorator.component(...)
def my_task(...):
    import pandas as pd  # noqa
    ...
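For completeness, the packages route end to end would look roughly like this (the pandas pin is just an example):

from clearml import PipelineDecorator

@PipelineDecorator.component(packages=["pandas==2.0.3"])
def load_table(path):
    import pandas as pd  # installed on the agent from `packages`
    return pd.read_csv(path)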

  
  
Posted one year ago

they are just neighboring modules to the function I am importing.

So I think that if you specify the repo, then on the remote machine you will end up with the code of the component sitting at the root folder of the repo; from there I assume you can import the rest - the root git path should be part of your PYTHONPATH automatically.
wdyt?

  
  
Posted one year ago

AgitatedDove14 the emphasis is that the imports I am doing are not from external/pip packages; they are just neighbouring modules to the function I am importing. Imports that rely on pip-installed packages work well.

  
  
Posted one year ago

RoughTiger69 , can you please elaborate a bit on your use case?

  
  
Posted one year ago

AgitatedDove14

the root git path should be part of your PYTHONPATH automatically

That’s true, but it doesn’t respect the root package (sources root or whatever),
i.e. if all my packages are under /path/to/git/root/src/.
So I had to add it explicitly via a docker init script…
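i.e. the layout is roughly like this (file names are placeholders):

my_repo.git/
    src/
        my_module1.py
        my_module2.py

so the modules are importable relative to src/, not relative to the repo root.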

  
  
Posted one year ago

So I had to add it explicitly via a docker init script

Oh yes, that makes sense; can't think of a better hack other than:

sys.path.append(os.path.join(os.path.dirname(__file__), "src"))
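i.e. something like this at the top of the component (a sketch; the repo URL and module name are placeholders):

from clearml import PipelineDecorator

@PipelineDecorator.component(repo="https://github.com/me/my_repo.git")
def my_task(x):
    import os
    import sys
    # make <repo_root>/src importable on the agent before the local imports
    sys.path.append(os.path.join(os.path.dirname(__file__), "src"))
    from my_module1 import my_func1
    return my_func1(x)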

  
  
Posted one year ago

and of course this solution forces me to do a git push for all the other dependent modules when creating the task…

  
  
Posted one year ago