Answered
Hello, A Question About Pipelines. I Have A Repository With One Pipeline Using Decorators, Defined In

Hello, a question about pipelines. I have a repository with one pipeline using decorators, defined in pipeline.py. It uses multiple components that import code from other parts of the same repository. When I execute the pipeline remotely in Kubernetes, those components don’t clone the actual repository; only the controller task does. I have to specify repo information in PipelineDecorator.component manually for the repository to be cloned. Is this expected?
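For context, a minimal sketch of the setup described above. All file, project, and function names are hypothetical, and the try/except fallback exists only so the sketch can be read and run without clearml installed:

```python
# Hypothetical repository layout for the setup described in the question:
#
#   repo/
#   ├── mylib/
#   │   └── utils.py      # code imported by the components
#   └── pipeline.py       # the decorated pipeline sketched below
#
try:
    from clearml.automation.controller import PipelineDecorator
except ImportError:
    # No-op stand-in so the sketch runs without clearml installed.
    class PipelineDecorator:
        component = pipeline = staticmethod(lambda **kwargs: (lambda fn: fn))


@PipelineDecorator.component()
def load_data():
    # In the real repo this step would do: from mylib.utils import ...
    return [1, 2, 3]


@PipelineDecorator.pipeline(name="demo", project="demo", version="0.1")
def pipeline_logic():
    return load_data()
```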

  
  
Posted 6 months ago

Answers 9


When I add repo="." to the definition of all my component decorators (but not the pipeline decorator) it works; it doesn’t work without that. The problem I’m having now is that my components hang when executed in the cluster. I have 2 agents deployed (default and services queues).
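A minimal sketch of the workaround described above (step names, project, and queues are hypothetical; the fallback shim only lets the sketch run standalone without clearml): passing repo="." to each component decorator tells the agent to clone the repository the local checkout belongs to.

```python
try:
    from clearml.automation.controller import PipelineDecorator
except ImportError:
    # No-op stand-in so the sketch runs without clearml installed.
    class PipelineDecorator:
        component = pipeline = staticmethod(lambda **kwargs: (lambda fn: fn))


# repo="." on every component decorator, per the workaround above
@PipelineDecorator.component(repo=".")
def preprocess(values):
    return [v * 2 for v in values]


@PipelineDecorator.component(repo=".")
def train(values):
    return sum(values)


# The pipeline decorator itself is left without repo=
@PipelineDecorator.pipeline(name="demo", project="demo", version="0.1")
def pipeline_logic():
    return train(preprocess([1, 2, 3]))
```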

  
  
Posted 6 months ago

Glad to hear that, indeed an odd issue... is this reproducible, i.e. can we get something to reproduce it so we can fix it?

  
  
Posted 6 months ago

Apologies @<1798887585121046528:profile|WobblyFrog79>, somehow I missed your reply.

My workflow is based around executing code that lives in the same repository, so it’s cumbersome having to specify repository information all over the place, and changing commit hash as I add new code.

It automatically infers the repo as long as the pipeline code itself (by that I mean the pipeline logic) sits inside the repo when you run it the first time (think development, etc.). If it sits inside the repo, it should auto-detect it. If for some reason it does not, try passing repo="." to indicate you have a local checkout of the repo in the same folder, and from there it will auto-detect the repo details. Does that make sense?

  
  
Posted 6 months ago

Hi @<1798887585121046528:profile|WobblyFrog79>

When I execute the pipeline remotely in Kubernetes, those components...

Two things. One, make sure you specify the repo you need the components from in the decorator function; what will happen is the repo will be cloned into the container running on k8s, and then your script (i.e. the pipeline step) will run from inside the repo root.
Second, if you need any of the other functions in the pipeline file, list them under helper_functions.
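A hedged sketch of the two points above. The repo URL and branch are placeholders, and the fallback shim only lets the sketch run standalone without clearml; `repo` and `helper_functions` are the PipelineDecorator.component parameters referred to in this answer:

```python
try:
    from clearml.automation.controller import PipelineDecorator
except ImportError:
    # No-op stand-in so the sketch runs without clearml installed.
    class PipelineDecorator:
        component = staticmethod(lambda **kwargs: (lambda fn: fn))


def scale(values, factor):
    # Helper defined in the pipeline file; without listing it under
    # helper_functions it would not exist in the remote step's script.
    return [v * factor for v in values]


@PipelineDecorator.component(
    repo="https://example.com/org/repo.git",  # placeholder URL: cloned into the k8s container
    repo_branch="main",                       # placeholder branch
    helper_functions=[scale],                 # point two: list helpers the step needs
)
def step(values):
    return scale(values, 3)
```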

  
  
Posted 6 months ago

I think so, but I haven’t investigated what the problem is exactly. I’ll report it, though.

  
  
Posted 6 months ago

The components start hanging indefinitely right after printing "Starting Task Execution".

  
  
Posted 6 months ago

no worries @<1523701205467926528:profile|AgitatedDove14>

  
  
Posted 6 months ago

Huh, I see. Thanks for your answers. How difficult would it be to implement some way of automatically inferring repository information for components, or a flag like repo_inherit (or similar) when defining a component, which would inherit repository information from the controller? My workflow is based around executing code that lives in the same repository, so it’s cumbersome having to specify repository information all over the place and change the commit hash as I add new code.

  
  
Posted 6 months ago

@<1523701205467926528:profile|AgitatedDove14> I managed to fix the issue FYI. I replaced from clearml import PipelineDecorator with from clearml.automation.controller import PipelineDecorator and it suddenly works. What a weird issue.

  
  
Posted 6 months ago