Hi @<1523701295830011904:profile|CluelessFlamingo93>, I think you need this module to be part of the repository, otherwise how will the pipeline know what code to use?
@<1523701087100473344:profile|SuccessfulKoala55> and @<1523701070390366208:profile|CostlyOstrich36>, in the end I found the problem: it was because I was running the pipeline locally, and a local run doesn't copy the whole directory, only the script that is being run.
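For reference, a minimal sketch of the decorator-based pipeline setup being discussed, assuming ClearML's PipelineDecorator API; the step and function names are illustrative. On a local debug run only the launching script is packaged into the step, which is why a sibling train.py cannot be imported there:
```python
# Minimal sketch, assuming ClearML's PipelineDecorator API; names are illustrative.
from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["metrics"])
def training_step(epochs: int):
    # On a local run only the launching script is copied into the step's
    # environment, so this sibling-module import raises ModuleNotFoundError.
    import train
    return train.run(epochs=epochs)  # hypothetical entry point in train.py

@PipelineDecorator.pipeline(name="hand_validator_boxes", project="examples", version="1.0")
def pipeline_logic(epochs: int = 10):
    return training_step(epochs=epochs)

if __name__ == "__main__":
    # PipelineDecorator.run_locally()  # uncomment for a local debug run
    pipeline_logic(epochs=10)
```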
@<1523701295830011904:profile|CluelessFlamingo93> is the working dir set to the correct path (where the train.py file is)?
@<1523701070390366208:profile|CostlyOstrich36> my repo looks like this, and both files are located in the same dir, so it's weird that it cannot find train:
.
├── pytorch
├── tensorflow
│   ├── Project A
│   │   └── src
│   ├── Project B
│   │   ├── data
│   │   ├── model
│   │   ├── reports
│   │   └── utils
│   └── hand_validator_boxes
│       ├── src
│       ├── train.py (the module I need)
│       └── clearml_pipeline.py (where the pipeline is initialized)
└── utils
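A hedged workaround sketch for this layout: have clearml_pipeline.py put its own directory on sys.path before importing the sibling module, so the import resolves even when the process starts from the repo root (the path comment reflects the tree above):
```python
# Sketch only: make "import train" independent of the current working directory.
import sys
from pathlib import Path

# .../tensorflow/hand_validator_boxes — the directory holding both files
sys.path.insert(0, str(Path(__file__).resolve().parent))

import train  # sibling module next to clearml_pipeline.py
```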
Hi @<1523701070390366208:profile|CostlyOstrich36>, it is part of the repository. Do pipelines run differently than normal tasks? What I mean is: when I run a task it has a working directory; do pipelines also have one, or is their working directory the root of the repo?
@<1523701295830011904:profile|CluelessFlamingo93>, I'm not sure what you mean. Whenever you run pipeline code (a pipeline from decorators), if it's from a repository, that repo will be logged. Where are you importing "train" from? What if you import the entire package and point to the specific module?
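A sketch of that last suggestion, with the caveat that these names just mirror the tree posted above and assume the directories are importable packages (i.e. contain __init__.py files):
```python
# Hedged sketch of the full-package import; assumes __init__.py files exist.
# Caveat: a top-level folder literally named "tensorflow" would shadow the
# real TensorFlow package on import, so a rename may be needed in practice.
from tensorflow.hand_validator_boxes import train

train.run()  # hypothetical entry point inside train.py
```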
@<1523701087100473344:profile|SuccessfulKoala55> yes, the working dir is set to the correct path, and yet it cannot import the train module.