@<1523701295830011904:profile|CluelessFlamingo93> I'm not sure I understand what you're trying to do - in order to run a pipeline, you need to define it somehow. You can build a pipeline that imports your existing modules so that each defined step calls one of your functions - is that what you're trying to do?
Hi @<1523701295830011904:profile|CluelessFlamingo93> , you can also build pipelines from functions or from decorators
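For example, a minimal sketch of the decorator approach (the project/step names and the imported my_training module are placeholders, not your actual code):
```python
from clearml import PipelineDecorator

# Each component runs as its own task when the pipeline executes.
# "my_training" / run_training() are hypothetical stand-ins for your module.
@PipelineDecorator.component(return_values=["model_path"])
def train_step():
    from my_training import run_training
    return run_training()

@PipelineDecorator.pipeline(name="example pipeline", project="examples", version="1.0")
def run_pipeline():
    model_path = train_step()

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # run everything locally; remove to enqueue remotely
    run_pipeline()
```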
When I tried it with the decorators, it threw an error saying it cannot run Task.init inside a running task (the pipeline's task)
@<1523701087100473344:profile|SuccessfulKoala55> What I'm trying to do is connect 3 different tasks into one pipeline while still being able to run each task individually when needed, without changing the tasks' code. For example, I have a training.py file which runs Task.init at the start and creates a task on the server for training a new model, but I also want to create a pipeline that runs that training.py together with the other tasks. Is that clearer now?
Hi @<1523701295830011904:profile|CluelessFlamingo93> , what is the exact flow you're considering?
The flow is: training.py (creates and runs a training task) -> conversion_task.py (converts the model's outputs into a format of our choosing) -> testing.py (tests the model after conversion).
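Given that flow, one option that doesn't require changing the scripts at all is a PipelineController with add_step, which clones tasks that already exist on the server - a rough sketch, with placeholder project/task names:
```python
from clearml import PipelineController

# Placeholder project/task names - replace with the tasks your scripts registered
pipe = PipelineController(name="train-convert-test", project="examples", version="1.0")
pipe.add_step(name="train",
              base_task_project="examples", base_task_name="training")
pipe.add_step(name="convert", parents=["train"],
              base_task_project="examples", base_task_name="conversion")
pipe.add_step(name="test", parents=["convert"],
              base_task_project="examples", base_task_name="testing")
pipe.start()  # enqueues the controller; use pipe.start_locally() to debug
```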
I tried using both the decorators and the functions, but they both threw errors saying I cannot do Task.init inside a running task.
Well, I guess you can try using Task.current_task()
and checking the return value - if it's None, it should be safe to do Task.init
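Something like this guard at the top of each script - a minimal sketch, with placeholder project/task names:
```python
from clearml import Task

# Inside a pipeline run a task already exists; standalone, create one ourselves.
task = Task.current_task()
if task is None:
    task = Task.init(project_name="examples", task_name="training")  # placeholders
```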