AgitatedDove14
I do believe triggers should be unique somehow, because I find them far too easy to mishandle, especially when used with a schedule_function that is defined in the same script. Updating that function requires deleting the existing trigger task first and recreating it. If you don't do that, you end up with two trigger tasks with the same name, which I assume will respond to the same event(s) but do something slightly different in response. I assume it might work like this because I have not actually tried it; I don't think an event is consumed by only one consumer. What I would expect when I rerun the Python script with an updated schedule_function is for just that function to be updated in place. But I understand that since this is also a ClearML task, like everything else, it has to play by the same rules.

I know you can have a single TriggerScheduler defined, add multiple task triggers to it, and have all of that run in a single pod (a rough sketch of that pattern is below). Is it possible to have multiple TriggerScheduler instances defined in the same script, each started with start_remotely, and have all of them run on the same pod?
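For reference, this is roughly the single-scheduler pattern I mean; treat it as a sketch rather than exactly what I run (the project names, trigger names, queue name, and handler functions are all made up):

```python
from clearml.automation import TriggerScheduler


def retrain_on_new_data(task_id):
    # Hypothetical handler: launch retraining when a data task completes.
    print(f"task {task_id} completed, kicking off retraining")


def report_on_failure(task_id):
    # Hypothetical handler: send a report when a training task fails.
    print(f"task {task_id} failed, sending report")


# One scheduler instance holding several task triggers.
scheduler = TriggerScheduler(pooling_frequency_minutes=3)

scheduler.add_task_trigger(
    name="retrain-trigger",                  # made-up trigger name
    trigger_project="data/my_project",       # made-up project
    trigger_on_status=["completed"],
    schedule_function=retrain_on_new_data,
)

scheduler.add_task_trigger(
    name="failure-report-trigger",           # made-up trigger name
    trigger_project="training/my_project",   # made-up project
    trigger_on_status=["failed"],
    schedule_function=report_on_failure,
)

# Both triggers above run inside the single scheduler task / pod.
scheduler.start_remotely(queue="services")
```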