ClearML does not officially support a remotely executed task spawning more tasks; we do that through pipelines, if that helps you somehow. Note that doing things the way you do them right now might break some other functionality.
Anyway, I will talk with the team and maybe change this behaviour because it should be easy 👍
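For reference, the pipeline route would look roughly like this (a minimal sketch based on standard PipelineController usage; the project and task names are placeholders, not your actual setup):
```python
from clearml import PipelineController

# Minimal sketch: chain two already-existing tasks as pipeline steps.
# All project/task names below are placeholders.
pipe = PipelineController(
    name="example pipeline",
    project="example project",
    version="1.0.0",
)
pipe.add_step(
    name="step_1",
    base_task_project="example project",
    base_task_name="existing task to clone as step 1",
)
pipe.add_step(
    name="step_2",
    parents=["step_1"],
    base_task_project="example project",
    base_task_name="existing task to clone as step 2",
)
# Run the controller; pipe.start_locally() can be used instead for local debugging.
pipe.start(queue="default")
```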
Kinda, yes, and this has changed with 1.8.1.
The thing is that, afaik, ClearML currently does not officially support a remotely executed task spawning more tasks, so we also have a small hack that marks the remote "master process" as a local task prior to anything else.
UnevenDolphin73 looking at the code again, I think it is actually correct. It's a bit hackish, but we do use deferred_init as an int internally. Why do you need to close the task exactly? Do you have a script that would highlight the behaviour change between <1.8.1 and >=1.8.1?
True, and we plan to migrate to pipelines once we have some time for it :) but anyway, I believe that condition is flawed
Internally yes, but in Task.init the default argument is a boolean, not an int.
We don't want to close the task, but we have a remote task that spawns more tasks. With this change, subsequent calls to Task.init fail because they go into the deferred-init clause and fail on validate_defaults.
I see. We need to fix both anyway, so we will just do that
Hi UnevenDolphin73 ! We were able to reproduce the issue. We'll ping you once we have a fix as well 👍
So now we need to pass Task.init(deferred_init=0) because the default Task.init(deferred_init=False) is wrong
SmugDolphin23 I think you can simply change not (type(deferred_init) == int and deferred_init == 0) to deferred_init is True?
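To make the difference concrete, here's a quick standalone sketch (not the actual clearml source, just the two variants of that check) showing why the default deferred_init=False currently falls into the deferred path while deferred_init=0 does not:
```python
def current_check(deferred_init):
    # The check as of 1.8.1: anything other than an explicit int 0
    # (including the default False, since type(False) is bool, not int)
    # is treated as "defer the init".
    return not (type(deferred_init) == int and deferred_init == 0)


def proposed_check(deferred_init):
    # The suggested change: defer only when True was explicitly passed.
    return deferred_init is True


for value in (False, 0, True):
    print(value, current_check(value), proposed_check(value))
# False True False   <- default: currently deferred, even though it should not be
# 0     False False
# True  True  True
```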
I narrowed the bug down to the "fix" in 1.8.1, see my other post
Hi UnevenDolphin73
Does ClearML somehow remove any loggers from the logging module? We suddenly noticed that we have some handlers missing when running in ClearML
I believe it adds a logger; it should not remove any loggers.
What's the clearml version you are using ?
UnevenDolphin73 looks like we clear all loggers when a task is closed, not just the clearml ones. This is the problem
UnevenDolphin73 did that fix the logging for you? It doesn't seem to work on my machine. This is what I'm running:
```python
from clearml import Task
import logging


def setup_logging():
    level = logging.DEBUG
    logging_format = "[%(levelname)s] %(asctime)s - %(message)s"
    logging.basicConfig(level=level, format=logging_format)


t = Task.init()
setup_logging()
logging.info("HELLO!")
t.close()
logging.info("HELLO2!")
```
But it is strictly that if condition in Task.init; see the issue I opened about it
1.8.3; what about when calling task.close()? We suddenly have a need to set up our logging after every task.close() call
We suddenly have a need to set up our logging after every task.close()
Hmm, that gives me a handle on things. Any chance it is easily reproducible?
Oh, well, no, but for us that would be one way to solve it (we didn't need to close the task before that update)
So the flow is like:
MASTER PROCESS -> (optional) calls Task.init -> spawns some children
CHILD PROCESS -> calls Task.init
The init is deferred even though it should not be?
If so, we need to fix this for sure
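Something like this is what I'm imagining (a rough sketch only, with placeholder project/task names and plain multiprocessing standing in for however you actually spawn the children):
```python
import multiprocessing

from clearml import Task


def child_main():
    # CHILD PROCESS -> calls Task.init itself.
    # With >=1.8.1 and the default deferred_init=False this reportedly goes
    # into the deferred-init clause and fails on validate_defaults.
    child_task = Task.init(project_name="example", task_name="child")
    child_task.close()


if __name__ == "__main__":
    # MASTER PROCESS -> (optionally) calls Task.init, then spawns children.
    master_task = Task.init(project_name="example", task_name="master")

    children = [multiprocessing.Process(target=child_main) for _ in range(2)]
    for p in children:
        p.start()
    for p in children:
        p.join()
```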