Found the problem: someone hardcoded "~/trains.conf" in one experiment yaml, so even with a fresh ClearML install it was searching for that instead of clearml.conf 🥲
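For anyone hitting the same thing: a quick way to spot a stale override like this is to check whether a config-path environment variable points at a file that doesn't exist. This is a hedged sketch; the variable names `TRAINS_CONFIG_FILE` (legacy) and `CLEARML_CONFIG_FILE` are the documented overrides, but your yaml may inject the path some other way.

```python
import os
from pathlib import Path

def stale_config_override():
    """Return (var, value) if a config env var points at a missing file, else None."""
    # TRAINS_CONFIG_FILE is the legacy (pre-rename) variable; CLEARML_CONFIG_FILE
    # is the current one. A yaml-injected "~/trains.conf" would show up here.
    for var in ("TRAINS_CONFIG_FILE", "CLEARML_CONFIG_FILE"):
        val = os.environ.get(var)
        if val and not Path(val).expanduser().exists():
            return var, val
    return None

print(stale_config_override())
```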
Answered
Hi everyone,
I'm getting a really weird issue with my ClearML installation. Basically, I have an environment where I already configured ClearML via clearml-init, but when my code runs on Slurm, Task.init() raises "MissingConfigError".
Worth mentioning: calling Task.init() in Jupyter doesn't raise that error, and I run the notebook on the same Slurm server with the same environment. (I checked the environments with os.environ["CONDA_PREFIX"] both inside my code and inside my Jupyter notebook just before Task.init() is called; they are the same.)
Any idea what's going on and how I can debug this? Maybe there is a way in ClearML to display the path to clearml.conf, so I can check where it is looking for the config when called inside my code?
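One way to see where the config would be picked up is to print the candidate locations yourself before calling Task.init(). This is a hedged sketch of the lookup order as I understand it (env-var override first, then the default `~/clearml.conf`); `clearml_config_candidates` is a helper name I made up, not a ClearML API.

```python
import os
from pathlib import Path

def clearml_config_candidates():
    """Config paths ClearML would consider, highest priority first (assumed order)."""
    candidates = []
    # Env-var overrides: CLEARML_CONFIG_FILE, plus the legacy TRAINS_CONFIG_FILE.
    for var in ("CLEARML_CONFIG_FILE", "TRAINS_CONFIG_FILE"):
        val = os.environ.get(var)
        if val:
            candidates.append(Path(val).expanduser())
    # Default location when no override is set.
    candidates.append(Path.home() / "clearml.conf")
    return candidates

# Run this in the Slurm job just before Task.init() and compare with Jupyter.
for p in clearml_config_candidates():
    print(p, "exists" if p.exists() else "MISSING")
```

If the Slurm run shows an override variable that Jupyter doesn't have (Slurm prolog scripts and sbatch `--export` settings can change the environment even with the same conda env), that's the likely culprit.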
11 months ago