Hi @DistinctBeetle43! This is currently not possible: a separate task will be created for each process instance (see the sketch below).
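For context, here is a minimal sketch of the kind of script the question describes, assuming a launch such as `torchrun --nproc_per_node=2 train.py`; the project and task names are placeholders. Each of the two spawned processes executes `Task.init` itself, which is why two ClearML tasks show up:

```python
import os

import torch
import torch.distributed as dist
from clearml import Task

def main():
    # torchrun sets RANK / WORLD_SIZE / LOCAL_RANK in the environment,
    # which init_process_group picks up via the default "env://" method.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()

    # Every spawned process runs this call, so ClearML creates one task
    # per GPU process -- hence the duplicated experiments.
    task = Task.init(
        project_name="distributed-demo",   # placeholder project name
        task_name=f"worker-rank-{rank}",   # placeholder task name
    )

    # Bind this process to its own GPU.
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # ... build the model, wrap it in DistributedDataParallel,
    # and run the training loop here ...

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```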
Hi everyone,
I'm using torch.distributed for training on 2 GPUs. It works, but each GPU creates a new (duplicated) task, and I'd prefer to have only one ClearML experiment running. I looked here, but was not able to solve this issue.
Any thoughts? Thanks!