Hi @<1620955143929335808:profile|PleasantStork44>, currently this is not possible directly. However, if you have a task with scalars, you can in theory fetch all of the task's events and re-send them to the new task (although this is not part of the SDK's official interface and requires internal knowledge of the SDK implementation).
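Something along these lines could work as a rough sketch. It sticks to public calls (`Task.get_reported_scalars()` / `Logger.report_scalar()`) to replay the values rather than the raw events API, so the copied scalars may be down-sampled; the project name, task name and task ID below are placeholders:

```python
from clearml import Task

# Placeholder IDs/names - replace with your own
old_task = Task.get_task(task_id="OLD_TASK_ID")
new_task = Task.init(project_name="my_project", task_name="resumed_training")

logger = new_task.get_logger()

# get_reported_scalars() returns {title: {series: {"x": [iterations], "y": [values]}}}
for title, series_dict in old_task.get_reported_scalars().items():
    for series, points in series_dict.items():
        for iteration, value in zip(points["x"], points["y"]):
            # Replay each point into the new task at its original iteration
            logger.report_scalar(
                title=title,
                series=series,
                value=value,
                iteration=int(iteration),
            )
```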
@<1523701087100473344:profile|SuccessfulKoala55> I also ran into a problem with connect_configuration(): I pass it a dict at the beginning of training, and during training I can see my configs under "General", but once the PyTorch model saves its checkpoints, the configs are gone!
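For reference, this is the kind of usage I mean (the values here are just placeholders):

```python
from clearml import Task

task = Task.init(project_name="my_project", task_name="training")

# placeholder hyper-parameters, just for illustration
config = {"lr": 1e-3, "batch_size": 32, "epochs": 10}

# appears in the UI under CONFIGURATION -> "General" (the default section name);
# connect_configuration() returns the dict, possibly overridden when running remotely
config = task.connect_configuration(configuration=config, name="General")
```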
Or just copy the scalars from an old task to a new one (I want to continue training from previous checkpoints, so I need to create a new task and first move the scalars from the old iterations over to it).