Yes, I think Trains might wrap the `torch.load` function, but the thing is that I need to load part of the dataset using `torch.load`, so this error shows up many times during training. I found I can use this line:
`task = Task.init(project_name="Alfred", task_name="trains_plot", auto_connect_frameworks={'pytorch': False})`
But does that mean I can no longer monitor the `torch.load` function at all?
PompousHawk82 Unfortunately this is binary: either you have full tracking of torch load/save operations, or you do not.
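For reference, a minimal sketch of the call from the question (project/task names taken from it), which turns off only the PyTorch binding so `torch.load`/`torch.save` are no longer patched or logged, while the rest of the auto-logging stays on:

```python
from trains import Task

# Initialize the Task but skip the PyTorch framework binding, so that
# torch.load / torch.save calls are not patched (and not logged as models).
task = Task.init(
    project_name="Alfred",
    task_name="trains_plot",
    auto_connect_frameworks={'pytorch': False},
)

# torch.load now behaves as the vanilla implementation; loaded dataset files
# are simply not registered as input models on the Task.
```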
This warning message will disappear in the next version as we will be able to log multiple models under the same Task :)
Hi PompousHawk82, this is just a message letting you know that the second model is documented in the experiment's description section (under INFO / Description in the UI).