Do I have to run a task individually to completion for it to function properly in a pipeline?
It was getting logged when I ran it as an individual task, but it is not getting logged in the pipeline. I was plotting loss graphs too, but now they're not getting plotted.
My models are not getting saved in the .pipeline folder. They are not getting saved to the output_uri specified in Task.init either.
Hi PerfectMole86,
How are you saving your models, and are they being saved under the .pipeline folder as well?
I got this message when the training started and it is only saving the model locally
I have used an output_uri argument in my Task initialization for storing my models
I haven't pointed it to the file server because I'm running a locally deployed docker instance of ClearML
This is how I am defining the task in the code
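(The actual snippet isn't reproduced here; as a rough sketch, a Task.init call with output_uri typically looks like the following, where the project name, task name, and destination path are placeholders rather than the real values:)

```python
from clearml import Task

# minimal sketch -- project/task names and the destination are placeholders
task = Task.init(
    project_name="my_project",
    task_name="training",
    output_uri="/mnt/nas/clearml_models",  # a shared folder, an S3/GS/Azure URI, or the file server
)
```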
How is the model being saved/logged into ClearML?
Not on the NAS storage but on my PC where the training is running
I mean code-wise. Also, where is it saved locally?
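(For reference, ClearML usually registers models through the framework's own save call once Task.init has run; a minimal sketch, assuming PyTorch and placeholder names:)

```python
import torch
from clearml import Task

task = Task.init(
    project_name="my_project",          # placeholder
    task_name="training",               # placeholder
    output_uri="/mnt/nas/clearml_models",  # placeholder destination
)

model = torch.nn.Linear(10, 1)
# with framework auto-logging enabled (the default), this save call is picked up
# as an output model; if output_uri is set, the checkpoint is also uploaded there
torch.save(model.state_dict(), "model.pt")
```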
Is output_uri defined for both steps? Just making sure.
What if you point it to the fileserver? Does it still not upload the model?
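(A hedged sketch of what that would look like; the host and port assume the default local docker-compose deployment, where the file server listens on 8081:)

```python
from clearml import Task

# sketch only -- adjust the host to wherever your ClearML file server is reachable
task = Task.init(
    project_name="my_project",
    task_name="training",
    output_uri="http://localhost:8081",  # explicit file server URL
    # or: output_uri=True  -> use the files_server configured in clearml.conf
)
```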