Basically, I want the model to be uploaded to the server alongside the experiment results.
In your ~/clearml.conf, you can force the model to be uploaded by setting sdk.development.default_output_uri.
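For example, something like this in ~/clearml.conf (the URL is just a placeholder for your own server; the default clearml-server file server listens on port 8081):
```
# ~/clearml.conf (HOCON format; the URL below is a placeholder)
sdk {
    development {
        # Models/artifacts will be uploaded to this URI by default
        default_output_uri: "http://your-clearml-server:8081"
    }
}
```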
With TensorFlow's model.save, it saves the model locally in SavedModel format.
After the previous code, I fetched the model it had uploaded using its ID. When I add tags there, they are visible in the UI.
VexedCat68, it looks like it is being saved locally. Are you running everything from the same machine?
CostlyOstrich36 I'm observing some weird behavior. Before, when I added tags to the model before publishing it, it worked fine and I could see the tags in the UI.
Now when I do it this way, the tags aren't set. If I then run another script that gets the model by its ID and sets the tags, it works fine. Let me share the code.
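Roughly what the two snippets look like (just a sketch with placeholder names, IDs and filenames, not the exact code; the tag-setter calls are from memory):
```python
from clearml import Task, OutputModel, InputModel

# Approach 1: tag the output model right after creating it, before publishing.
task = Task.init(project_name="examples", task_name="tag-before-publish")  # placeholders
output_model = OutputModel(task=task, framework="tensorflow")
output_model.update_weights(weights_filename="model.h5")  # placeholder weights file
output_model.tags = ["baseline"]   # these tags sometimes don't show up for me
output_model.publish()

# Approach 2: fetch the already uploaded model by its ID and tag it afterwards.
model = InputModel(model_id="<model-id>")  # placeholder ID copied from the UI
model.tags = ["baseline"]                  # done this way, the tags do appear in the UI
```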
For anyone facing a similar issue to mine and wanting the model to be uploaded just like data is uploaded: in Task.init, set output_uri=True.
This basically makes it use the default ClearML file server that you define in the clearml.conf file. Ty.
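A minimal sketch (project/task names here are placeholders):
```python
from clearml import Task

# output_uri=True -> upload models/artifacts to the default ClearML file server
# (the one configured in ~/clearml.conf), instead of only saving them locally.
task = Task.init(
    project_name="examples",        # placeholder
    task_name="upload-model-demo",  # placeholder
    output_uri=True,
)
```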
Basically, I'm saving a model on the client machine and publishing it, then trying to download it from the server.
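i.e. something along these lines (placeholder model ID; the download call is how I understand the SDK, so treat it as a sketch):
```python
from clearml import InputModel

# The training task saved and published the model; here we pull it back from the server.
model = InputModel(model_id="<model-id>")  # placeholder ID taken from the UI
local_path = model.get_local_copy()        # downloads the weights from the file server
print(local_path)
```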
Thank you, I found the solution to my issue once I started reading about the default output URI.
Can you access the model in the UI and see the URI there?
Since I want to save the model to the ClearML server, what should the port be alongside the URL?
And in that case, if I do model.save('test'), will it also save the model to the ClearML server?
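I mean something like this (a sketch; the host is a placeholder):
```python
from clearml import Task
import tensorflow as tf

# Explicit URI instead of True; the clearml-server file server listens on port 8081 by default.
task = Task.init(
    project_name="examples",                       # placeholder
    task_name="save-to-server",                    # placeholder
    output_uri="http://your-clearml-server:8081",  # placeholder host
)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss="mse")
model.save("test")  # ClearML auto-logs this save as an output model and uploads it to output_uri
```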
The server is on a different machine. I'm experimenting on the same machine though.
I basically go to the model from the experiment first; then, once in the model, I try to download it but can't. I've screenshotted the situation.