If so, yes. Which example code exactly?
I am running the example code, so I guess it's running on my local machine?
Yes, but how do you plan to run the inference?
SuccessfulKoala55 I use the free-tier hosting
ApprehensiveRaven81 do you mean clearml-serving? Where do you run the serving deployment?
CostlyOstrich36 ClearML offers a free-tier server, right? My questions are:
- Can I deploy to this server? I.e., use hardware from this server instead of from my machine.
- If so, when I deploy on the ClearML server, how can I get a public URL to run inference?
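For context, once a model is deployed with clearml-serving it is reachable over plain HTTP. A minimal sketch of calling such an endpoint from Python (the `/serve/<endpoint>` route, base URL, and payload shape are assumptions based on a typical clearml-serving setup, not something confirmed in this thread):

```python
# Hypothetical sketch of querying a clearml-serving REST endpoint.
# The /serve/<endpoint> route and JSON payload shape are assumptions
# based on a typical clearml-serving setup; adjust to your deployment.
import json
import urllib.request


def build_url(base_url: str, endpoint: str) -> str:
    """Build the inference URL for a named serving endpoint."""
    return f"{base_url.rstrip('/')}/serve/{endpoint}"


def predict(base_url: str, endpoint: str, features: dict) -> dict:
    """POST a feature dict to the endpoint and return the parsed JSON reply."""
    req = urllib.request.Request(
        build_url(base_url, endpoint),
        data=json.dumps(features).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Example (requires a running serving instance):
# predict("http://localhost:8080", "test_model", {"x0": 1.0, "x1": 2.0})
```

Note that the free-tier SaaS hosts the ClearML *server* (experiment tracking, web UI); whether it also provides compute for serving is exactly the question being asked here.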
Hi ApprehensiveRaven81 , I'm not sure what you mean. Can you please elaborate?