Hello, how do you manage to unload a model from the clearml-serving API?
I am trying to unload a model through gRPC via
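(For context, the attempt presumably looks something like the sketch below. This is a minimal sketch, assuming the target is the gRPC model-control API of the Triton inference server that clearml-serving runs under the hood; the address and model name are placeholders, not values from this thread.)

```python
# Minimal sketch: asking Triton (the engine behind clearml-serving) to
# unload a model over gRPC. Address and model name are placeholders.
from tritonclient.grpc import InferenceServerClient

# Triton's default gRPC port is 8001
client = InferenceServerClient(url="localhost:8001")

# Request the unload. Note: this only takes effect when the server runs
# with --model-control-mode=explicit; otherwise the request is rejected
# and the model stays resident in GPU memory.
client.unload_model("my_model")

# Confirm the model is no longer ready to serve requests
print(client.is_model_ready("my_model"))  # expected: False
```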
Hi @ApprehensiveSeaturtle9
I send a request to the endpoint, but the model is never unloaded (the GPU memory keeps increasing when I infer with a new model).
Models are not unloaded after the request is done. See the discussion here: None
You can, however, remove the model from the serving session (but I do not think this is what you meant).
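For reference, removing an endpoint goes through the clearml-serving CLI. Below is a minimal sketch invoking it from Python; the serving-session ID and endpoint name are placeholders you would substitute with your own, and the exact flags should be checked against your installed clearml-serving version.

```python
# Minimal sketch: removing a model endpoint from the serving session via
# the clearml-serving CLI. Session ID and endpoint name are placeholders.
import subprocess

subprocess.run(
    [
        "clearml-serving",
        "--id", "aa11bb22cc33dd44",  # hypothetical serving-session ID
        "model", "remove",
        "--endpoint", "my_model",    # hypothetical endpoint name
    ],
    check=True,  # raise if the CLI reports an error
)
```

As noted above, this removes the endpoint from the serving session rather than forcing the engine to release GPU memory it has already allocated.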
I'm assuming you want to run multiple models on a single GPU that does not have enough memory for all of them?