Unanswered
[Clearml Serving]
Hi everyone!
I am trying to automatically generate an online endpoint for inference when manually adding a tag
Hi @<1523701205467926528:profile|AgitatedDove14> ,
Just to verify which model is actually called by the endpoint when using `model auto-update` for automatic model deployment, I performed the following steps with the ClearML Serving PyTorch example:
- I modified the `train` function of `train_pytorch_mnist.py` with `target = torch.zeros(data.shape[0]).long()`, so that the model believes every image corresponds to a "0". This way the model always predicts "0" and can easily be recognized later when making inferences.
- I used this initial model to create the endpoint with the `model add` command.
- Then, I used the `model auto-update` command to set up automatic model deployment.
- I removed the `target = torch.zeros(data.shape[0]).long()` trick line from the `train` function of `train_pytorch_mnist.py`, retrained the model, and finally added the tag "released" to it. This way, I had a second model that predicts various numbers and not only "0".
- I was then able, by running the same `curl -X POST` command for inference (i.e., `curl -X POST "None" -H "accept: application/json" -H "Content-Type: application/json" -d '{"url": "None"}'`), to see whether the endpoint was now taking the new model into account or still using the original one that predicts only "0".
- After waiting more than 20 minutes, I noticed that the value returned by the `curl -X POST` command was still always "0" (see picture below ⤵), meaning that the endpoint still points to the original model and NOT the new one with the "released" tag. This actually makes sense (and is what I was afraid of), since `model_id` under the "endpoints" section of the Serving Service doesn't get updated and is still the id of the original model (and NOT the latest one with the "released" tag).
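For reference, the label-override trick from the first step can be sketched as follows (a minimal sketch: `rig_targets` is a hypothetical helper name, and `data` is just a stand-in batch with MNIST-like shapes, not the actual loader output from `train_pytorch_mnist.py`):

```python
import torch

def rig_targets(data: torch.Tensor) -> torch.Tensor:
    # Replace the real labels with all zeros so the model learns to
    # predict "0" for every image -- this makes the rigged model easy
    # to recognize later when probing the serving endpoint.
    return torch.zeros(data.shape[0]).long()

# Example: a stand-in batch of 4 MNIST-shaped images.
data = torch.randn(4, 1, 28, 28)
target = rig_targets(data)
print(target)  # tensor([0, 0, 0, 0])
```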
I guess `model auto-update` doesn't work the way I expected it to. Do you have any thoughts on what I might be doing wrong that prevents automatic model deployment from working, so that I can use the endpoint with the latest model without having to recreate a new endpoint for every new model?
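For anyone wanting to reproduce the check, here is a small sketch of probing the endpoint from Python instead of re-typing the curl command each time (the serving URL, endpoint name, and image URL below are hypothetical placeholders, not the real ones from my setup):

```python
import json
import urllib.request

def make_payload(image_url: str) -> bytes:
    # Build the same JSON body the curl command sends.
    return json.dumps({"url": image_url}).encode("utf-8")

def query_endpoint(serving_url: str, image_url: str) -> str:
    # POST the payload with the same headers as the curl command
    # and return the raw response body.
    req = urllib.request.Request(
        serving_url,
        data=make_payload(image_url),
        headers={"accept": "application/json",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    # Hypothetical URLs -- replace with your serving endpoint and test image.
    print(query_endpoint(
        "http://localhost:8080/serve/test_model_pytorch",
        "https://example.com/digit.jpg",
    ))
```

If the rigged model is still being served, the response keeps coming back as "0"; once auto-update actually swaps the model, the predictions should start varying.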
Thank you again very much for your valuable insight! :man-bowing: