
Thank you for your answer, @<1523701205467926528:profile|AgitatedDove14>!
I've managed to compose a docker with the needed version.
Should I be deploying the entire docker file for every model, with the updated requirements?
Or, can I deploy everything (Prometheus, Grafana, etc.) once and make a serving docker yml for each model with a different port?
Eventually, I want many models (with different package versions) served within a single machine
In Python, I ran the following with many variations of the data:
import requests
from typing import Any
import numpy as np
import json

url = ""  # the actual endpoint URL was redacted from the post

raw_input = json.dumps([1, 2])  # This becomes a JSON string like "[1, 2]"

response = requests.post(
    url=url,
    headers={
        'accept': 'application/json',
        'Content-Type': 'application/json'
    },
    json=raw_input  # note: requests serializes this value again when json= is used
)
print(response.status_code, response.json())
And always got:
422 {'detail': "Error ...
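As a side note, requests serializes the json= argument itself, so passing a string that is already JSON sends a quoted JSON string rather than a JSON array, which is a common cause of 422 validation errors. A minimal sketch of what actually goes over the wire in that case (example.com is just a placeholder host, not the real endpoint):

import json
import requests

raw_input = json.dumps([1, 2])  # the str '[1, 2]'
prepared = requests.Request("POST", "http://example.com", json=raw_input).prepare()
print(prepared.body)  # the body is the JSON string "[1, 2]" in quotes, not a JSON array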
I'm mostly in agreement with you on that 🙂
ClearML has great documentation. But now that I've tried it all, I gave ChatGPT a shot. It introduced me to the /docs#/ page, which is kind of a nice way to test the API.
But the issue still remains and I cannot finish my POC 😞
And that is at the bottom of the screenshot I've shared (light green background)
Hi @<1523701070390366208:profile|CostlyOstrich36>
Sorry for the delayed response.
No errors in the serving containers.
I did follow the link you've shared before posting. I ran the following:
curl.exe -X POST "None" -H "accept: application/json" -H "Content-Type: application/json" -d '{"x0": 1, "x1": 2}'
(note that I'm using port 8082, since 8080 is already taken by ClearML Server on this VM).
and got the following response:
...
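For reference, a Python equivalent of that curl call could look like the sketch below; the endpoint URL is redacted above, so the host, port and path here are assumptions (port 8082 comes from the note above, the path is a placeholder):

import requests

# hypothetical URL; the real serving endpoint is redacted above
url = "http://localhost:8082/serve/my-endpoint"  # 'my-endpoint' is a placeholder name

response = requests.post(
    url,
    headers={"accept": "application/json"},
    json={"x0": 1, "x1": 2},  # a plain dict, serialized exactly once by requests
)
print(response.status_code, response.text)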
So, I went to the link None in order to use it like Postman, testing the API without using Python. It was ChatGPT that directed me there, and it is kind of a nice way to validate the API.
Hi John,
thanks for answering.
Maybe I missed something in the description?
I wrote the commands and the log (console output)
Sorry for the trouble, you are right. Now it looks much better.
Please let me know if it still needs further explanation or more information.
Thank you very much, @<1523701070390366208:profile|CostlyOstrich36> !