DefiantHippopotamus88
Moderator
2 Questions, 55 Answers
  Active since 10 January 2023
  Last activity 12 days ago

Reputation

0

Badges 1

51 × Eureka!
0 Votes
18 Answers
496 Views
Is it possible to create a serving endpoint with Pytorch JIT file in web interface only?
one year ago
0 Votes
60 Answers
166 Views
Hi, I'm trying to run clearml-server and clearml-serving locally to create an inference endpoint that uses Triton server. So far I had port issues, so I changed cl...
one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

In my understanding, requests still go through clearml-server, whose configuration I left intact. Maybe due to the port change in clearml-serving I need to adjust something.

one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

curl -X POST " " -H "accept: application/json" -H "Content-Type: application/json" -d '{"url": " "}'
curl: (56) Recv failure: Connection reset by peer

one year ago
0 Hey,

Let's maybe join our efforts to get local deployment working?

one year ago
0 Hey,

at least no more auth errors

one year ago
0 Hi, I Try To Run Locally

does it work for you?

one year ago
0 Hi, I Try To Run Locally

yeah, ok
but it didn't work

one year ago
0 Hi, I Try To Run Locally

I only got something like this:

clearml-serving-triton        | I0701 08:32:58.580705 46 server.cc:250] Waiting for in-flight requests to complete.
clearml-serving-triton        | I0701 08:32:58.580710 46 server.cc:266] Timeout 30: Found 0 model versions that have in-flight inferences
clearml-serving-triton        | I0701 08:32:58.580713 46 server.cc:281] All models are stopped, unloading models
clearml-serving-triton        | I0701 08:32:58.580717 46 server.cc:288] Timeout 30: Found 0 live ...
one year ago
0 Hi, I Try To Run Locally

I haven't followed it so closely, but let me check

one year ago
0 Hi, I Try To Run Locally

can you share your log items?

one year ago
0 Hi, I Try To Run Locally

I'm on Ubuntu

one year ago
0 Hi, I Try To Run Locally

except the access_key of course, that should be yours

one year ago
0 Hi, I Try To Run Locally

you should also use my example.env

one year ago
0 Hi, I Try To Run Locally

how do you start docker-compose?

docker-compose --env-file example.env -f docker-compose.yml up -d
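
For context, a minimal sketch of what such an example.env might contain. The variable names follow the conventions used by clearml-serving's docker-compose setup; all hosts, ports, and keys below are placeholders, not the poster's actual values:

```shell
# example.env -- placeholder values, replace with your own
# Hosts assume clearml-server running locally on default ports
CLEARML_WEB_HOST=http://localhost:8080
CLEARML_API_HOST=http://localhost:8008
CLEARML_FILES_HOST=http://localhost:8081
# Credentials generated in the ClearML web UI (Settings > Workspace)
CLEARML_API_ACCESS_KEY=<your-access-key>
CLEARML_API_SECRET_KEY=<your-secret-key>
# ID of the serving service task created with clearml-serving create
CLEARML_SERVING_TASK_ID=<your-serving-task-id>
```

Passed via `--env-file`, these values are substituted into the compose file so every serving container authenticates against the same local server.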
one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

curl -X POST " " -H "accept: application/json" -H "Content-Type: application/json" -d '{"url": " "}'
{"detail":"Error processing request: Error: Failed loading preprocess code for 'py_code_test_model_pytorch2': 'NoneType' object has no attribute 'loader'"}

one year ago
0 Hi, I Try To Run Locally

it's supposed to have an access_key and secret_key, which should correspond to this file
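
For reference, the credentials section of a local clearml.conf generally looks like the sketch below; the keys are placeholders and the hosts assume the default local docker-compose ports:

```
api {
    # Hosts for a clearml-server running locally on default ports
    web_server: http://localhost:8080
    api_server: http://localhost:8008
    files_server: http://localhost:8081
    credentials {
        # Placeholders -- use the pair generated in the ClearML web UI
        "access_key" = "<your-access-key>"
        "secret_key" = "<your-secret-key>"
    }
}
```

The access_key/secret_key pair here must match the one the serving containers use, otherwise auth.login calls are refused.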

one year ago
0 Hi, I Try To Run Locally

but it actually looks ok

one year ago
0 Hi, I Try To Run Locally

that's strange, maybe you should upgrade it

one year ago
0 Hi, I Try To Run Locally

I can make a PR if it works

one year ago
0 Hi, I Try To Run Locally

the way above works for me

one year ago
0 Hey,

it worked for me with one docker-compose for all

one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

I got it working with a full port reassignment to 9090 in clearml-serving-inference.
It still sent me an error saying the format of my request was somehow wrong,
but then I started from scratch by creating a completely new project and new endpoint.

one year ago
0 Hi, I Try To Run Locally

I have to step away for a couple of hours
please let me know if you find something wrong

one year ago
0 Hi, I Try To Run Locally

maybe I'm missing something with credentials?

one year ago
0 Hi, I Try To Run Locally

same thing

clearml-serving-inference     | Retrying (Retry(total=236, connect=236, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f899dc4e8b0>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login
one year ago
0 Hi, I Try To Run Locally

will continue tomorrow

one year ago
0 Hi, I Try To Run Locally

doesn't work anyway

one year ago