DefiantHippopotamus88
Moderator
2 Questions, 55 Answers
Active since 10 January 2023
Last activity 8 months ago

Reputation: 0
Badges (1): 51 × Eureka!
0 Votes, 18 Answers, 987 Views
Is it possible to create a serving endpoint with Pytorch JIT file in web interface only?
2 years ago
0 Votes, 60 Answers, 25K Views
Hi, I try to run clearml-server and clearml-serving locally to create an inference endpoint that utilizes Triton server. So far I had port issues, so I changed cl...
2 years ago
0 Hi, I Try To Run Locally

does it work for you?

2 years ago
0 Hi, I Try To Run Locally

except access_key of course, they should be yours

2 years ago
0 Hi, I Try To Run Locally

oh, I see one error, let me check fast

2 years ago
0 Hi, I Try To Run Locally

I don't think WEB_HOST is important, but what about FILE_HOST?
Do I need to change it accordingly?

2 years ago
0 Hi, I Try To Run Locally

you are right, for some reason it doesn't resolve inside a container

root@dd0252a8f93e:~/clearml# curl 

curl: (7) Failed to connect to localhost port 8008: Connection refused
root@dd0252a8f93e:~/clearml# curl 

curl: (7) Failed to connect to 127.0.0.1 port 8008: Connection refused
root@dd0252a8f93e:~/clearml# 
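A quick way to see why localhost fails here: inside a container, localhost loops back to the container itself, not the host, so the server's ports simply aren't there. A sketch for probing the API server by its compose service name instead — the service name `clearml-apiserver` and the `/debug.ping` endpoint are assumptions based on the clearml-server compose file; verify them against your stack:

```shell
# Inside a container, "localhost" is the container itself. On a shared
# Docker network, the embedded DNS resolves compose service names, so the
# API server should be reached by name, not by localhost.
TARGET_SERVICE="clearml-apiserver"  # assumed service name from clearml-server's docker-compose
API_PORT=8008                       # default clearml-server API port

# Commands to try from the host (printed here; run them against your stack):
echo "docker exec clearml-serving-inference getent hosts ${TARGET_SERVICE}"
echo "docker exec clearml-serving-inference curl -s http://${TARGET_SERVICE}:${API_PORT}/debug.ping"
```

If `getent hosts` returns nothing, the two containers are not on a common network, which would explain the connection refused above.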
2 years ago
0 Hi, I Try To Run Locally

I tried that, it didn't work. I was confused by the separate port parameter:

CLEARML_SERVING_PORT: ${CLEARML_SERVING_PORT:-8080}

which is the only port-related setting in docker-compose-triton.yml.
Can I test /auth.login somehow independently, using curl or any other way? Which address is it supposed to have, and which creds should I use?
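You can probe /auth.login without clearml-serving at all. A sketch, assuming the default clearml-server API port 8008 and that the API server accepts the access/secret key pair as HTTP basic auth (use the keys from your own clearml.conf):

```shell
# Build a direct probe against the API server; a JSON response containing a
# "token" field means the credentials are accepted.
API_HOST="http://localhost:8008"    # assumed default api_server address
ACCESS_KEY="91SFEX4BYUQ9YCZ9V6WP"   # access_key from clearml.conf
SECRET_KEY="4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"

AUTH_CMD="curl -s -u ${ACCESS_KEY}:${SECRET_KEY} -X POST ${API_HOST}/auth.login"
echo "${AUTH_CMD}"   # run this against your server
```

Run it once from the host and once from inside the serving container; if the first succeeds and the second refuses the connection, the problem is networking, not credentials.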

2 years ago
0 Hi, I Try To Run Locally

yeah, ok
but it didn't

2 years ago
0 Hi, I Try To Run Locally

same thing

clearml-serving-inference     | Retrying (Retry(total=236, connect=236, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f899dc4e8b0>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login
2 years ago
0 Hi, I Try To Run Locally

my clearml.conf

api { 
    web_server: 

    api_server: 

    files_server: 

    # test 3
    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
}
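For reference, the three stripped server values would normally point at the local deployment; a sketch using the documented clearml-server default ports (these are typical defaults, not necessarily the values from the original post; adjust the host if the server runs elsewhere):

```
api {
    # Typical values for a local clearml-server deployment
    # (default ports from the ClearML docs):
    web_server: http://localhost:8080
    api_server: http://localhost:8008
    files_server: http://localhost:8081

    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
}
```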
2 years ago
0 Hi, I Try To Run Locally

but it actually looks ok

2 years ago
0 Hi, I Try To Run Locally

seems like an issue with the 2 compose apps using different networks that are not accessible from each other
I wonder if I just need to join the 2 docker-compose files to run everything in one session
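Merging the files isn't strictly necessary; compose can attach one project's services to the other project's network. A sketch of an override for the serving compose file, assuming the server stack's network is named `clearml_backend` (check the real name with `docker network ls`):

```yaml
# docker-compose.override.yml for the clearml-serving project (sketch).
# Attaches the inference service to the network created by the
# clearml-server compose project so its service names resolve.
networks:
  clearml_backend:
    external: true   # network already created by the server stack

services:
  clearml-serving-inference:
    networks:
      - clearml_backend
```

With this in place, the serving containers can reach the server by service name instead of localhost.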

2 years ago
0 Hi, I Try To Run Locally

can you share your log items?

2 years ago
0 Hi, I Try To Run Locally

I can make a PR if it works

2 years ago
0 Hi, I Try To Run Locally

you should also use my example.env

2 years ago
0 Hi, I Try To Run Locally

doesn't work anyway

2 years ago
0 Hi, I Try To Run Locally

yeah, I tried the following
None
but haven't managed to make it work yet

2 years ago
0 Hi, I Try To Run Locally

do I need to change anything else?

2 years ago
0 Hi, I Try To Run Locally

I changed port here:

clearml-serving-inference:
    image: allegroai/clearml-serving-inference:latest
    container_name: clearml-serving-inference
    restart: unless-stopped
    ports:
      - "9090:8080"
2 years ago
0 Hey,

Maybe let's join our efforts to get a local deployment working?

2 years ago
0 Hey,

hi WickedElephant66
I have the same issue, but port is not the only problem
https://clearml.slack.com/archives/CTK20V944/p1656446563854059

2 years ago
0 Hi, I Try To Run Locally

it's supposed to have an access_key and secret_key, which should correspond to this file

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

How can I clean the database (or whatever) to get back to the beginning?

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

In my understanding, requests still go through clearml-server, whose configuration I left intact. Maybe due to the port change in clearml-serving I need to adjust something.

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

clearml-serving-inference | 2022-07-03 22:06:26,893 - clearml.storage - ERROR - Could not download , err: HTTPConnectionPool(host='localhost', port=8081): Max retries exceeded with url: /DevOps/serving%20example%2010.0a76d264e30940c2b600375fa839f1a2/artifacts/py_code_test_model_pytorch2/preprocess.py (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fc3f41b1790>: Failed to establish a new connection: [Errno 111] Connection refused'))
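That error shows the inference container itself resolving the files server as localhost:8081, which fails for the same reason as before. In the clearml-serving repo these addresses come from example.env; a sketch of a local setup — variable names are taken from the repo's example.env, while the host placeholder is an assumption:

```
# example.env (sketch) -- point the serving containers at addresses they
# can actually reach. <server-host> must be the docker host's IP or a
# service name on a shared network, NOT localhost.
CLEARML_WEB_HOST="http://<server-host>:8080"
CLEARML_API_HOST="http://<server-host>:8008"
CLEARML_FILES_HOST="http://<server-host>:8081"
CLEARML_API_ACCESS_KEY="91SFEX4BYUQ9YCZ9V6WP"
CLEARML_API_SECRET_KEY="4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
```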

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

I tried, step by step, from here
https://github.com/allegroai/clearml-serving/tree/main/examples/pytorch
and the result is the same.
Then I tried to remove my old serving examples to start checking from scratch, by restarting immediately after stopping

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

basically I don't want to train a new model, and I try to create an endpoint following the example, but I finally get
$ curl -X POST " " -H "accept: application/json" -H "Content-Type: application/json" -d '{"url": " "}'

<html> <head><title>405 Not Allowed</title></head> <body> <center><h1>405 Not Allowed</h1></center> <hr><center>nginx/1.20.1</center> </body> </html>
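The nginx signature in that 405 suggests the POST reached clearml-server's web server rather than the serving endpoint. A sketch of the request shape, assuming the /serve/<endpoint-name> URL layout from the clearml-serving pytorch example — the endpoint name and port below are assumptions; use your model's endpoint and whatever host port the inference container is published on (8080 by default, 9090 if you remapped it):

```shell
# Requests go to the clearml-serving-inference container, path /serve/<endpoint>.
SERVING_HOST="http://127.0.0.1:8080"   # assumed inference container address
ENDPOINT="test_model_pytorch"          # endpoint name from the pytorch example

SERVE_CMD="curl -X POST ${SERVING_HOST}/serve/${ENDPOINT} -H \"Content-Type: application/json\" -d '{\"url\": \"<image-url>\"}'"
echo "${SERVE_CMD}"   # run this against your stack
```

If this still returns the nginx 405 page, the port is pointing at the web server; a JSON error from the serving process (like the preprocess one below) at least confirms the request reached the right container.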

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

curl -X POST " " -H "accept: application/json" -H "Content-Type: application/json" -d '{"url": " "}' {"detail":"Error processing request: Error: Failed loading preprocess code for 'py_code_test_model_pytorch2': 'NoneType' object has no attribute 'loader'"}

2 years ago