DefiantHippopotamus88
Moderator
2 Questions, 55 Answers
  Active since 10 January 2023
  Last activity 7 months ago

Reputation: 0
Badges: 1 (51 × Eureka!)
0 Votes, 18 Answers, 926 Views
Is it possible to create a serving endpoint with Pytorch JIT file in web interface only?
2 years ago
0 Votes, 60 Answers, 23K Views
Hi, I try to run clearml-server and clearml-serving locally to create an inference endpoint that utilizes a Triton server. So far I had port issues, so I changed cl...
2 years ago
0 Hi, I Try To Run Locally

I haven't followed it so closely, but let me check

2 years ago
0 Hi, I Try To Run Locally

you should also use my example.env

2 years ago
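For context, the example.env referenced here presumably follows the usual clearml-serving pattern of pointing the serving containers at the ClearML server and carrying the API credentials. A minimal sketch, assuming a local clearml-server on its default ports; every value is a placeholder to replace with your own:

CLEARML_WEB_HOST="http://<your-host>:8080"
CLEARML_API_HOST="http://<your-host>:8008"
CLEARML_FILES_HOST="http://<your-host>:8081"
CLEARML_API_ACCESS_KEY="<your access key>"
CLEARML_API_SECRET_KEY="<your secret key>"
CLEARML_SERVING_TASK_ID="<serving service task id>"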
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

I got it working with a full port reassignment to 9090 in clearml-serving-inference,
which still sent me an error that the format of my request is somehow wrong,
but then I started from scratch by creating a completely new project and a new endpoint

2 years ago
0 Hi, I Try To Run Locally

does it work for you?

2 years ago
0 Hi, I Try To Run Locally

I don't think WEB_HOST is important, but what about FILE_HOST?
do I need to change it accordingly?

2 years ago
0 Hi, I Try To Run Locally

I changed the port here:

clearml-serving-inference:
    image: allegroai/clearml-serving-inference:latest
    container_name: clearml-serving-inference
    restart: unless-stopped
    ports:
      - "9090:8080"
2 years ago
0 Hey,

hi WickedElephant66
I have the same issue, but the port is not the only problem
https://clearml.slack.com/archives/CTK20V944/p1656446563854059

2 years ago
0 Hey,

it worked for me with one docker-compose for all

2 years ago
0 Hi, I Try To Run Locally

would an IP address help instead?

2 years ago
0 Hi, I Try To Run Locally

it's supposed to have an access_key and secret_key, which should correspond to this file

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

How can I clean the database, or whatever else, to get back to the beginning?

2 years ago
0 Hi, I Try To Run Locally

except the access_key of course, those should be yours

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

clearml-serving-inference | 2022-07-03 22:06:26,893 - clearml.storage - ERROR - Could not download , err: HTTPConnectionPool(host='localhost', port=8081): Max retries exceeded with url: /DevOps/serving%20example%2010.0a76d264e30940c2b600375fa839f1a2/artifacts/py_code_test_model_pytorch2/preprocess.py (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fc3f41b1790>: Failed to establish a new connection: [Errno 111] Connection refused'))

2 years ago
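The "Connection refused" against localhost:8081 is what you would expect from inside a container: there, localhost is the serving container itself, not the machine running the clearml fileserver. The files host needs to be an address reachable from inside Docker, for example (the host address is a placeholder):

CLEARML_FILES_HOST="http://<host-ip-or-hostname>:8081"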
0 Hi, I Try To Run Locally

yeah, I tried the following
None
but haven't managed to make it work yet

2 years ago
0 Hi, I Try To Run Locally

one docker-compose for all

2 years ago
0 Hi, I Try To Run Locally

the way above works for me

2 years ago
0 Hi, I Try To Run Locally

I tried that; it didn't work. I was confused by the separate port parameter:

CLEARML_SERVING_PORT: ${CLEARML_SERVING_PORT:-8080}

which is the only port-related setting in docker-compose-triton.yml
Can I test /auth.login somehow independently? Using curl or any other way. Which address is it supposed to have, and which creds should I use?

2 years ago
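As a side note on testing /auth.login independently: one way, assuming the default apiserver port 8008 from the standard clearml-server docker-compose (substitute your own host, port and credentials), is plain HTTP basic auth with the API access/secret key pair:

curl -u <ACCESS_KEY>:<SECRET_KEY> http://localhost:8008/auth.login

A JSON response containing a token means the address and credentials are fine; a connection error points back at the host or port.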
0 Hi, I Try To Run Locally

seems true

root@9f6a74ab9a27:~/clearml# curl 

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>405 Method Not Allowed</title>
<h1>Method Not Allowed</h1>
<p>The method is not allowed for the requested URL.</p>
root@9f6a74ab9a27:~/clearml# curl 

curl: (7) Failed to connect to localhost port 8081: Connection refused
root@9f6a74ab9a27:~/clearml# 
2 years ago
0 Hi, I Try To Run Locally

yeah, ok
but it didn't

2 years ago
0 Hi, I Try To Run Locally

I'm on Ubuntu

2 years ago
0 Hi, I Try To Run Locally

maybe I'm missing something with credentials?

2 years ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

I tried to switch off auto-refresh, but it doesn't help

2 years ago
0 Hi, I Try To Run Locally

same thing

clearml-serving-inference     | Retrying (Retry(total=236, connect=236, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f899dc4e8b0>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login
2 years ago
0 Hi, I Try To Run Locally

seems like an issue with the two compose apps using different networks that are not accessible from each other
I wonder if I just need to merge the two docker-compose files to run everything in one session

2 years ago
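One way to avoid merging the files is to attach the serving services to the server's Docker network as an external network. A sketch, where the network name is an assumption (check docker network ls for the name the server's compose project actually created):

networks:
  clearml-backend:
    external: true
    name: clearml_backend   # assumed name, verify with docker network ls

services:
  clearml-serving-inference:
    networks:
      - clearml-backend

With a shared network, the serving containers can then reach the server by its compose service name (e.g. apiserver:8008) instead of localhost.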
0 Hi, I Try To Run Locally

will continue tomorrow

2 years ago
0 Hi, I Try To Run Locally

doesn't work anyway

2 years ago