Answered
Hi, I Try To Run Locally

Hi, I'm trying to run clearml-server and clearml-serving locally to create an inference endpoint that utilizes the Triton server. So far I had port issues, so I changed the clearml-serving-inference outbound port to 9090. But after that I get the following issue:

clearml-serving-triton        | Retrying (Retry(total=237, connect=237, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f02a2602250>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login

Are there any best practices for running both services locally? What kind of configuration am I supposed to do?
I already tried setting ~/clearml.conf with the access_key and providing it in example.env, but it didn't help. Maybe I'm doing something wrong with the host:port configuration. Thanks!
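
For what it's worth, that retry loop means the serving containers cannot reach the clearml-server API at the address configured in CLEARML_API_HOST. A quick sanity check from the host, assuming the default API port 8008 (adjust if your server uses a different one):

curl http://localhost:8008/auth.login

If the server is reachable, this should return a JSON "Unauthorized (missing credentials)" response rather than a connection error.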

  
  
Posted 2 years ago

Answers 60


same thing

clearml-serving-inference     | Retrying (Retry(total=236, connect=236, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f899dc4e8b0>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login
  
  
Posted 2 years ago

It throws the same error

  
  
Posted 2 years ago

I can make a PR if it works

  
  
Posted 2 years ago

Ignore the quotes, I tried it with the quotes themselves first

  
  
Posted 2 years ago

Are you using native Linux? Or WSL?

  
  
Posted 2 years ago

I should not edit anything in clearml.conf, right?

  
  
Posted 2 years ago

would an IP help instead?

  
  
Posted 2 years ago

that's strange, maybe you should upgrade it

  
  
Posted 2 years ago

maybe I'm missing something with credentials?

  
  
Posted 2 years ago

yeah, ok
but it didn't

  
  
Posted 2 years ago

When I run this it says it can't run multiple containers

  
  
Posted 2 years ago

I changed the port here:

clearml-serving-inference:
    image: allegroai/clearml-serving-inference:latest
    container_name: clearml-serving-inference
    restart: unless-stopped
    ports:
      - "9090:8080"
  
  
Posted 2 years ago

No, I use
docker compose instead of docker-compose

  
  
Posted 2 years ago

I don't think WEB_HOST is important, but what about FILES_HOST?
Do I need to change it accordingly?

  
  
Posted 2 years ago

do I need to change anything else?

  
  
Posted 2 years ago

my example.env

CLEARML_WEB_HOST="
"
CLEARML_API_HOST="
"
CLEARML_FILES_HOST="
"
CLEARML_API_ACCESS_KEY="91SFEX4BYUQ9YCZ9V6WP"
CLEARML_API_SECRET_KEY="4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
CLEARML_SERVING_TASK_ID="450231049bba42f69c6507cb774f7dc6"
  
  
Posted 2 years ago

Yep this is fine

  
  
Posted 2 years ago

I wonder if I just need to join 2 docker-compose files to run everything in one session

Actually that could also work

But for reference, when I said IP I meant the actual host network IP, not 127.0.0.1 (which is the same as localhost)
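
In case it helps, the host network IP can be checked with standard tools (on most Linux distros; under WSL this shows the WSL virtual-network address instead):

hostname -I
ip -4 addr show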

  
  
Posted 2 years ago

I got only something like this:

clearml-serving-triton        | I0701 08:32:58.580705 46 server.cc:250] Waiting for in-flight requests to complete.
clearml-serving-triton        | I0701 08:32:58.580710 46 server.cc:266] Timeout 30: Found 0 model versions that have in-flight inferences
clearml-serving-triton        | I0701 08:32:58.580713 46 server.cc:281] All models are stopped, unloading models
clearml-serving-triton        | I0701 08:32:58.580717 46 server.cc:288] Timeout 30: Found 0 live models and 0 in-flight non-inference requ
  
  
Posted 2 years ago

like None?

  
  
Posted 2 years ago

so it works with

  
  
Posted 2 years ago

What are you getting with:

curl http://<ip>:8008/auth.login
  
  
Posted 2 years ago

curl 

{"meta":{"id":"59bbb55b6ddc456092658ae588c9a436","trx":"59bbb55b6ddc456092658ae588c9a436","endpoint":{"name":"auth.login","requested_version":"2.18","actual_version":"1.0"},"result_code":401,"result_subcode":20,"result_msg":"Unauthorized (missing credentials)","error_stack":null,"error_data":{}},"data":{}}
  
  
Posted 2 years ago

@<1523706266315132928:profile|DefiantHippopotamus88> seems like you are missing the ports 🙂

CLEARML_WEB_HOST="
"
CLEARML_API_HOST="
"
CLEARML_FILES_HOST="
"
  
  
Posted 2 years ago

So I edited it in accordance with yours

  
  
Posted 2 years ago

you are right, for some reason it doesn't resolve inside a container

root@dd0252a8f93e:~/clearml# curl 

curl: (7) Failed to connect to localhost port 8008: Connection refused
root@dd0252a8f93e:~/clearml# curl 

curl: (7) Failed to connect to 127.0.0.1 port 8008: Connection refused
root@dd0252a8f93e:~/clearml# 
  
  
Posted 2 years ago

Nope seems like a docker-compose issue

  
  
Posted 2 years ago

but it actually looks ok

  
  
Posted 2 years ago

I tried that, it didn't work. I was confused by the separate port parameter:

CLEARML_SERVING_PORT: ${CLEARML_SERVING_PORT:-8080}

which is the only port-related setting in docker-compose-triton.yml.
Can I test /auth.login somehow independently, using curl or any other way? Which address is it supposed to have and which credentials should I use?
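
For reference, the endpoint can be exercised independently with curl. Without credentials it should return the 401 "missing credentials" JSON, and assuming the API server accepts HTTP basic auth with the access/secret key pair, a credentialed call would look like (placeholders, not real values):

curl http://<host-ip>:8008/auth.login
curl -u <ACCESS_KEY>:<SECRET_KEY> http://<host-ip>:8008/auth.login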

  
  
Posted 2 years ago

how do you start docker-compose?

docker-compose --env-file example.env -f docker-compose.yml up -d
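
With the Docker Compose v2 plugin mentioned above, the equivalent invocation (and the Triton variant referenced earlier) would be:

docker compose --env-file example.env -f docker-compose.yml up -d
docker compose --env-file example.env -f docker-compose-triton.yml up -d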
  
  
Posted 2 years ago