Answered
Hi, I Try To Run Locally

Hi, I'm trying to run clearml-server and clearml-serving locally to create an inference endpoint that uses the Triton server. So far I had port issues, so I changed the clearml-serving-inference outbound port to 9090. But after that I get the following issue:

clearml-serving-triton        | Retrying (Retry(total=237, connect=237, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f02a2602250>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login

Are there any best practices for running both services locally? What kind of configuration am I supposed to do?
I already tried setting ~/clearml.conf with an access_key and providing it in example.env, but it didn't help. Maybe I'm doing something wrong with the host:port configuration. Thanks!

  
  
Posted 2 years ago

Answers 60


you should also use my example.env

  
  
Posted 2 years ago

Hi @<1523706266315132928:profile|DefiantHippopotamus88>
The idea is that clearml-server acts as a control plane and can sit on a different machine; obviously you can run both on the same machine for testing. Specifically, it looks like clearml-serving is not configured correctly, as the error points to an issue with the initial handshake/login between the Triton containers and the clearml-server. How did you configure the clearml-serving docker compose?
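
For reference, the handshake the error refers to is driven by the server address and credentials the serving containers read from example.env. A minimal sketch, assuming a local clearml-server with the default ports (web 8080, API 8008, files 8081) — the hosts and keys below are placeholders, not real values:

CLEARML_WEB_HOST="http://<server-host>:8080"
CLEARML_API_HOST="http://<server-host>:8008"
CLEARML_FILES_HOST="http://<server-host>:8081"
# credentials generated in the clearml-server web UI (Settings > Workspace)
CLEARML_API_ACCESS_KEY="<access_key>"
CLEARML_API_SECRET_KEY="<secret_key>"
# ID of the serving service created with `clearml-serving create`
CLEARML_SERVING_TASK_ID="<serving_service_id>"

If any of these point at an address the containers cannot reach, the /auth.login retries above are what you get.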

  
  
Posted 2 years ago

do I need to change anything else?

  
  
Posted 2 years ago

same thing

clearml-serving-inference     | Retrying (Retry(total=236, connect=236, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f899dc4e8b0>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login
  
  
Posted 2 years ago

would IP help instead?

  
  
Posted 2 years ago

I tried that, it didn't work. I was confused by the separate port parameter:

CLEARML_SERVING_PORT: ${CLEARML_SERVING_PORT:-8080}

which is the only port-related setting in docker-compose-triton.yml.
Can I test /auth.login somehow independently, using curl or any other way? Which address is it supposed to be on, and which creds should I use?
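
One way to check it independently, a sketch assuming the default API port 8008 and the key pair from the web UI (as far as I recall, the ClearML REST examples pass them as HTTP basic auth):

curl -u <access_key>:<secret_key> http://<server-host>:8008/auth.login

A reachable API server should answer with JSON (a token on success, an auth error otherwise); "connection refused" means the address or port is wrong, or unreachable from inside the container.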

  
  
Posted 2 years ago

I only got something like this:

clearml-serving-triton        | I0701 08:32:58.580705 46 server.cc:250] Waiting for in-flight requests to complete.
clearml-serving-triton        | I0701 08:32:58.580710 46 server.cc:266] Timeout 30: Found 0 model versions that have in-flight inferences
clearml-serving-triton        | I0701 08:32:58.580713 46 server.cc:281] All models are stopped, unloading models
clearml-serving-triton        | I0701 08:32:58.580717 46 server.cc:288] Timeout 30: Found 0 live models and 0 in-flight non-inference requ
  
  
Posted 2 years ago

so it works with

  
  
Posted 2 years ago

What are you getting with:

curl http://<ip>:8008/auth.login
  
  
Posted 2 years ago

server

  
  
Posted 2 years ago

that's strange, maybe you should upgrade it

  
  
Posted 2 years ago

Did you get the same as well?

  
  
Posted 2 years ago

It throws the same error

  
  
Posted 2 years ago

serving

  
  
Posted 2 years ago

Nope, seems like a docker-compose issue

  
  
Posted 2 years ago

but it actually looks ok

  
  
Posted 2 years ago

how do you start docker-compose?

docker-compose --env-file example.env -f docker-compose.yml up -d
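
For the Triton setup mentioned earlier, presumably the same pattern with the Triton compose file from the clearml-serving repo:

docker-compose --env-file example.env -f docker-compose-triton.yml up -d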
  
  
Posted 2 years ago

I have to step away for a couple of hours
please let me know if you find something wrong

  
  
Posted 2 years ago

image

  
  
Posted 2 years ago

It should also work with the host IP and two docker compose files.
I'm not sure where to push for a unified docker compose?

  
  
Posted 2 years ago

the way above works for me

  
  
Posted 2 years ago

doesn't work anyway

  
  
Posted 2 years ago

oh, I see one error, let me quickly check

  
  
Posted 2 years ago

and my ~/clearml.conf

api { 
    web_server: 

    api_server: 

    files_server: 

    # test 3
    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
} 
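
For comparison, a minimal sketch of how that api section typically looks when pointing at a local clearml-server with the default ports (the hosts and keys here are placeholders, not your actual values):

api {
    web_server: http://<server-host>:8080
    api_server: http://<server-host>:8008
    files_server: http://<server-host>:8081

    credentials {
        "access_key" = "<access_key>"
        "secret_key" = "<secret_key>"
    }
}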
  
  
Posted 2 years ago

But I'm getting a timeout issue when I docker-compose up 😢

  
  
Posted 2 years ago

my example.env

CLEARML_WEB_HOST="
"
CLEARML_API_HOST="
"
CLEARML_FILES_HOST="
"
CLEARML_API_ACCESS_KEY="91SFEX4BYUQ9YCZ9V6WP"
CLEARML_API_SECRET_KEY="4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
CLEARML_SERVING_TASK_ID="450231049bba42f69c6507cb774f7dc6"
  
  
Posted 2 years ago

Hey, I tried your docker-compose.
After all the initial setup, clearml-serving-triton, clearml-serving-statistics, and clearml-serving-inference throw a read timeout error.

  
  
Posted 2 years ago

my clearml.conf

api { 
    web_server: 

    api_server: 

    files_server: 

    # test 3
    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
}
  
  
Posted 2 years ago

No, I use docker compose instead of docker-compose
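
For reference, the v2 plugin should accept the same flags, so the equivalent would presumably be:

docker compose --env-file example.env -f docker-compose.yml up -d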

  
  
Posted 2 years ago

I wonder if I just need to join 2 docker-compose files to run everything in one session

Actually that could also work

But for reference, when I said IP I meant the actual host network IP, not 127.0.0.1 (which is the same as localhost).
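
A quick sketch of that, assuming a Linux host: print the host's LAN address and use it in example.env, so the serving containers don't try to reach the server through their own loopback:

# list the host's IP addresses; pick the LAN one, not 127.0.0.1
hostname -I
# then use it for the server endpoints, e.g. in example.env:
# CLEARML_API_HOST="http://<host-ip>:8008"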

  
  
Posted 2 years ago