Answered
Hi, I Try To Run Locally

Hi, I'm trying to run clearml-server and clearml-serving locally to create an inference endpoint that utilizes the Triton server. So far I had port issues, so I changed the clearml-serving-inference outbound port to 9090. But after that I get the following issue:

clearml-serving-triton        | Retrying (Retry(total=237, connect=237, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f02a2602250>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login

Are there any best practices for running both services locally? What kind of configuration am I supposed to do?
I already tried setting ~/clearml.conf with the access_key and providing it in example.env, but it didn't help. Maybe I'm doing something wrong with the host:port configuration. Thanks!

  
  
Posted 2 years ago

Answers 60


seems true

root@9f6a74ab9a27:~/clearml# curl 

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>405 Method Not Allowed</title>
<h1>Method Not Allowed</h1>
<p>The method is not allowed for the requested URL.</p>
root@9f6a74ab9a27:~/clearml# curl 

curl: (7) Failed to connect to localhost port 8081: Connection refused
root@9f6a74ab9a27:~/clearml# 
  
  
Posted 2 years ago

like None ?

  
  
Posted 2 years ago

I don't think WEB_HOST is important, but what about FILE_HOST?
Do I need to change it accordingly?

  
  
Posted 2 years ago

one docker-compose for all

  
  
Posted 2 years ago

so it works with

  
  
Posted 2 years ago

will continue tomorrow

  
  
Posted 2 years ago

yeah, I tried the following
None
but haven't managed to make it work yet

  
  
Posted 2 years ago

I wonder if I just need to join 2 docker-compose files to run everything in one session

Actually that could also work

But for reference, when I said IP I meant the actual host network IP, not 127.0.0.1 (which is the same as localhost)

  
  
Posted 2 years ago

seems like an issue with 2 compose apps using different networks which are not accessible from each other
I wonder if I just need to join 2 docker-compose files to run everything in one session
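
Or maybe attaching both projects to a shared external Docker network would do it without merging the files. Just a sketch, I haven't checked the service names against the actual yaml files (the clearml-server API container is usually called apiserver and listens on 8008):

# create a network both compose projects can join
docker network create clearml-shared

# add to both docker-compose files (compose format 3.5+)
networks:
  default:
    name: clearml-shared
    external: true

Then the serving containers could point at http://apiserver:8008 instead of localhost.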

  
  
Posted 2 years ago

you are right, for some reason it doesn't resolve inside a container

root@dd0252a8f93e:~/clearml# curl 

curl: (7) Failed to connect to localhost port 8008: Connection refused
root@dd0252a8f93e:~/clearml# curl 

curl: (7) Failed to connect to 127.0.0.1 port 8008: Connection refused
root@dd0252a8f93e:~/clearml# 
  
  
Posted 2 years ago

would IP help instead?

  
  
Posted 2 years ago

same thing

clearml-serving-inference     | Retrying (Retry(total=236, connect=236, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f899dc4e8b0>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login
  
  
Posted 2 years ago

Try the following example.env:

CLEARML_SERVING_PORT=9090
CLEARML_WEB_HOST="http://<IP>:8080"
CLEARML_API_HOST="http://<IP>:8008"
CLEARML_FILES_HOST="http://<IP>:8081"

(I think the localhost is resolved to inside the container and not the host machine, hence the error)
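
If you are not sure which IP to put there, use the machine's LAN address rather than 127.0.0.1, e.g. on Linux:

hostname -I | awk '{print $1}'   # first non-loopback IP of the host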

  
  
Posted 2 years ago

I have to step away for a couple of hours
please let me know if you find something wrong

  
  
Posted 2 years ago

my clearml.conf

api { 
    web_server: 

    api_server: 

    files_server: 

    # test 3
    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
}
  
  
Posted 2 years ago

maybe I'm missing something with credentials?

  
  
Posted 2 years ago

doesn't work anyway

  
  
Posted 2 years ago

oh, I see one error, let me check fast

  
  
Posted 2 years ago

serving

  
  
Posted 2 years ago

server

  
  
Posted 2 years ago

Okay this seems correct...
Can you share both yaml files (server & serving) and env file?

  
  
Posted 2 years ago

curl 

{"meta":{"id":"59bbb55b6ddc456092658ae588c9a436","trx":"59bbb55b6ddc456092658ae588c9a436","endpoint":{"name":"auth.login","requested_version":"2.18","actual_version":"1.0"},"result_code":401,"result_subcode":20,"result_msg":"Unauthorized (missing credentials)","error_stack":null,"error_data":{}},"data":{}}
  
  
Posted 2 years ago

What are you getting with:

curl http://<ip>:8008/auth.login
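
(If that returns 401 "missing credentials", the API server itself is reachable. As far as I remember, auth.login also accepts the credentials via HTTP Basic auth, so you can verify the keys with:

curl -u <access_key>:<secret_key> http://<ip>:8008/auth.login

and you should get back a token instead of the 401.)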
  
  
Posted 2 years ago

I tried that, it didn't work. I was confused by the separate port parameter:

CLEARML_SERVING_PORT: ${CLEARML_SERVING_PORT:-8080}

which is the only port-related setting in docker-compose-triton.yml
Can I test /auth.login somehow independently, using curl or any other way? Which address is it supposed to have and which creds should I use?

  
  
Posted 2 years ago

@<1523706266315132928:profile|DefiantHippopotamus88> seems like you are missing the ports 🙂

CLEARML_WEB_HOST="
"
CLEARML_API_HOST="
"
CLEARML_FILES_HOST="
"
  
  
Posted 2 years ago

do I need to change anything else?

  
  
Posted 2 years ago

and my ~/clearml.conf

api { 
    web_server: 

    api_server: 

    files_server: 

    # test 3
    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
} 
  
  
Posted 2 years ago

my example.env

CLEARML_WEB_HOST="
"
CLEARML_API_HOST="
"
CLEARML_FILES_HOST="
"
CLEARML_API_ACCESS_KEY="91SFEX4BYUQ9YCZ9V6WP"
CLEARML_API_SECRET_KEY="4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
CLEARML_SERVING_TASK_ID="450231049bba42f69c6507cb774f7dc6"
  
  
Posted 2 years ago

I changed the port here:

clearml-serving-inference:
    image: allegroai/clearml-serving-inference:latest
    container_name: clearml-serving-inference
    restart: unless-stopped
    ports:
      - "9090:8080"
  
  
Posted 2 years ago

Hi @<1523706266315132928:profile|DefiantHippopotamus88>
The idea is that clearml-server acts as a control plane and can sit on a different machine; obviously you can run both on the same machine for testing. Specifically, it looks like clearml-serving is not configured correctly, as the error points to an issue with the initial handshake/login between the Triton containers and the clearml-server. How did you configure the clearml-serving docker-compose?

  
  
Posted 2 years ago