Answered
Hi, I Try To Run Locally

Hi, I'm trying to run clearml-server and clearml-serving locally to create an inference endpoint that utilizes a Triton server. So far I had port conflicts, so I changed the clearml-serving-inference outbound port to 9090. But after that I get the following error:

clearml-serving-triton        | Retrying (Retry(total=237, connect=237, read=240, redirect=240, status=240)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f02a2602250>: Failed to establish a new connection: [Errno 111] Connection refused')': /auth.login

Are there any best practices for running both services locally? What kind of configuration am I supposed to do?
I already tried setting ~/clearml.conf with an access_key and providing it in example.env, but it didn't help. Maybe I'm doing something wrong with the host:port configuration. Thanks!
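For reference, a minimal example.env for clearml-serving carries the server endpoints, credentials, and serving service ID. This is a sketch assuming a default local clearml-server deployment with its standard ports; adjust the hosts/ports if you remapped them:

```shell
# Assumes clearml-server runs locally with default ports
# (8080 web, 8008 API, 8081 files).
CLEARML_WEB_HOST="http://127.0.0.1:8080"
CLEARML_API_HOST="http://127.0.0.1:8008"
CLEARML_FILES_HOST="http://127.0.0.1:8081"
# Credentials generated in the ClearML web UI
CLEARML_API_ACCESS_KEY="<access_key>"
CLEARML_API_SECRET_KEY="<secret_key>"
# ID of the serving service task created with `clearml-serving create`
CLEARML_SERVING_TASK_ID="<serving_service_id>"
```

One caveat: inside a container, 127.0.0.1 points at the container itself, not the host, which is a common cause of a "Connection refused" on /auth.login when clearml-server runs in a separate docker-compose. Using the host's LAN IP (or host.docker.internal, where supported) in the `*_HOST` values often resolves it.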

  
  
Posted one year ago

Answers 60


that's strange, maybe you should upgrade it

  
  
Posted one year ago

When I run this it says it can't run multiple containers

  
  
Posted one year ago

Nope, seems like a docker-compose issue

  
  
Posted one year ago

does it work for you?

  
  
Posted one year ago

the way above works for me

  
  
Posted one year ago

No, I use
docker compose instead of docker-compose

  
  
Posted one year ago

how do you start docker-compose?

docker-compose --env-file example.env -f docker-compose.yml up -d
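As a side note on the two CLI forms mentioned in this thread: docker-compose (v1, a standalone binary) and docker compose (v2, a Docker CLI plugin) accept largely the same flags, so either of the following should work depending on which is installed; the file names are the ones already used above:

```shell
# Compose v1 (standalone binary)
docker-compose --env-file example.env -f docker-compose.yml up -d

# Compose v2 (docker plugin) -- note the space instead of the hyphen
docker compose --env-file example.env -f docker-compose.yml up -d
```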
  
  
Posted one year ago

Ignore the quotes; I tried it with the quotes first

  
  
Posted one year ago

It throws the same error

  
  
Posted one year ago

image

  
  
Posted one year ago

Now when I docker compose again

  
  
Posted one year ago

So I edited it in accordance with yours

  
  
Posted one year ago

it's supposed to have an access_key and secret_key, which should correspond to this file
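For context, the credentials block in ~/clearml.conf has this shape. This is a sketch assuming a default local clearml-server; the access_key/secret_key must match the ones placed in example.env:

```
api {
    web_server: http://127.0.0.1:8080
    api_server: http://127.0.0.1:8008
    files_server: http://127.0.0.1:8081
    credentials {
        "access_key" = "<access_key>"
        "secret_key" = "<secret_key>"
    }
}
```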

  
  
Posted one year ago

I should not edit anything in clearml.conf right?

  
  
Posted one year ago

except access_key of course, they should be yours

  
  
Posted one year ago

you should also use my example.env

  
  
Posted one year ago

image

  
  
Posted one year ago

can you share your log items?

  
  
Posted one year ago

I'm on Ubuntu

  
  
Posted one year ago

Are you using native Linux? Or WSL?

  
  
Posted one year ago

But I'm getting a timeout issue when I docker-compose up 😢

  
  
Posted one year ago

Yep this is fine

  
  
Posted one year ago

but it actually looks ok

  
  
Posted one year ago

I got only smth like this:

clearml-serving-triton        | I0701 08:32:58.580705 46 server.cc:250] Waiting for in-flight requests to complete.
clearml-serving-triton        | I0701 08:32:58.580710 46 server.cc:266] Timeout 30: Found 0 model versions that have in-flight inferences
clearml-serving-triton        | I0701 08:32:58.580713 46 server.cc:281] All models are stopped, unloading models
clearml-serving-triton        | I0701 08:32:58.580717 46 server.cc:288] Timeout 30: Found 0 live models and 0 in-flight non-inference requ
  
  
Posted one year ago

I haven't followed it so closely, but let me check

  
  
Posted one year ago

Did you get the same as well?

  
  
Posted one year ago

Hey, I tried your docker-compose.
After all the initial setup, clearml-serving-triton,
clearml-serving-statistics,
and clearml-serving-inference throw a read timeout error.

  
  
Posted one year ago

yeah, ok
but it didn't

  
  
Posted one year ago

It should also work with a host IP and two docker compose files.
I'm not sure where to push a PR for a unified docker compose?

  
  
Posted one year ago

I can make a PR if it works

  
  
Posted one year ago