DefiantHippopotamus88
Moderator
2 Questions, 55 Answers
  Active since 10 January 2023
  Last activity 25 days ago

Reputation: 0

Badges: 51 × Eureka!
0 Votes, 60 Answers, 425 Views
Hi, I try to run clearml-server and clearml-serving locally to create an inference endpoint that utilizes Triton server. So far I had port issues so I changed cl...
one year ago
0 Votes, 18 Answers, 515 Views
Is it possible to create a serving endpoint with a PyTorch JIT file in the web interface only?
one year ago
0 Hi, I Try To Run Locally

I don't think WEB_HOST is important, but what about FILE_HOST?
Do I need to change it accordingly?
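
(For context, a clearml-serving example.env mostly points the serving containers at the ClearML server and carries the credentials; the values below are only a sketch assuming a default local install, not the actual file from this thread:)

CLEARML_WEB_HOST="http://localhost:8080"
CLEARML_API_HOST="http://localhost:8008"
CLEARML_FILES_HOST="http://localhost:8081"
CLEARML_API_ACCESS_KEY="<your_access_key>"
CLEARML_API_SECRET_KEY="<your_secret_key>"
CLEARML_SERVING_TASK_ID="<your_serving_service_id>"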

one year ago
0 Hi, I Try To Run Locally

I have to step away for a couple of hours
please let me know if you find something wrong

one year ago
0 Hi, I Try To Run Locally

the way above works for me

one year ago
0 Hi, I Try To Run Locally

I tried that, it didn't work. I was confused by the separate port parameter:

CLEARML_SERVING_PORT: ${CLEARML_SERVING_PORT:-8080}

which is the only port-related setting in docker-compose-triton.yml
Can I test /auth.login somehow independently, using curl or any other way? Which address is it supposed to have and which creds should I use?
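
(A minimal way to probe /auth.login independently, assuming a stock local clearml-server whose API listens on port 8008 and using the access_key/secret_key pair from clearml.conf as HTTP basic auth; the placeholders are hypothetical:)

curl -u <access_key>:<secret_key> http://localhost:8008/auth.login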

one year ago
0 Hi, I Try To Run Locally

can you share your log items?

one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

curl -X POST " " -H "accept: application/json" -H "Content-Type: application/json" -d '{"url": " "}'
curl: (56) Recv failure: Connection reset by peer

one year ago
0 Hi, I Try To Run Locally

seems true

root@9f6a74ab9a27:~/clearml# curl 

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>405 Method Not Allowed</title>
<h1>Method Not Allowed</h1>
<p>The method is not allowed for the requested URL.</p>
root@9f6a74ab9a27:~/clearml# curl 

curl: (7) Failed to connect to localhost port 8081: Connection refused
root@9f6a74ab9a27:~/clearml# 
one year ago
0 Hi, I Try To Run Locally

I'm on Ubuntu

one year ago
0 Hi, I Try To Run Locally

you should also use my example.env

one year ago
0 Hi, I Try To Run Locally

except access_key and secret_key, of course; those should be yours

one year ago
0 Hi, I Try To Run Locally

it's supposed to have access_key and secret_key, which should correspond to this file

one year ago
0 Hi, I Try To Run Locally

yeah, ok
but it didn't

one year ago
0 Hi, I Try To Run Locally

I can make a PR if it works

one year ago
0 Hi, I Try To Run Locally

I only got something like this:

clearml-serving-triton        | I0701 08:32:58.580705 46 server.cc:250] Waiting for in-flight requests to complete.
clearml-serving-triton        | I0701 08:32:58.580710 46 server.cc:266] Timeout 30: Found 0 model versions that have in-flight inferences
clearml-serving-triton        | I0701 08:32:58.580713 46 server.cc:281] All models are stopped, unloading models
clearml-serving-triton        | I0701 08:32:58.580717 46 server.cc:288] Timeout 30: Found 0 live ...
one year ago
0 Hi, I Try To Run Locally

I haven't followed it so closely, but let me check

one year ago
0 Hi, I Try To Run Locally

but it actually looks ok

one year ago
0 Hi, I Try To Run Locally

How do you start docker-compose?

docker-compose --env-file example.env -f docker-compose.yml up -d
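
(Presumably the Triton variant mentioned earlier in this thread would be started the same way, just pointing at the other compose file:)

docker-compose --env-file example.env -f docker-compose-triton.yml up -d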
one year ago
0 Hi, I Try To Run Locally

does it work for you?

one year ago
0 Hi, I Try To Run Locally

that's strange, maybe you should upgrade it

one year ago
0 Hi, I Try To Run Locally

and my ~/clearml.conf

api { 
    web_server: 

    api_server: 

    files_server: 

    # test 3
    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
} 
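
(The web/api/files URLs above were stripped from the post; a sketch of what the api section typically looks like for a stock local clearml-server with default ports, which is an assumption about this particular setup:)

api {
    web_server: http://localhost:8080
    api_server: http://localhost:8008
    files_server: http://localhost:8081
    credentials {
        "access_key" = "<your_access_key>"
        "secret_key" = "<your_secret_key>"
    }
}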
one year ago
0 Hey,

hi WickedElephant66
I have the same issue, but port is not the only problem
https://clearml.slack.com/archives/CTK20V944/p1656446563854059

one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

In my understanding, requests still go through clearml-server, whose configuration I left intact. Maybe due to the port change in clearml-serving I need to adjust something.

one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

I made it work with a full port reassignment to 9090 in clearml-serving-inference,
which still sent me an error that the format of my request was somehow wrong,
but then I started from scratch by creating a completely new project and a new endpoint
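
(For illustration, a hypothetical edit to the clearml-serving-inference service in the compose file that such a port reassignment might amount to; the actual change made in this thread isn't shown:)

clearml-serving-inference:
  environment:
    CLEARML_SERVING_PORT: 9090
  ports:
    - "9090:9090"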

one year ago
0 Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?

curl -X POST " " -H "accept: application/json" -H "Content-Type: application/json" -d '{"url": " "}'
{"detail":"Error processing request: Error: Failed loading preprocess code for 'py_code_test_model_pytorch2': 'NoneType' object has no attribute 'loader'"}

one year ago