DefiantHippopotamus88
2 Questions, 55 Answers
Active since 10 January 2023
Last activity 8 months ago
Reputation 0
Badges 1
51 × Eureka!
Is it possible to create a serving endpoint with Pytorch JIT file in web interface only?
2 years ago
Hi, I am trying to run clearml-server and clearml-serving locally to create an inference endpoint that uses the Triton server. So far I have had port issues, so I changed cl...
2 years ago
0
Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?
I tried to switch off auto-refresh, but it doesn't help
2 years ago
0
Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?
I got it working with a full port reassignment to 9090 in clearml-serving-inference,
which still sent me an error saying the format of my request was somehow wrong,
but then I started from scratch by creating a completely new project and a new endpoint
2 years ago
0
Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?
I don't know why it requests localhost
2 years ago
0
Is It Possible To Create A Serving Endpoint With Pytorch Jit File In Web Interface Only?
CLEARML_FILES_HOST="
"
2 years ago
0
Hi, I Try To Run Locally
and my ~/clearml.conf
api {
    web_server:
    api_server:
    files_server:
    # test 3
    credentials {
        "access_key" = "91SFEX4BYUQ9YCZ9V6WP"
        "secret_key" = "4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
    }
}
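The host values above were stripped when the config was pasted. For reference, a minimal sketch of how the api section usually looks for a locally hosted clearml-server, assuming the stock ports (8080 web, 8008 API, 8081 fileserver) and placeholder credentials rather than the poster's actual values:

api {
    # assumed defaults for a local clearml-server; adjust if the ports were remapped
    web_server: http://localhost:8080
    api_server: http://localhost:8008
    files_server: http://localhost:8081
    credentials {
        "access_key" = "<access_key>"
        "secret_key" = "<secret_key>"
    }
}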
2 years ago
0
Hi, I Try To Run Locally
seems true
root@9f6a74ab9a27:~/clearml# curl
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>405 Method Not Allowed</title>
<h1>Method Not Allowed</h1>
<p>The method is not allowed for the requested URL.</p>
root@9f6a74ab9a27:~/clearml# curl
curl: (7) Failed to connect to localhost port 8081: Connection refused
root@9f6a74ab9a27:~/clearml#
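The URLs in the two curl calls above were stripped from the snippet. As a hedged sketch, one way to check which server ports are reachable is to send a HEAD request to each of the stock clearml-server ports (the host below is a placeholder; from inside a container, localhost may not reach the server at all, which is likely what the connection refused on 8081 reflects):

# probe the default clearml-server ports with HEAD requests
curl -I http://<clearml-server-host>:8080/   # web UI
curl -I http://<clearml-server-host>:8008/   # API server
curl -I http://<clearml-server-host>:8081/   # fileserver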
2 years ago
0
Hi, I Try To Run Locally
how do you start docker-compose?
docker-compose --env-file example.env -f docker-compose.yml up -d
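Once the stack is up, a couple of standard docker-compose commands can confirm the containers are actually running (the service name below is taken from the log prefixes in this thread, not verified against the compose file):

# list the state of every service in this compose project
docker-compose --env-file example.env -f docker-compose.yml ps
# follow the inference container's logs
docker-compose --env-file example.env -f docker-compose.yml logs -f clearml-serving-inference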
2 years ago
0
Hi, I Try To Run Locally
curl
{"meta":{"id":"59bbb55b6ddc456092658ae588c9a436","trx":"59bbb55b6ddc456092658ae588c9a436","endpoint":{"name":"auth.login","requested_version":"2.18","actual_version":"1.0"},"result_code":401,"result_subcode":20,"result_msg":"Unauthorized (missing credentials)","error_stack":null,"error_data":{}},"data":{}}
2 years ago
0
Hi, I Try To Run Locally
I got only something like this:
clearml-serving-triton | I0701 08:32:58.580705 46 server.cc:250] Waiting for in-flight requests to complete.
clearml-serving-triton | I0701 08:32:58.580710 46 server.cc:266] Timeout 30: Found 0 model versions that have in-flight inferences
clearml-serving-triton | I0701 08:32:58.580713 46 server.cc:281] All models are stopped, unloading models
clearml-serving-triton | I0701 08:32:58.580717 46 server.cc:288] Timeout 30: Found 0 live ...
2 years ago
0
Hi, I Try To Run Locally
I have to step away for a couple of hours
please let me know if you find something wrong
2 years ago
0
Hi, I Try To Run Locally
my example.env
CLEARML_WEB_HOST="
"
CLEARML_API_HOST="
"
CLEARML_FILES_HOST="
"
CLEARML_API_ACCESS_KEY="91SFEX4BYUQ9YCZ9V6WP"
CLEARML_API_SECRET_KEY="4WTXT7tAW3R6tnSi8hzSKNjgkmgUoyv22lYT2FIzIfLoeGERRO"
CLEARML_SERVING_TASK_ID="450231049bba42f69c6507cb774f7dc6"
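The host values above were stripped from the paste. For comparison, a minimal example.env sketch for clearml-serving pointing at a locally hosted clearml-server, assuming the stock ports and placeholder keys rather than the poster's real values:

CLEARML_WEB_HOST="http://<clearml-server-host>:8080"
CLEARML_API_HOST="http://<clearml-server-host>:8008"
CLEARML_FILES_HOST="http://<clearml-server-host>:8081"
CLEARML_API_ACCESS_KEY="<access_key>"
CLEARML_API_SECRET_KEY="<secret_key>"
# ID of the serving service task (created with clearml-serving create)
CLEARML_SERVING_TASK_ID="<serving_service_task_id>"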
2 years ago
0
Hey,
hi WickedElephant66
I have the same issue, but the port is not the only problem
https://clearml.slack.com/archives/CTK20V944/p1656446563854059
2 years ago