Answered
Issue With Self-Hosted Clearml & Clearml-Serving: "Model Endpoints" Page Is Empty After Deploying An Endpoint.

Hey community!
I've run into an issue with my self-hosted ClearML and clearml-serving, deployed via docker-compose according to the documentation. After some deep diagnostics I seem to have hit a wall, and I'm hoping for your advice.
TL;DR: The "Model Endpoints" page is empty because the new clearml-server reads endpoint statuses from Redis, but the old allegroai/clearml-serving-inference doesn't know how to report its status there.
My Setup:

  • Server: clearml/server:latest — the newest version.
  • Serving: allegroai/clearml-serving-inference:latest — the image hasn't been updated in ~16 months.
  • The deployment was done following the official docker-compose.yml examples from the documentation.

Symptoms & What I've Checked:
  • ✅ The clearml-agent is working, and the service task is launched.
  • ✅ The inference container responds to direct curl requests to the model.
  • ✅ An API call to tasks.get_by_id for this task successfully returns all the data (status: "in_progress", etc.).
  • ❗️ An API call to serving.get_endpoints (which the UI uses) returns an empty array: {"data":{"endpoints":[]}}. Both API checks are sketched below.
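For reference, a minimal sketch of those two API checks using plain requests. It assumes the default apiserver port from the docker-compose examples (8008), basic auth with your API credentials, and a placeholder task ID; adjust all of these to your deployment.

    # Minimal reproduction of the two API checks (placeholder credentials and task ID)
    import requests

    api = "http://localhost:8008"                    # default apiserver port in the compose examples
    auth = ("<api_access_key>", "<api_secret_key>")  # your workspace API credentials

    # tasks.get_by_id returns the serving service task with status "in_progress"
    r = requests.post(f"{api}/tasks.get_by_id",
                      auth=auth, json={"task": "<serving-service-task-id>"})
    print(r.json()["data"]["task"]["status"])

    # serving.get_endpoints is what the "Model Endpoints" page calls; it comes back empty
    r = requests.post(f"{api}/serving.get_endpoints", auth=auth, json={})
    print(r.json())   # -> {"data": {"endpoints": []}, ...}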

The Cause (as I've discovered):

  • I analyzed the clearml-server code (/apiserver/bll/serving/__init__.py).
  • The logic for get_endpoints depends 100% on data in Redis (keys matching serving_container_*).
  • I checked Redis with the command SCAN 0 MATCH "serving_container_..." — it's empty. (That check is sketched below.)
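The same Redis check from Python, as a minimal sketch: it assumes Redis is reachable from wherever you run it (e.g. after exposing the port or exec'ing into the redis container) and that the apiserver uses db 0; adjust host/port/db to your compose setup.

    # Look for the serving_container_* keys that the server's get_endpoints logic reads
    import redis

    r = redis.Redis(host="localhost", port=6379, db=0)   # placeholder connection details
    keys = list(r.scan_iter(match="serving_container_*"))
    print(keys)   # empty list here, which is why the endpoints page shows nothing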

My question for you: Has anyone else encountered a similar incompatibility issue? What is the current best practice for deploying clearml-serving alongside an up-to-date clearml-server? The public images for serving seem to have fallen far behind the server. Is there any recommended path forward other than downgrading the server to version 0.17.5?
  
  
Posted 2 months ago

Answers


Hi @EagerKitten89, from my understanding the endpoints page currently only supports LLM endpoints; clearml-serving endpoints will be supported in the next clearml-serving release.

  
  
Posted 2 months ago