ApprehensiveRaven81
Moderator
7 Questions, 15 Answers
  Active since 16 June 2023
  Last activity 2 years ago

Reputation: 0

Badges: 1
  15 × Eureka!
0 Votes 2 Answers 1K Views
Hi guys, the documentation says that clearml-serving "Support Deep Learning Models (TensorFlow, PyTorch, ONNX)", so I wonder if it does support Pyt...
2 years ago
0 Votes 1 Answers 1K Views
Also, do you guys support user authentication? In the documentation - Configuring ClearML Server | ClearML - it says that I have to modify the config file ...
2 years ago
0 Votes 0 Answers 2K Views
Hi guys, I wonder if clearml-serving supports configuring the maximum number of requests for each user.
2 years ago
0 Votes 11 Answers 1K Views
Hi guys, I want to add a type of message broker like Kafka to put incoming requests in a queue and monitor them. How can I integrate that with ClearML?
2 years ago
0 Votes 7 Answers 2K Views
Hi guys, I have a question regarding clearml-serving. I have deployed my model to an API; now I want to add a front-end interface for the URL, how should I g...
2 years ago
0 Votes 4 Answers 2K Views
2 years ago
0 Votes 7 Answers 2K Views
Hi everyone, if I use the free hosted server of ClearML, does it mean that I should get a public URL to run inference on my model when I deploy to this server?
2 years ago
0 Hi Guys, I Want To Add A Type Of Message Broker Like Kafka To Put Incoming Requests In A Queue And Monitor Them. How Can I Integrate That With Clearml?

Sorry, the question is a bit vague. I just want to know if ClearML has already integrated Kafka, or if I have to implement it myself.

2 years ago
0 Hi Guys, I Have A Question Regarding Clearml-Serving. I Have Deployed My Model To An Api, Now I Want To Add A Front End Interface For The Url, How Should I Go About Doing It?

@<1523701070390366208:profile|CostlyOstrich36> how should I implement my own front end? I mean, if I was using FastAPI, I can imagine coding HTML files and then linking them to the specific URL endpoint, but with ClearML I don't know where I should put the code for my front end.
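One way this could look (just a sketch, not an official ClearML pattern): a small FastAPI app of your own that serves the HTML page and proxies inference calls to wherever clearml-serving exposes your model. The serving URL and index.html below are placeholders.

```python
# Minimal sketch: your own FastAPI front end that forwards requests to the
# clearml-serving endpoint. SERVING_URL is a placeholder - use the URL your
# clearml-serving instance actually exposes for your model.
import httpx
from fastapi import FastAPI, UploadFile
from fastapi.responses import HTMLResponse

app = FastAPI()
SERVING_URL = "http://localhost:8080/serve/my_model"  # placeholder

@app.get("/", response_class=HTMLResponse)
def index():
    # Serve your hand-written upload/result page.
    with open("index.html") as f:
        return f.read()

@app.post("/predict")
async def predict(file: UploadFile):
    # Relay the uploaded file to the serving endpoint and return its response.
    async with httpx.AsyncClient() as client:
        resp = await client.post(SERVING_URL, content=await file.read())
    return resp.json()
```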

2 years ago
0 Hi Guys, I Have A Question Regarding Clearml-Serving. I Have Deployed My Model To An Api, Now I Want To Add A Front End Interface For The Url, How Should I Go About Doing It?

Another problem is that I just want to use clearml-serving to serve an already trained model; the training process is not tracked by ClearML, meaning the model is not registered on the Models tab. Is there any way to use clearml-serving to serve a model that is not tracked by ClearML? @<1523701070390366208:profile|CostlyOstrich36>
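If registering the model is the blocker, one option (a rough sketch using the standard clearml SDK; the project/task names and weights path are made up) is to attach the existing weights file to a task as an output model, so it shows up in the Models tab and can then be referenced by clearml-serving:

```python
# Rough sketch: register an already-trained weights file with ClearML so it
# appears in the Models tab. Names and paths here are illustrative only.
from clearml import Task, OutputModel

task = Task.init(project_name="serving-demo", task_name="register-pretrained-model")

output_model = OutputModel(task=task, framework="PyTorch")
# Uploads the local weights file and records it as this task's output model.
output_model.update_weights(weights_filename="model.pt")
```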

2 years ago
0 Hi Guys, I Have A Question Regarding Clearml-Serving. I Have Deployed My Model To An Api, Now I Want To Add A Front End Interface For The Url, How Should I Go About Doing It?

I'm trying to build an image segmentation tool, so I expect that the front end will allow users to upload images, get their segmented images, and have an option to annotate the images if the results are not good enough.

2 years ago
0 Hi Guys, I Have A Question Regarding Clearml-Serving. I Have Deployed My Model To An Api, Now I Want To Add A Front End Interface For The Url, How Should I Go About Doing It?

Using the API interface, users should be able to upload an image for the model to run inference on and get the result image back.
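For illustration, the client side of that could be as simple as the sketch below, assuming the endpoint accepts raw image bytes in the request body and returns the result image; the URL and content type are assumptions, not taken from the docs.

```python
# Sketch of a client upload: POST the image bytes to the serving endpoint and
# save whatever image comes back. URL and content type are placeholders.
import requests

SERVING_URL = "http://localhost:8080/serve/segmentation_model"  # placeholder

with open("input.png", "rb") as f:
    resp = requests.post(SERVING_URL, data=f.read(),
                         headers={"Content-Type": "image/png"})
resp.raise_for_status()

with open("result.png", "wb") as out:
    out.write(resp.content)
```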

2 years ago
0 Hi Everyone, If I Use The Free Hosted Server Of Clearml, Does It Mean That I Should Get A Public Url To Run Inference On My Model When I Deploy To This Server?

@<1523701070390366208:profile|CostlyOstrich36> ClearML offers a free tier server, right? My questions are:

  • Can I deploy to this server? I.e., use hardware from this server instead of from my machine.
  • If so, when I do deploy on the ClearML server, how can I get a public URL to run inference?
2 years ago
0 Hi Guys, I Want To Add A Type Of Message Broker Like Kafka To Put Incoming Requests In A Queue And Monitor Them. How Can I Integrate That With Clearml?

I see the architecture diagram for clearml-serving has a Kafka part, and when I run an example following the README file I can also see a Kafka container running on my machine, but I couldn't find instructions for accessing that service, while you guys have instructions for using other services, such as Prometheus and Grafana.
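In case it helps, here is one way to peek at that Kafka container with kafka-python; the topic name and broker address below are placeholders, so check the clearml-serving docker-compose configuration for the actual values.

```python
# Sketch: consume messages from the Kafka container started by clearml-serving.
# Topic name and broker address are placeholders, not confirmed defaults.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "clearml-serving-stats",             # placeholder topic name
    bootstrap_servers="localhost:9092",  # placeholder broker address
    auto_offset_reset="earliest",
)

for message in consumer:
    # message.value is the raw payload bytes published by the serving container.
    print(message.value)
```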

2 years ago
0 Hi Guys, I Want To Add A Type Of Message Broker Like Kafka To Put Incoming Requests In A Queue And Monitor Them. How Can I Integrate That With Clearml?

I want to set up a queue for requests: incoming requests will first go to this queue, we can assign which request goes to which worker, and we can also report the status of each request back to the clients: in queue, being processed, completed, etc.
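Just to illustrate the pattern being described (not something clearml-serving provides out of the box as far as this thread shows), a producer-side sketch with kafka-python might look like this; the topic name, broker address, and in-memory status store are all illustrative.

```python
# Sketch of the described pattern: enqueue requests on a Kafka topic and track
# their status. Topic/broker names and the status store are illustrative only;
# a real setup would keep statuses in a shared database or cache.
import json
import uuid
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

request_status = {}  # request_id -> "in queue" / "being processed" / "completed"

def submit_request(payload, worker_id):
    request_id = str(uuid.uuid4())
    producer.send("inference-requests", {   # placeholder topic name
        "request_id": request_id,
        "worker_id": worker_id,             # which worker should pick it up
        "payload": payload,
    })
    request_status[request_id] = "in queue"
    return request_id
```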

2 years ago