Hi there! Can anybody help me with specifying the 'platform' for a model in clearml-serving? I am using the K8s clearml-serving setup (version 1.3.1). I already tried a bunch of variants like
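
For orientation, one way the platform is typically supplied is at registration time via the --aux-config flag of "clearml-serving model add" (assuming that flag is available in this version). A minimal sketch; the service/model IDs, tensor names and shapes below are placeholders, not the poster's actual values:

clearml-serving --id <service_id> model add --engine triton --endpoint "test_model_pytorch" --model-id <model_id> \
    --input-size 1 28 28 --input-name "INPUT__0" --input-type float32 \
    --output-size -1 10 --output-name "OUTPUT__0" --output-type float32 \
    --aux-config platform=\"pytorch_libtorch\" default_model_filename=\"model.pt\"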


Hi @<1523701205467926528:profile|AgitatedDove14>, now something interesting is happening: as I wrote before, I got the error message, but one minute later the model was added successfully nonetheless. The log says:

E0603 09:43:01.652550 41 model_repository_manager.cc:996] Poll failed for model directory 'test_model_pytorch': Invalid model name: Could not determine backend for model 'test_model_pytorch' with no backend in model configuration. Expected model name of the form 'model.<backend_name>'.
I0603 09:44:01.654376 41 model_lifecycle.cc:459] loading: test_model_pytorch:1
I0603 09:44:02.619246 41 libtorch.cc:1983] TRITONBACKEND_Initialize: pytorch
I0603 09:44:02.619271 41 libtorch.cc:1993] Triton TRITONBACKEND API version: 1.10
I0603 09:44:02.619278 41 libtorch.cc:1999] 'pytorch' TRITONBACKEND API version: 1.10
I0603 09:44:02.619304 41 libtorch.cc:2032] TRITONBACKEND_ModelInitialize: test_model_pytorch (version 1)
W0603 09:44:02.619939 41 libtorch.cc:284] skipping model configuration auto-complete for 'test_model_pytorch': not supported for pytorch backend
I0603 09:44:02.620389 41 libtorch.cc:313] Optimized execution is enabled for model instance 'test_model_pytorch'
I0603 09:44:02.620404 41 libtorch.cc:332] Cache Cleaning is disabled for model instance 'test_model_pytorch'
I0603 09:44:02.620411 41 libtorch.cc:349] Inference Mode is disabled for model instance 'test_model_pytorch'
I0603 09:44:02.620418 41 libtorch.cc:444] NvFuser is not specified for model instance 'test_model_pytorch'
I0603 09:44:02.620474 41 libtorch.cc:2076] TRITONBACKEND_ModelInstanceInitialize: test_model_pytorch (CPU device 0)
I0603 09:44:02.665851 41 model_lifecycle.cc:693] successfully loaded 'test_model_pytorch' version 1
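
The earlier Poll error means Triton found neither a backend/platform entry in the model configuration nor a model name it could infer the backend from (hence "Expected model name of the form 'model.<backend_name>'"). For the libtorch backend, a minimal config.pbtxt could look like the sketch below; the tensor names and dims are placeholders, not values taken from this model:

# minimal sketch of a config.pbtxt for the PyTorch (libtorch) backend
name: "test_model_pytorch"
platform: "pytorch_libtorch"
default_model_filename: "model.pt"
input [
  {
    name: "INPUT__0"
    data_type: TYPE_FP32
    dims: [ 1, 28, 28 ]
  }
]
output [
  {
    name: "OUTPUT__0"
    data_type: TYPE_FP32
    dims: [ -1, 10 ]
  }
]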

So why is it that no loading process is started for the models I try to register?
