Answered
Hello, I'm new here. I found this error when testing my TensorFlow/Keras model. I already created the model endpoint by running the command 'clearml-serving --id <service_id> model add --engine triton --endpoint "model_name" ... '. My TensorFlow/Keras model endpoint is also reachable/connected. However, the example Keras model from ClearML works just fine. Does anyone have a solution?
Here is the error screenshot. Thank you :)

Posted one year ago

Answers 7


NICE! MoodyCentipede68 this is awesome 🙂

Posted one year ago

MoodyCentipede68 from your log:

clearml-serving-triton | E0620 03:08:27.822945 41 model_repository_manager.cc:1234] failed to load 'test_model_lstm2' version 1: Invalid argument: unexpected inference output 'dense', allowed outputs are: time_distributed

This seems to be the main issue: Triton is failing to load the model.
Does that make sense to you? How did you configure the endpoint model?
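
As a side note, a minimal way to check which output names Triton will accept is to load the saved Keras model and print its layer names. This is just a sketch; the model path below is a placeholder:

from tensorflow import keras

# Load the saved model ("test_model_lstm2.h5" is a placeholder path)
model = keras.models.load_model("test_model_lstm2.h5")
model.summary()  # lists every layer with its name

# The final layer's name is what should be passed as --output-name
print("output layer name:", model.layers[-1].name)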

Posted one year ago

MoodyCentipede68 can you post the full docker-compose log (from spinning it up until you get the error)?
You can just pipe the output to a file with:
docker-compose ... up > log.txt

Posted one year ago

Okay, here is the full docker-compose log, AgitatedDove14. Thank you.

Posted one year ago

clearml-serving --id cd4c615583394719b9019667068954bd model add --engine triton --endpoint "test_model_lstm2" --preprocess "preprocess.py" --name "train lstmae model - serving_model" --project "serving examples" --input-size 1 60 1 --input-name "lstm_input" --input-type float32 --output-size -1 60 1 --output-name "dense" --output-type float32

This is how I added my model and set the endpoint. Is it right, AgitatedDove14?

Posted one year ago

GG AgitatedDove14 IT WORKS!!! I changed the output name from "dense" to "time_distributed".

THANK YOU SO MUCH!!

Posted one year ago

Well, from the error it seems there is no output layer called "dense", hence Triton failing to find the layer returning the result. Does that make sense?
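
For illustration, here is a minimal sketch of why the output ends up named "time_distributed" rather than "dense": wrapping the final Dense layer in TimeDistributed makes the wrapper the output layer, and Keras names the output after it. The architecture below is only an assumption based on the shapes in the command above:

from tensorflow import keras
from tensorflow.keras import layers

# Assumed LSTM model matching --input-size 1 60 1 and --output-size -1 60 1
inputs = keras.Input(shape=(60, 1), name="lstm_input")
x = layers.LSTM(32, return_sequences=True)(inputs)
outputs = layers.TimeDistributed(layers.Dense(1))(x)  # wrapper auto-named "time_distributed"
model = keras.Model(inputs, outputs)

print(model.layers[-1].name)  # -> "time_distributed", the name Triton expects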

Posted one year ago