Hi There, Another Triton-Related Question: Are We Able To Deploy


Hi there!

Technically there should be nothing stopping you from deploying a Python-backend model. I just checked the source code, and ClearML basically just downloads the model artifact and renames it based on the inferred model type.

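The snippet itself didn't survive here, but in spirit that download-and-rename step looks something like the sketch below. To be clear: this is a simplified illustration of the behaviour, not the actual clearml-serving source, and all names are made up.

```python
# Simplified sketch of the behaviour described above -- NOT the actual
# clearml-serving source; function and variable names are made up.
# The idea: download the model artifact, then rename/copy it into the Triton
# model repository under the filename that matches the inferred framework.
from pathlib import Path
import shutil


def place_model(downloaded_artifact: str, framework: str, version_dir: str) -> Path:
    framework = (framework or "").lower()
    if "pytorch" in framework:
        target_name = "model.pt"
    elif "onnx" in framework:
        target_name = "model.onnx"
    else:
        raise ValueError(f"Don't know how to serve framework: {framework!r}")

    # Works for single-file artifacts; a packaged folder would be extracted instead.
    target = Path(version_dir) / target_name
    target.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy(downloaded_artifact, target)
    return target
```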

As far as I'm aware (could definitely be wrong here!), the Triton Python backend essentially requires a folder containing e.g. a model.py file (see the layout sketch after the list below). I propose the following steps:

  • Given the behaviour above, if you package the model.py file inside a folder and register that folder as the model in ClearML, clearml-serving will detect this and simply extract the folder into the right place for you. You then have to adjust the config.pbtxt via the command-line arguments so that the Python file is loaded properly (a rough packaging sketch follows below the list)
  • If this does not work, an extra if/else check should be added to that code, also checking for "python" in the framework, similar to e.g. the pytorch or onnx branches (also sketched below)
  • However it is done, once the Python file is in the right place and the config.pbtxt is set up properly, Triton should just take it from there and everything should work as expected
    Could you try this approach? If this works, it would be an interesting example to add to the repo! Thanks 😄
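
For reference, a Python-backend entry in the Triton model repository normally looks like the layout below, with a minimal model.py implementing the TritonPythonModel class. The tensor names are placeholders and have to match whatever ends up in config.pbtxt (which also needs backend: "python"):

```python
# Expected layout in the Triton model repository (standard Triton convention):
#
#   model_repository/
#   └── my_python_model/
#       ├── config.pbtxt      # must contain: backend: "python"
#       └── 1/
#           └── model.py
#
# Minimal model.py for the Triton Python backend:
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def initialize(self, args):
        # args["model_config"] holds the parsed config.pbtxt as a JSON string.
        self.model_config = args["model_config"]

    def execute(self, requests):
        responses = []
        for request in requests:
            # "INPUT0" / "OUTPUT0" are placeholder tensor names; they must match
            # the input/output sections declared in config.pbtxt.
            input0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            output0 = pb_utils.Tensor("OUTPUT0", input0.as_numpy())
            responses.append(pb_utils.InferenceResponse(output_tensors=[output0]))
        return responses

    def finalize(self):
        # Optional: clean up resources when the model is unloaded.
        pass
```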
  
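And a rough sketch of the packaging in step 1, i.e. registering the folder containing model.py as a single packaged model in ClearML. I haven't run this end to end, so treat the names and the "python" framework string as assumptions:

```python
# Hypothetical sketch: upload the folder containing model.py as a single
# packaged model artifact that clearml-serving can later extract into the
# Triton model repository. Project/model/folder names are placeholders.
from clearml import Task, OutputModel

task = Task.init(project_name="serving examples", task_name="register python-backend model")

# The "python" framework string is an assumption on my side; the important part
# is that update_weights_package() uploads the whole directory as one package.
model = OutputModel(task=task, name="my_python_model", framework="python")
model.update_weights_package(weights_path="my_python_model_dir")  # folder with model.py inside
```

The endpoint itself would then be added with the clearml-serving CLI, e.g. `clearml-serving --id <service id> model add --engine triton --endpoint "my_python_model" --model-id <model id> ...`, where the extra arguments are what end up in config.pbtxt; the exact flags depend on the clearml-serving version, so `clearml-serving model add --help` is the authority there.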
  
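In terms of the first sketch above, step 2 would boil down to one extra branch so that a model whose framework mentions "python" ends up as model.py in the version folder. Again purely illustrative, not a patch against the real code:

```python
# Purely illustrative: the same framework-to-filename mapping with an extra
# "python" branch, so a Python-backend artifact is placed as
# <model_repository>/<endpoint>/<version>/model.py.
def triton_filename_for(framework: str) -> str:
    framework = (framework or "").lower()
    if "pytorch" in framework:
        return "model.pt"
    if "onnx" in framework:
        return "model.onnx"
    if "python" in framework:  # the proposed extra check
        return "model.py"
    raise ValueError(f"Don't know how to serve framework: {framework!r}")
```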