Answers 7


An example for something like spacy would be useful for the community.

That's awesome, any chance you can PR something? (No need for it to be perfect, we can take it from there.)

  
  
Posted 2 years ago

Thanks @<1523701205467926528:profile|AgitatedDove14>. For now I have forked clearml-serving locally and added an engine for spacy. It is working fine. Yeah, I think some documentation and a good example would make it more visible. An example for something like spacy would be useful for the community.

  
  
Posted 2 years ago

Hi @<1523704207914307584:profile|ObedientToad56>

What would be the right way to extend this with, let's say, a custom engine that is currently not supported?

as you said 'custom' 🙂
None
This is actually a custom engine (see (3) in the readme, and the preprocessing.py implementing it). I think we should actually add a specific example to custom so this is more visible. Any thoughts on what would be an easy one?
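For instance, a spaCy NER endpoint might look roughly like this (just a sketch of the custom-engine pattern; the request/response field names and the exact method signatures are assumptions, so check the preprocess template in the clearml-serving version you run):

```python
# Hypothetical preprocess.py for a spaCy NER endpoint on the "custom" engine.
# Field names ("text", "entities") and the trailing arguments are assumptions;
# verify against the preprocess template shipped with your clearml-serving version.
from typing import Any, Optional

import spacy


class Preprocess(object):
    def __init__(self):
        self._nlp = None

    def load(self, local_file_name: str) -> Optional[Any]:
        # load the spaCy pipeline from the registered model artifact (a path)
        self._nlp = spacy.load(local_file_name)
        return self._nlp

    def preprocess(self, body: dict, state: dict, collect_custom_statistics_fn=None) -> Any:
        # pull the raw text out of the request payload
        return body["text"]

    def process(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> Any:
        # custom engine: we run the inference ourselves
        doc = self._nlp(data)
        return [{"text": ent.text, "label": ent.label_} for ent in doc.ents]

    def postprocess(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> dict:
        # shape the response payload
        return {"entities": data}
```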

  
  
Posted 2 years ago

Hmm, thanks @<1523701087100473344:profile|SuccessfulKoala55>. What would be the right way that you would recommend for adding support for other models/frameworks like spacy?

Would you recommend adding other models by sending a PR in line with the lightgbm example here
None

or using the custom option and moving the logic for loading the model into preprocess or process?

  
  
Posted 2 years ago

So from what I see, the custom engine will basically call the preprocess() method defined in the Preprocess class you define.
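To make the call order concrete, here is a rough runnable toy (not the actual clearml-serving internals; the extra arguments follow my reading of the custom example, so treat them as assumptions):

```python
# Toy illustration of the call order as I understand it -- NOT clearml-serving
# internals. A trivial Preprocess stands in for a real model wrapper.
from typing import Any


class Preprocess(object):
    def load(self, local_file_name: str) -> Any:
        self._model = str.upper              # stand-in "model": uppercases text
        return self._model

    def preprocess(self, body: dict, state: dict, collect_custom_statistics_fn=None) -> Any:
        return body["text"]                  # request payload -> model input

    def process(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> Any:
        return self._model(data)             # custom engine: user code runs the inference

    def postprocess(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> dict:
        return {"output": data}              # model output -> response payload


processor = Preprocess()
processor.load("unused-path")                # called once with the model artifact path
state = {}                                   # per-request shared state
data = processor.preprocess({"text": "hello"}, state)
data = processor.process(data, state)
print(processor.postprocess(data, state))    # {'output': 'HELLO'}
```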

  
  
Posted 2 years ago

@<1523701087100473344:profile|SuccessfulKoala55> I saw in the examples one case of the engine being passed as custom.

None

My requirement is to support other frameworks, let's say spacy. So I was thinking maybe I could create a pipeline that does the model load and inference and pass that pipeline. I am still figuring out the ecosystem; would something like that make sense?
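To be concrete, the kind of wrapper I have in mind is something like this (plain spacy, not wired into clearml-serving yet; the model name and the output format are just placeholders):

```python
# Standalone sketch of the "pipeline" idea: one object owning model loading and
# inference, which the serving-side preprocess code could later delegate to.
# Model name and output format are placeholders.
import spacy


class SpacyPipeline:
    def __init__(self, model_name_or_path: str = "en_core_web_sm"):
        # load once up front (requires: python -m spacy download en_core_web_sm)
        self.nlp = spacy.load(model_name_or_path)

    def infer(self, text: str) -> dict:
        doc = self.nlp(text)
        return {
            "entities": [(ent.text, ent.label_) for ent in doc.ents],
            "tokens": [token.text for token in doc],
        }


if __name__ == "__main__":
    pipeline = SpacyPipeline()
    print(pipeline.infer("Apple is looking at buying a U.K. startup."))
```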

  
  
Posted 2 years ago

Hi @<1523704207914307584:profile|ObedientToad56>, I would assume that will require an integration of the engine into the clearml-serving code (and a PR 🙂).

  
  
Posted 2 years ago