Hi, I am creating a pipeline from a function with dynamically created steps. E.g. if I pass the pipeline param Tune_Optime='Recall,Precision', my pipeline creates 2 tasks/steps, one for each trained model. Everything is working really nicely when I start the pipeline
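For illustration, a hedged sketch of the kind of dynamic-step setup described above (the project name, step names, and the "Args" parameter section are assumptions; reading the parameter back through `pipe._task` is the workaround discussed in the answers below):

```python
from clearml import PipelineController

def train_model(optimize_for: str):
    # stand-in for the real per-metric training step
    return optimize_for

pipe = PipelineController(name="dynamic-steps", project="examples", version="1.0")
pipe.add_parameter(name="Tune_Optime", default="Recall,Precision")

# reading the parameter back via the internal task object is the slightly
# "ugly" workaround mentioned in the answers; the "Args" section name is assumed
args = pipe._task.get_parameters_as_dict().get("Args", {})
for metric in str(args.get("Tune_Optime", "Recall,Precision")).split(","):
    # one task/step per requested metric
    pipe.add_function_step(
        name="train_{}".format(metric.strip().lower()),
        function=train_model,
        function_kwargs=dict(optimize_for=metric.strip()),
    )

pipe.start_locally(run_pipeline_steps_locally=True)
```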


Ad 1. Yes, I think this is a kind of bug. Using _task to get the pipeline input values is a little bit ugly

Good point, let's fix it 🙂

A new pipeline is built from scratch (all steps etc.), but clicking "NEW RUN" in the GUI just reuses the existing pipeline. Is that correct?

Oh, I think I understand what happens. The way the pipeline logic is built, the "DAG" is created the first time the code runs; then when you re-run the pipeline, it loads the serialized DAG from the Task/backend.
The initial thinking was that, at some point in the future, we want to allow you to easily edit the DAG in the UI, hence this behavior.
But, specifically here, we want the opposite.
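For context, the serialized DAG is stored as a configuration object on the pipeline's Task, so it can be inspected directly (a hedged sketch using the same internal `_task` and `_config_section` attributes as the workaround below):

```python
# assuming `pipe` is an existing PipelineController instance
stored_dag = pipe._task.get_configuration_object(name=pipe._config_section)
print(stored_dag)  # non-empty after the first run; this is what "NEW RUN" reuses
```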
As a temp hack you can add the following:
```python
print(pipe._task.get_parameters_as_dict())

# clear the stored DAG
pipe._task.set_configuration_object(name=pipe._config_section, config_text="")
```

  
  