ComfortableShark77
Moderator
4 Questions, 17 Answers
  Active since 10 January 2023
  Last activity 8 months ago

Reputation: 0
Badges: 1
17 × Eureka!
0 Votes 4 Answers 1K Views
Hello everyone. I have no idea why the clearml-serving inference server tries to get the model from that URL (pic 1), but in the ClearML UI I have the correct URL (pic 2). Coul...
2 years ago
0 Votes 20 Answers 1K Views
2 years ago
0 Votes 15 Answers 1K Views
And one more question. How can I get the loaded model in the Preprocess class in ClearML Serving?
2 years ago
0 Votes 3 Answers 1K Views
Hello everyone! Can I hide some fields of my config file for a task in the UI? I log the file with task.connect_configuration("config.yml"), and my file looks like that
2 years ago
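A minimal sketch of one way to approach the question above: load the YAML, mask the sensitive keys, and pass the resulting dict to task.connect_configuration, which accepts a dict as well as a file path. The SECRET_KEYS names and the config.yml layout are assumptions, not from the thread.

```python
# Hypothetical sketch: mask sensitive fields before logging the config to ClearML.
import yaml
from clearml import Task

SECRET_KEYS = ["aws_secret_key", "db_password"]  # assumed field names

task = Task.init(project_name="train_pipeline", task_name="log_masked_config")

with open("config.yml") as f:
    config = yaml.safe_load(f)

for key in SECRET_KEYS:
    if key in config:
        config[key] = "***"  # mask the value but keep the key visible in the UI

# connect_configuration accepts a dict, not only a file path
task.connect_configuration(config, name="config.yml")
```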
0 Hello everyone. I have no idea why the clearml-serving inference server tries to get the model from that URL (pic 1), but in the ClearML UI I have the correct URL (pic 2). Could you help me with this?

clearml-serving --id my_service_id model add --engine triton --endpoint "test_ocr_model" --preprocess "preprocess.py" --name "test-model" --project "clear-ml-test-serving-model" --input-size 1 3 384 384 --input-name "INPUT__0" --input-type float32 --output-size 1 -1 --output-name "OUTPUT__0" --output-type int32
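For context only, not from the thread: once an endpoint like test_ocr_model is registered, it is queried over HTTP. The sketch below assumes the clearml-serving inference container is reachable at 127.0.0.1:8080 and that preprocess.py expects a JSON body with an "image" field; both are assumptions.

```python
# Hypothetical request against the endpoint registered above.
# Assumes the inference service listens on 127.0.0.1:8080 and that
# preprocess.py expects a JSON body with an "image" field.
import requests

url = "http://127.0.0.1:8080/serve/test_ocr_model"
payload = {"image": "<base64-encoded image>"}  # placeholder body

response = requests.post(url, json=payload, timeout=30)
print(response.status_code, response.json())
```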

2 years ago
0 And one more question. How can I get the loaded model in the Preprocess class in ClearML Serving?

AgitatedDove14 My model has a generate method that I would like to call. How can I get the automatically loaded model from the Preprocess object? Preprocess file:
from typing import Any, Callable, Optional

from transformers import TrOCRProcessor
import numpy as np


# Notice Preprocess class Must be named "Preprocess"
class Preprocess(object):

    def __init__(self):
        self.processor = TrOCRProcessor.from_pretrained("microsoft/trocr-small-printed")

    def preprocess(self, body: dict, sta...
2 years ago
0 And one more question. How can I get the loaded model in the Preprocess class in ClearML Serving?

AgitatedDove14 I need to call the generate method of my model, but by default it calls forward.
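This is not the answer given in the thread; it is a minimal sketch of one possible workaround, assuming the model is registered with clearml-serving's custom engine instead of Triton, so the Preprocess class loads the model itself and can call generate() directly. The use of VisionEncoderDecoderModel and the process() signature follow the custom-engine example pattern and are assumptions here.

```python
# Hypothetical custom-engine Preprocess that calls generate() instead of forward().
from typing import Any, Optional

from transformers import TrOCRProcessor, VisionEncoderDecoderModel


# Notice: the class must be named "Preprocess"
class Preprocess(object):
    def __init__(self):
        self.processor = TrOCRProcessor.from_pretrained("microsoft/trocr-small-printed")
        self._model = None

    def load(self, local_file_name: str) -> Optional[Any]:
        # With the custom engine, clearml-serving passes the downloaded model
        # path to load(), so inference can be invoked however we like later.
        self._model = VisionEncoderDecoderModel.from_pretrained(local_file_name)
        return self._model

    def process(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> Any:
        # Explicitly call generate() rather than the default forward() pass.
        pixel_values = self.processor(images=data, return_tensors="pt").pixel_values
        generated_ids = self._model.generate(pixel_values)
        return self.processor.batch_decode(generated_ids, skip_special_tokens=True)
```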

2 years ago
0 Hello everyone. I don't understand why my training is slower with TensorBoard connected than without it. I have some thoughts about it but I'm not sure. My internet traffic looks weird. I think this is because TensorBoard logs too much data on each batch and

frameworks = {'tensorboard': False, 'pytorch': False}
task = Task.init(
    project_name="train_pipeline",
    task_name="test_train_python",
    task_type=TaskTypes.training,
    auto_connect_frameworks=frameworks,
)
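The snippet above disables TensorBoard and PyTorch auto-logging entirely. A softer alternative, sketched below as an assumption rather than the thread's answer, is to keep TensorBoard but write per-batch scalars only every N steps, which reduces how much data ClearML has to upload.

```python
# Hypothetical sketch: throttle TensorBoard scalar logging to every N batches.
from torch.utils.tensorboard import SummaryWriter

log_every_n = 50  # assumed interval; tune to your batch time
writer = SummaryWriter()

for step in range(1000):        # stand-in for the real training loop
    loss = 1.0 / (step + 1)     # dummy loss value for the sketch
    if step % log_every_n == 0:
        writer.add_scalar("train/loss", loss, step)

writer.close()
```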

2 years ago