JealousArcticwolf24
Moderator
3 Questions, 14 Answers
  Active since 03 May 2024
  Last activity 6 months ago

Reputation: 0
Badges: 14 × Eureka!
0 Votes · 17 Answers · 637 Views
7 months ago
0 Votes · 4 Answers · 674 Views
Hi! I have a question about the data management part of ClearML. Does ClearML support data versioning like LakeFS? Is it similar? Maybe there is some interest...
7 months ago
0 Votes · 3 Answers · 678 Views
6 months ago
0 Hi! I have a question about the data management part of ClearML. Does ClearML support data versioning like in LakeFS? Is it similar? Maybe there are some interesting pros and cons?

Hi @<1523701205467926528:profile|AgitatedDove14>
Actually, I'm trying to understand: is the ClearML infrastructure better than a common popular stack like DVC/LakeFS + MLflow + Kubeflow/Airflow?
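For context, the ClearML dataset versioning being compared to LakeFS here works roughly like the sketch below. This is a minimal illustration, not a definitive setup: the project/dataset names and the `./data` path are placeholders, and an installed `clearml` package plus a configured server are assumed.

```python
def create_dataset_version(parent_id=None):
    """Create a new (immutable) dataset version, optionally derived from a parent.

    Sketch only: assumes `clearml` is installed and clearml.conf points at a
    server; dataset/project names and the ./data path are placeholders.
    """
    from clearml import Dataset  # imported lazily so the sketch stays self-contained

    ds = Dataset.create(
        dataset_name="my-dataset",
        dataset_project="data-management",
        parent_datasets=[parent_id] if parent_id else None,
    )
    ds.add_files(path="./data")  # only the delta vs. the parent is uploaded
    ds.upload()                  # push file contents to the configured storage
    ds.finalize()                # like a commit: the version becomes immutable
    return ds.id
```

Chaining versions via `parent_datasets` is what gives the commit-graph feel that LakeFS users expect.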

7 months ago
0 Hello! I'm currently using clearml-server as an artifact manager and clearml-serving for model inference, with each running on separate hosts using Docker Compose. I've successfully deployed a real-time inference model in clearml-serving, configured withi

@<1523701205467926528:profile|AgitatedDove14> Thank you for the answer! So I will be able to log everything in the same Grafana? And I don't need to run another docker-compose with a new clearml-serving inference container?

7 months ago

@<1523701205467926528:profile|AgitatedDove14> Looks like it's not so easy. I run the model in an independent container, but I can't find its metrics in Grafana. Should I add this service to the docker-compose, or what? If I should add it to the docker-compose, how can I add new models without rebuilding the whole docker-compose? Or do I just need to add configs into the env of my Dockerfile?

7 months ago

@<1523701205467926528:profile|AgitatedDove14> Thanks! Actually, I already solved the problem: I built another docker-compose (with only clearml-serving inference) using an external network, connected it to the first docker-compose where I'm running the whole bunch of containers with Grafana and Prometheus, and now I'm able to use two different envs in two different containers while logging to the same Prometheus and Grafana.
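The external-network setup described above can be sketched in the second stack's compose file roughly as follows. The network name `clearml-backend` and the image tag are assumptions, not taken from the actual files: in practice you would use the network the first compose stack really created (typically `<project-dir>_default`).

```yaml
# Second docker-compose.yml (clearml-serving inference only) -- sketch.
services:
  clearml-serving-inference:
    image: allegroai/clearml-serving-inference:latest
    networks:
      - clearml-backend

networks:
  clearml-backend:
    external: true  # join the first stack's network instead of creating a new one
```

Because the serving container joins the existing network, the first stack's Prometheus can scrape it by service name without rebuilding anything.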

6 months ago
0 Hi! Maybe someone will be able to help me. I'm trying to build a system: step 0: build the pipeline (run locally); step 1: execute the TaskScheduler; step 2: execute the pipeline (remotely). I tried to use the agent services as in the image, but it doesn't work. I'm ab

@<1523701205467926528:profile|AgitatedDove14> Hi! Once again:
No, I'm able to run the pipeline only with the .run_locally() method.
I can start the task, but every time I get an error.
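Since the question is about getting a TaskScheduler to launch a pipeline remotely, the usual pattern looks roughly like this sketch. The queue names and the task id are placeholders, and it assumes `clearml` is installed, the server is configured, and agents are listening on both queues.

```python
def schedule_pipeline_run(pipeline_task_id: str):
    """Run a TaskScheduler that periodically re-launches a pipeline controller task.

    Sketch only: assumes `clearml` is installed and configured, and agents
    listen on the `services` and `default` queues; the task id and queue
    names are placeholders.
    """
    from clearml.automation import TaskScheduler

    scheduler = TaskScheduler()
    scheduler.add_task(
        schedule_task_id=pipeline_task_id,  # id of the pipeline controller task
        queue="default",                    # queue the pipeline is enqueued to
        minute=30,                          # re-launch every 30 minutes
    )
    # Run the scheduler itself on the services agent instead of locally;
    # running the pipeline only with .run_locally() suggests no agent is
    # serving the target queue.
    scheduler.start_remotely(queue="services")
```

If this still fails, the error usually points at the queue no agent is actually consuming from.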

6 months ago