CostlyElephant1
Moderator
6 Questions, 16 Answers
  Active since 27 January 2023
  Last activity 8 months ago

Reputation: 0

Badges: 1 (16 × Eureka!)
0 Votes · 11 Answers · 727 Views · 11 months ago
0 Votes · 4 Answers · 863 Views · 9 months ago
0 Votes · 4 Answers · 756 Views · 11 months ago
0 Votes · 1 Answer · 1K Views · one year ago
0 Votes · 6 Answers · 1K Views · one year ago
0 Votes · 3 Answers · 856 Views · one year ago
0 Hello Clearml Community! I Have A Question Regarding Batch Inference Using A Base Docker Image In Clearml. I Have A Docker Image That Is Configured With The Necessary Environment And Code For The Task At Hand. My Goal Is To Generate And Enqueue A Task Uti

Hi again @<1523701070390366208:profile|CostlyOstrich36>
One further question: is it also possible to avoid using a git project? (given that the code could already be present inside the docker container)
What would be the benefits of setting it in the ClearML task config instead? Thanks again!
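For reference, a minimal sketch of what such a repo-less task might look like, assuming the script is already baked into the image. All names, paths, and the image tag below are hypothetical placeholders, not values from this thread, and this is only a sketch of the pattern, not a confirmed recipe:

```python
def create_repoless_task(script_path, docker_image):
    """Hypothetical sketch: register a ClearML task that runs a script
    already present inside the docker image, with no git repo attached.
    All arguments are placeholders."""
    from clearml import Task  # deferred import so the sketch stays self-contained

    return Task.create(
        project_name="batch-inference",       # placeholder project name
        task_name="base-inference-task",      # placeholder task name
        script=script_path,                   # e.g. a path valid inside the container
        docker=docker_image,                  # e.g. "my-registry/inference:latest"
    )
```

The task created this way would then be executed by a clearml-agent running in docker mode, with the image providing both the environment and the code.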

11 months ago
0 Hi Again! I Am Doing Batch Inference From A Parent Task (That Defines A Base Docker Image). However, I've Encountered An Issue Where The Task Takes Several Minutes (Approximately 3-5 Minutes) Specifically When It Reaches The Stage Of "Environment Setup Co

Hi @<1523701205467926528:profile|AgitatedDove14>
Yes, it was indeed in our code! After digging deeper, loading the .cu and .cpp files turned out to be the root of the issue, slowing down the batch inference. Thanks a lot for your support!!

8 months ago
0 Hi There! I Had A Question Regarding Batch Inference With Clearml. I Would Like To Serve A Model Using An Inference Task (Containing The Model And The Code To Perform The Inference) As A Base To Be Cloned And Edited (Change Input Arguments), And Queue To

Hi Damjan, thank you for your message.
But if I understand correctly, that doc would be great for online serving. I am looking for a solution for batch inference instead.
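For context, the clone-and-edit-and-enqueue pattern described in the question might be sketched roughly like this. The task ID, parameter name, and queue name are placeholders, not values from this thread, and the sketch assumes the base task exposes its input arguments as standard ClearML parameters:

```python
def run_batch_inference(base_task_id, input_path, queue_name="default"):
    """Hypothetical sketch: clone a base inference task (model + code),
    override its input arguments, and enqueue the clone so a
    clearml-agent picks it up and executes it."""
    from clearml import Task  # deferred import so the sketch stays self-contained

    # Clone the template task containing the model and inference code
    cloned = Task.clone(source_task=base_task_id, name="batch-inference-run")
    # Edit the input arguments of the clone ("Args/input_path" is a placeholder)
    cloned.set_parameters({"Args/input_path": input_path})
    # Queue the clone for execution by an agent
    Task.enqueue(cloned, queue_name=queue_name)
    return cloned
```

Each batch run would then be a fresh clone with its own arguments, leaving the base task untouched as a template.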

one year ago
0 Hi Again! I Am Doing Batch Inference From A Parent Task (That Defines A Base Docker Image). However, I've Encountered An Issue Where The Task Takes Several Minutes (Approximately 3-5 Minutes) Specifically When It Reaches The Stage Of "Environment Setup Co

Also, as can be seen in the docker args, I tried using CLEARML_AGENT_SKIP_PIP_VENV_INSTALL and CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL to avoid installing packages, since the container already contains everything needed to run the task, but I am not sure it had any effect.
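As an illustration, these variables are typically passed into the container's environment. The image name and interpreter path below are placeholders, and the exact values expected by each variable should be checked against the clearml-agent documentation; this is only a hedged sketch of the pattern:

```shell
# Hypothetical container invocation: skip the agent's python environment
# setup entirely, or point it at a preexisting interpreter instead of
# creating a virtualenv (image and path are placeholders).
docker run \
  -e CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1 \
  -e CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=/usr/bin/python3 \
  my-registry/inference:latest
```

When the image already bundles every dependency, skipping the environment setup step is what removes the per-task package installation time.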

11 months ago
0 Hi Again! I Am Doing Batch Inference From A Parent Task (That Defines A Base Docker Image). However, I've Encountered An Issue Where The Task Takes Several Minutes (Approximately 3-5 Minutes) Specifically When It Reaches The Stage Of "Environment Setup Co

Hi @<1523701205467926528:profile|AgitatedDove14> !
Thanks again for following up on this thread.

Perhaps the delay is hard to spot in the log, but it starts right after "Starting Task Execution:":

Environment setup completed successfully
Starting Task Execution:

The next log entry appears 2 minutes after the environment setup completed:
1702378941039 box132 DEBUG 2023-12-12 11:02:16,112 - clearml.model - INFO - Selected model id: 9be79667ca644d7dbdf26732345f5415

So, the environment is creat...

9 months ago
0 Hi Again! I Am Doing Batch Inference From A Parent Task (That Defines A Base Docker Image). However, I've Encountered An Issue Where The Task Takes Several Minutes (Approximately 3-5 Minutes) Specifically When It Reaches The Stage Of "Environment Setup Co

Hello @<1523701205467926528:profile|AgitatedDove14> , thank you for addressing my concern. It seems that skipping the venv is functioning correctly, and everything within the container is properly configured to start. However, there is still a delay of approximately 2 minutes between the completion of setup (the console log line "Starting Task Execution") and the actual start of the inference logic. During this period, no additional logs ar...

9 months ago
0 Hi Again! I Am Doing Batch Inference From A Parent Task (That Defines A Base Docker Image). However, I've Encountered An Issue Where The Task Takes Several Minutes (Approximately 3-5 Minutes) Specifically When It Reaches The Stage Of "Environment Setup Co

Hi @<1523701205467926528:profile|AgitatedDove14> would you be so kind as to take a look at this issue?
We still have 2 minutes between the log line

Starting Task Execution:

and the actual start of our inference logic. We have no extra info in the log that would help us improve this slow task start time.
Thanks a lot for any feedback!
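One generic way to localize such a gap (not ClearML-specific, and purely an illustrative sketch) is to print a timestamp as the very first line of the entry script, before any heavy imports, so the log shows whether the time is spent before the script starts or while modules load:

```python
# Hypothetical first lines of the inference entry script: emit a timestamp
# immediately, before any heavy imports, so time spent loading modules
# (e.g. compiled .cu/.cpp extensions) becomes visible in the task log.
import time

banner = f"entry script reached at {time.time():.0f}"
print(banner)

# heavy imports (torch, custom extensions, model code) would go below,
# so their load time shows up after the banner line in the log
```

If the banner appears immediately after "Starting Task Execution:", the delay lies in the imports below it; if the banner itself is late, the delay precedes the script.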

9 months ago