AgitatedDove14
Moderator
48 Questions, 8049 Answers
  Active since 10 January 2023
  Last activity 6 months ago

Reputation: 0
Badges: 25 × Eureka!
0 When I Tried To Create A Clearml Serving Inference Endpoint For Yolov8, I Received The Following Error:

Hi MiniatureDragonfly17
These are the specific model input/output layer names.
When Triton analyzes a PyTorch model, the input layers are usually named
input__0, input__1, and so on, and the outputs output__0, output__1, etc. for the results:
You can see an example here:
None

--input-size 1 28 28 --input-name "INPUT__0" --input-type float32    --output-size -1 10 --output-name "OUTPUT__0" --outpu...
one year ago
0 Do I Understand Correctly, That Running

Hmmm are you saying the Dataset Tasks do not have the "dataset" system_tag as well as the type ?

3 years ago
0 How Come

Is this a common case? Maybe we should change the run_pipeline_steps_locally argument to False?
(The idea of run_pipeline_steps_locally=True is that it will be easier to debug the entire pipeline on the same machine)
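Something like this is roughly what I mean (a sketch only, assuming the start_locally() signature of PipelineController in recent ClearML releases; the project/step names are made up):
`
from clearml.automation import PipelineController

# hypothetical pipeline with a single step
pipe = PipelineController(name="debug-pipeline", project="examples")
pipe.add_step(name="stage_data", base_task_project="examples", base_task_name="data step")

# run the controller locally AND execute every step as a sub-process on this machine,
# which makes it easy to debug the entire DAG without any agents
pipe.start_locally(run_pipeline_steps_locally=True)

# once everything works, switch to run_pipeline_steps_locally=False (the default)
# so the steps are enqueued for clearml-agent workers instead
`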

3 years ago
0 Hi

GrievingTurkey78
Both are now supported, they basically act the same way 🙂
and log overrides + the final omegaconf

3 years ago
0 Hi, I Am Running Clearml Agent Using Sdk. When I Run A Remote Job On This Clearml Agent, The Venv Setup Is Totally Based On My Requirements.Txt Instead Of Adding On To What The Image Has Before. Why?

how can I start up the clearml agent using the clearml-agent image instead of SDK?

Not sure I follow, what do you mean instead of the SDK? and what is the "clearml-agent image" ?

one year ago
0 Hi All

In the UI, no 😞
From code, Yes 🙂

3 years ago
0 Hello Everyone! The Question About Dataset.Squash(). The Squash Operation Copies All The Data And Is No Longer Linked To Previous Commits? I Thought This Operation Is Like Git Squash But It Seems To Me That Clearml Dataset.Squash() Create Just A Copy Of S

Hi

The Squash operation copies all the data and is no longer linked to previous commits?

Yes, basically the idea is that if you have a data version that relies on many parents that need to be merged, squash will create a merged copy and push it all as a single version; and then yes, the parent versions are no longer needed.

I thought this operation is like git squash but it seems to me

yeah... we did not want to actually delete the parents because unlike git, the operation is done ...
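If it helps, the call itself is just a one-liner (a rough sketch, assuming the Dataset.squash() classmethod of the SDK; the names and ids below are made up):
`
from clearml import Dataset

# merge two parent versions into a single, self-contained version
# (hypothetical ids) - the parents are kept, only a merged copy is created
merged = Dataset.squash(
    dataset_name="my-data-merged",
    dataset_ids=["<parent_version_id_1>", "<parent_version_id_2>"],
)
print(merged.id)
`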

7 months ago
0 Hi, Expanding On

After it finishes the 1st Optimzation task, what's the next job which will be pulled ?

The one in the highest queue (if you have multiple queues)
If you use fairness it will pull in round-robin from all the queues (obviously inside every queue it is based on the order of the jobs).
fyi, you can reorder the jobs inside the queue from the UI 🙂
DeliciousBluewhale87 wdyt?

3 years ago
0 Hi There, I Have A Problem With Pyjwt: I Am Using

Sure. JitteryCoyote63 so what was the problem? can we fix something?

3 years ago
0 Does Trains 0.16 Supports Pip >=20.2?

JitteryCoyote63 is this still an issue?

4 years ago
0 For The Frameworks Which Are Supported In Built, Trains Stores The Trained Model As Output Model E.G. For Xgboost Here

PompousParrot44 the fundamental difference is that artifacts are uploaded manually (i.e. a user will specifically "ask" to upload an artifact), while models are logged automatically and a user might not want them uploaded (imagine debugging sessions, or testing).
By adding the 'upload_uri' argument, you can tell trains that you want all models to be automatically uploaded (not just logged).
Now here is the nice thing, when running using the trains-agent, you can have:
Always upload the mod...
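As a rough illustration of the difference (a sketch, assuming the output_uri argument of Task.init and the upload_artifact() method; the bucket path is made up):
`
from clearml import Task

# output_uri asks trains/clearml to also upload every auto-logged model,
# not just register it (hypothetical bucket)
task = Task.init(project_name='examples', task_name='xgboost example',
                 output_uri='s3://my-bucket/models')

# artifacts, by contrast, are only uploaded when explicitly requested
task.upload_artifact(name='eval_results', artifact_object={'accuracy': 0.93})
`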

4 years ago
0 I’M Trying To Use Minio With Clearml As A External Storage. I Am Having Problems With The Configuration File For The Clearml Client When I Use The Output_Uri Parameter Of Task.Init What Do I Put There? I Am Currently Doing Task.Init(… Output_Uri=“S3://I

ThickSeaurchin47 can you try the artifacts example:
None
and in this line do:

task = Task.init(project_name='examples', task_name='Artifacts example', output_uri="...")
one year ago
0 And One More Question. How Can I Get Loaded Model In Preporcess Class In Clearml Serving?

Ohh AbruptHedgehog21, if this is the case, why don't you store the model with torch.jit.save and use Triton to run the model?
See example:
https://github.com/allegroai/clearml-serving/tree/main/examples/pytorch
(BTW: if you want a full custom model serve, in this case you would need to add torch to the list of python packages)
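A minimal sketch of that export step, using the standard torch.jit APIs (the model and file name here are made up):
`
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))  # stand-in for your trained model
model.eval()

scripted = torch.jit.script(model)             # or torch.jit.trace(model, example_input)
torch.jit.save(scripted, "serving_model.pt")   # TorchScript file that Triton's pytorch backend loads
`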

2 years ago
0 Hello Everyone ! I Am Solving The Following Case: Let'S Say We Have A

Hi ExasperatedCrocodile76
This is quite the hack, but doable 🙂
`
# load the 'augmentations.py' file that was attached to the task as a configuration object
file_path = task.connect_configuration(name='augmentations', configuration='augmentations.py')

import importlib

module_name = 'augmentations'

# import the downloaded file as a regular python module
spec = importlib.util.spec_from_file_location(module_name, file_path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
`
https://stackoverflow.com/a/54956419

one year ago
0 Do You Have Any Base Image Recommendation To Install Clearml Python Library? I'M Getting Error With Pip On Python:3.9.11-Alpine Image.

Actually no it is not; alpine is not a good baseline, it is very slim and missing a ton of stuff.
I would use bullseye or slim (depending on how many aux things you need in the container):
https://hub.docker.com/_/python/tags?page=1&name=bullseye
https://hub.docker.com/_/python/tags?page=1&name=slim-bullseye

one year ago
0 Hello! I Add To Inject The Configuration Into Clearml With

GloriousPanda26 wouldn't it make more sense for a multi-run to create multiple experiments?

3 years ago
0 Hello Everyone! I'M Currently Trying To Set Up A Pipeline, And Am A Bit Confused At A Few Things. Some Questions I Have:

When I'm setting up my Pipeline, I can't go "here are some brand new tasks, please run them",

I think this is the main point. Can you create those Tasks via Task.create and get what you want? If so, then sure you can do that:
`
def create_step_task(a_node):
    task = Task.create(...)
    return task

pipe.add_step(
    name="stage_process",
    parents=["stage_data"],
    base_task_factory=create_step_task
)
`
wdyt?

As for the node, this confusing bit is that this is text from the docs...

one year ago
0 I Am Trying Pytorch Nightly Again With Python 3.10. Works Fine Locally, But Fails On Clearml-Agent In Docker Mode.

Seems like pip 20.1.1 has the issue, but >= 22.2.2 does not.

Notice we changed the value there; it now has two versions, one for Python < 3.10 and one for Python >= 3.10.
The main reason is that pip changed their resolving algorithm, and the new one can break its own dependencies (i.e. pip freeze > requirements.txt -> pip install might not actually work)
None

one year ago
0 Hi, I Am Saving Plt Chart To Clearml Using

Hmm, and when you zoom out, still cropped ?

3 years ago
0 Hi

Oh sure, that makes sense: clone the experiment in the UI (right click, clone), then everything is editable :) both the uncommitted changes and the branch/commit.

2 years ago
0 Hi All, I Have An Issue With The Way Hyper Parameters Are Logged Under Configuration, The Values That Are Stored Seem To Add Unnecessary Escape Characters To The Original Values.. Is It A Known Issue? Is There A Way To Change It? Thanks

Sorry found the code on the Task, duh 🙂
`
# get_ipython().magic('pip install clearml')
import clearml
from clearml import Task

task = Task.init(project_name='examples', task_name='test param', reuse_last_task_id=False)
param = {
    'tuple_double_quotes_r': (r"value\blah", 1),
    'tuple_double_quotes': ("value\blah", 1),
    'tuple_single_quotes': ('value\blah', 1),
    "double_quotes_r": r"value\blah",
    'double_quotes': "value\blah",
    'single_quotes': 'value\blah'
...

3 years ago
0 One More Thing, I'M Trying To Take Full Advantage Of The Controller, But I Run Into A Problem In My Use Case. The Controller Is Super Useful For Creating A Dag Of Tasks Which Is A Behaviour Of Interest. But Issues Rise When The Tasks Are Changing. Not On

That is exactly it: the trains-agent is replicating the code from the git repo and trying to apply the git diff (see the uncommitted changes section). Obviously it failed 🙂

4 years ago
0 What Is

BTW: if you need you can do the following:
`
from clearml import Task
from clearml.automation import PipelineController

task = Task.init(project_name='pipelines', task_name='pipeline test')
task.set_base_docker(...)

# the pipeline object is using the current Task, hence the docker image is set
pipe = PipelineController(...)

pipe.start()
`

3 years ago