AgitatedDove14
Moderator
48 Questions, 8049 Answers
Active since 10 January 2023
Last activity 6 months ago

Reputation: 0
Badges: 25 × Eureka!
0 Is There An Easy Way To Add A Link To One Of The Tasks Panels? (As An Artifact, Configuration, Info, Etc)? Edit: And Follow Up Regarding The Dataset. As Discussed Somewhere Previously, The Datasets Are Now Automatically Moved To A Hidden "Sub-Project" Pr

For now we've monkey-patched it to our use case:

LOL, that's a cool hack

That gives us the benefit of creating "local datasets" (confined to the scope of the project; they do not appear in the Datasets tab, but appear as normal tasks within the project).

So what would be a "perfect" solution here?
I think I'm missing the point on why it became an issue in the first place.
Notice that in new versions Datasets will be registered on the Tasks that use them (they are already...

2 years ago
0 Hi Everyone, I Wanted To Inquire If It's Possible To Have Some Type Of Model Unloading. I Know There Was A Discussion Here About It, But After Reviewing It, I Didn't Find An Answer. So, I Am Curious: Is It Possible To Explicitly Unload A Model (By Calling

SillyRobin38, out of curiosity, did you compare the performance of tensorrt-llm vs vllm?
(the jury is still out on that, just wondered if you had a chance)

8 months ago
0 Is There An Easy Way To Add A Link To One Of The Tasks Panels? (As An Artifact, Configuration, Info, Etc)? Edit: And Follow Up Regarding The Dataset. As Discussed Somewhere Previously, The Datasets Are Now Automatically Moved To A Hidden "Sub-Project" Pr

Why does ClearML hide the dataset task from the main WebUI?

Basically you have the details on the Dataset page; why should it be mixed with the others?

If I specified a project for the dataset, I specifically want it there, in that project, not hidden away in some .datasets hidden sub-project.

This may be a request for a "Dataset" tab under the project; why you would need the Dataset Task itself is the main question.

Not all dataset objects are equal, and perhap...

2 years ago
0 Is There An Easy Way To Add A Link To One Of The Tasks Panels? (As An Artifact, Configuration, Info, Etc)? Edit: And Follow Up Regarding The Dataset. As Discussed Somewhere Previously, The Datasets Are Now Automatically Moved To A Hidden "Sub-Project" Pr

Yes. Because my old issue has never been resolved (though closed), we use the dataset object to upload e.g. local files needed for remote execution.

Ohh no, I remember... Following this line, can I assume these files are reused, i.e. this is not "per instance"? I have to admit I have a feeling this is a very unique use case, and maybe the "old" way Datasets were shown is better suited?

No, I mean why does it show up in the task view (see attached image), forcing me to clic...

2 years ago
0 Hey Guys, Sorry For The Rapid Fire Questions In The Past Few Days. I Have Another Issue Though. I Initially Ran A Task, Directly From A Repo. It Successfully Installed The Requirements From The Requirements File In The Repo And Ran The Task Without Any Iss

You're suggesting that the false is considered a string and not a bool?

The clearml-server always stores the values as strings (serializing them); the casting is done when they are passed back to the code at runtime. The issue here is that there is actually no "way" to tell the argparser this is a boolean (basically any value that is passed is treated as a string). What I think we should do is fix the casting function so that if this is exactly the same value we use the default value (i.e. boole...
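The underlying pitfall is easy to reproduce; here is a minimal sketch (the flag name is hypothetical) of why a string coming back from the server cannot be cast with type=bool, plus the usual converter workaround:

```python
import argparse

def str2bool(value: str) -> bool:
    # bool("false") is True, since any non-empty string is truthy;
    # an explicit converter is needed to parse boolean-like strings.
    return str(value).strip().lower() in ("true", "1", "yes")

parser = argparse.ArgumentParser()
# With type=bool, the string "false" passed back at runtime
# would silently become True.
parser.add_argument("--use-gpu", type=str2bool, default=False)

args = parser.parse_args(["--use-gpu", "false"])
print(args.use_gpu)  # False, as intended
```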

2 years ago
0 Is There An Easy Way To Add A Link To One Of The Tasks Panels? (As An Artifact, Configuration, Info, Etc)? Edit: And Follow Up Regarding The Dataset. As Discussed Somewhere Previously, The Datasets Are Now Automatically Moved To A Hidden "Sub-Project" Pr

A definite maybe, they may or may not be used, but we'd like to keep that option

The precursor to the question is the idea of storing local files as "input artifacts" on the Task, which means that if the Task is cloned the links go with it. Let's assume for a second this is the case: how would you upload these artifacts in the first place?
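For reference, a minimal sketch of one way to do that with the standard artifact API (project, task, and file names are made up):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="attach local files")
# Uploads the local file to the files server and registers its URL
# on the Task, so the link travels with a cloned Task.
task.upload_artifact(name="local-config", artifact_object="config.yaml")
```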

2 years ago
0 Is There An Easy Way To Add A Link To One Of The Tasks Panels? (As An Artifact, Configuration, Info, Etc)? Edit: And Follow Up Regarding The Dataset. As Discussed Somewhere Previously, The Datasets Are Now Automatically Moved To A Hidden "Sub-Project" Pr

Hmm, maybe the right way to do this is to "abuse" models, which have their own entity: you can specify a system_tag on them, they can store a folder (and extract it if you need), they live in projects, and they are cloned and can be changed.
wdyt?
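A rough sketch of that idea with the public model API (project, names, and tags are hypothetical, and the exact system_tag handling may differ):

```python
from clearml import Task, OutputModel, Model

task = Task.init(project_name="examples", task_name="folder via model")

# Store a local folder as a model "weights package"; ClearML zips it up.
out = OutputModel(task=task, name="shared-assets", tags=["local-files"])
out.update_weights_package(weights_path="./assets")

# Later, from another task/script: look the model up and extract it.
candidates = Model.query_models(project_name="examples", tags=["local-files"])
local_dir = candidates[0].get_local_copy(extract_archive=True)
```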

2 years ago
0 What Is The Method For Packages Exploration When Using Conda? Agent Is Set To 'Conda' Mode. We Upload A Task From A Local Conda Env That (Obviously) Has Some Pip Packages As Well. When We Enqueue The Task To Run Remotely, Not All Conda Packages Are Instal

Let me try to add some color to this analysis process.
Basically clearml will try to statically analyze the code (i.e. look for import/from statements).
Then it will list them in pip requirements.txt format under "Installed packages".
When running inside a conda environment, it will check which packages were installed via "conda install" (instead of pip install) and mark them internally. This process ensures that when the clearml-agent is running with the conda package manager, it "knows" whic...
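Illustratively, the resulting "Installed packages" section is just a pip-style listing (package names and versions below are made up), with the conda-installed entries flagged internally:

```
# Python 3.9.16
kwcoco == 0.5.5
numpy == 1.24.2
torch == 1.13.1
```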

2 years ago
0 Hello! When Running

I am very confused now. I tried switching to my local machine and changing the clearml.conf.
It only partly worked:

Notice that Dataset.get(...) downloads an artifact that was uploaded before; basically it gets the full URL and downloads the data. It seems the original dataset was uploaded to "localhost:8081", could that be the case?
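If that is the case, pointing the SDK at the right files server only affects new uploads; already-registered artifacts keep the URL stored at upload time. A minimal clearml.conf sketch (the host is a placeholder):

```
api {
    # used for new uploads; existing dataset artifacts keep the
    # URL they were registered with (e.g. http://localhost:8081)
    files_server: http://my-clearml-server:8081
}
```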

2 years ago
0 I'm Trying To Understand How Clearml Serving Works And Trying To Set It Up. I Have An Agent Listening To The Serving Queue And I'm Trying To Set Up Clearml Serving To Launch On The Serving Queue. Do I Need To Have Clearml-Serving Installed On The Machine

can you tell me what the serving example is in terms of the explanation above and what the triton serving engine is,

Great idea!

This line actually creates the control Task (2):
clearml-serving triton --project "serving" --name "serving example"
This line configures the control Task (the idea is that you can do that even when the control Task is already running, but in this case it is still in draft mode).
Notice the actual model serving configuration is already stored on the crea...

2 years ago
0 Hi Everyone, Is There A Way To Avoid The Environment Setup When Running A Task Using A Worker? I Am Currently Using A Custom Docker Image That Already Has All The Required Packages Installed. I Tried Setting The Env Var

SteepDeer88
Try the following:
```
Task.add_requirements("pycocotools-windows", '; platform_system == "Windows"')
Task.add_requirements("pycocotools", '; platform_system != "Windows"')

Task.init(...)
```
You should then see in your "installed packages" something like:
```
pycocotools-windows ; platform_system == "Windows"
pycocotools ; platform_system != "Windows"
```

2 years ago
0 What Is The Method For Packages Exploration When Using Conda? Agent Is Set To 'Conda' Mode. We Upload A Task From A Local Conda Env That (Obviously) Has Some Pip Packages As Well. When We Enqueue The Task To Run Remotely, Not All Conda Packages Are Instal

It was installed by 'pip install kwcoco' while my conda env was active.

Well, I guess my question is: how does conda know where to install it from, if this is not on the public channels? Is there a specific conda channel you added (or preconfigured)?

2 years ago
0 What Is The Method For Packages Exploration When Using Conda? Agent Is Set To 'Conda' Mode. We Upload A Task From A Local Conda Env That (Obviously) Has Some Pip Packages As Well. When We Enqueue The Task To Run Remotely, Not All Conda Packages Are Instal

CrookedWalrus33, from the log it seems the code is trying to use "kwcoco", but it is not listed under "Installed packages", nor do you see any attempt to install it. Can you confirm?

2 years ago
0 I Seem To Be Missing Something ... I've Only Got One Task Running To Train A Segmentation Model On My Local Machine, And In A Few Days It's Hit Over 1.15M API Calls. It Looks Like It's Sending Every Single Console Output ... Are There Settings To Control

Would love to just cap it at a fixed amount for a month for API calls.

Try the timeout configuration; I think this should solve all your issues, and it will be fairly easy to set for everyone.
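Assuming the "timeout configuration" refers to the reporting flush period, a hedged clearml.conf sketch (the value is only an example):

```
sdk {
    development {
        worker {
            # seconds between report flushes; a larger value batches
            # more console lines/metrics into each API call
            report_period_sec: 30
        }
    }
}
```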

one year ago
0 I Wanted To Ask About K8S + Clearml-Agent Integration. Details In The Thread.

K8s + clearml-agent integration.

Hmm, is this an on-prem k8s cluster?

2 years ago
0 Hey Everyone, I'm Having An Issue Due To Conflicting Git Credentials On The Clearml-Agent (Running Inside The Docker). I'm Using SSH Settings (

Hi PleasantGiraffe85
Did you set git_host to point only to your host? Do you expect all the git clones to use SSH? What does the requirements.txt git link look like?
https://github.com/allegroai/clearml-agent/blob/bf07b7f76d3236c1118b81730c6d9718705a795a/docs/clearml.conf#L22
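For context, a minimal sketch of the setting referenced above (the host value is a placeholder):

```
agent {
    # limit the configured git credentials to this host only;
    # clones from other hosts are left to their own auth (e.g. SSH)
    git_host: "git.mycompany.com"
}
```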

2 years ago
0 Hi, I'm Facing Some Issues When Trying To Run A Pipeline. How Can I Import A Local Library Using Pipelines From Functions? Always Getting "ModuleNotFoundError: No Module Named"

Hmm, this means the step should have included the git repo itself, which means the code should have been able to import the .py file.
Can you see the link to the git repository on the Pipeline step Task?

2 years ago
0 Hi, We're Hosting Clearml On Our K8S Cluster, And I'm Running Into Problems With It... I've Set It Up In A Subdomain Way - App/Files/Api.Clearml.Mydomain... But I Have Some Issues With The SSL Certificate. When I Try Running

PleasantGiraffe85 you can disable the SSL verification on the client end:
https://github.com/allegroai/clearml-agent/blob/21c4857795e6392a848b296ceb5480aca5f98e4b/docs/clearml.conf#L12
Basically you can just manually create the clearml.conf with only the following:
```
api {
    api_server:
    web_server:
    files_server:
    credentials {"access_key": "EGRTCO8JMSIGI6S39GTP43NFWXDQOW", "secret_key": "x!XTov_G-#vspE*Y(h$Anm&DIc5Ou-F)jsl$PdOyj5wG1&E!Z8"}

    # verify...
```
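The truncated comment presumably refers to certificate verification; a minimal sketch of disabling it (only advisable with self-signed certificates):

```
api {
    # skip SSL certificate validation on the client side
    verify_certificate: false
}
```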
3 years ago
0 Hi Team, I'm Currently Trying To Install Clearml-Server On A PowerPC Server With RedHat7. The Issue Is That The Clearml-Server Pre-Built Images Don't Run On The PowerPC, So The Docker Containers Need To Be Rebuilt On The PowerPC Host. Is There Dockerfil

Hi Team, I'm currently trying to install ClearML-Server on a Powerpc server with RedHat7.

You are a brave man, LividCrab90!

Is there dockerfiles for the ClearML-Server stack somewhere?

The main issue is replacing the DB containers; do you have elastic/mongo/redis builds for PowerPC?

3 years ago