AgitatedDove14
Moderator
49 Questions, 8126 Answers
  Active since 10 January 2023
  Last activity one year ago

0 I’M Trying To Use Minio With Clearml As A External Storage. I Am Having Problems With The Configuration File For The Clearml Client When I Use The Output_Uri Parameter Of Task.Init What Do I Put There? I Am Currently Doing Task.Init(… Output_Uri=“S3://I

@ThickSeaurchin47 can you try the artifacts example, and in this line do:

task = Task.init(project_name='examples', task_name='Artifacts example', output_uri="s3://...")
2 years ago
0 I’M Trying To Use Minio With Clearml As A External Storage. I Am Having Problems With The Configuration File For The Clearml Client When I Use The Output_Uri Parameter Of Task.Init What Do I Put There? I Am Currently Doing Task.Init(… Output_Uri=“S3://I

clearml python version: 1.9.1

could you upgrade to 1.9.3 and try?

Minio is on the same server and the 9000 and 9001 ports are open for tcp

Just to be clear, the machine that runs your ClearML code can in fact access MinIO on port 9000?

I tested with the latest and everything seems to work as expected.
BTW: regarding "bucket-name", make sure it complies with the S3 naming standard; as a test, try changing it to just "bucket" without hyphens
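If it helps, a quick way to check both points (the port is reachable and the bucket name is accepted) from the machine running the ClearML code is to hit MinIO directly with boto3; the endpoint, keys and bucket below are placeholders:

import boto3
from botocore.client import Config

# Placeholder MinIO endpoint and credentials - replace with your own
s3 = boto3.client(
    's3',
    endpoint_url='http://127.0.0.1:9000',
    aws_access_key_id='MINIO_ACCESS_KEY',
    aws_secret_access_key='MINIO_SECRET_KEY',
    config=Config(signature_version='s3v4'),
)

# Fails immediately if the host/port is unreachable or the credentials are wrong
print(s3.list_buckets()['Buckets'])

# Raises if the bucket name is rejected or the bucket does not exist
s3.head_bucket(Bucket='bucket')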

2 years ago
0 Hi

Yey! BTW: what is the setup you are running it with? Does it include "manual" tasks? Do you also report on completed experiments (not just failed ones)? Do you filter by iteration numbers?

5 years ago
0 If I Have A Dataset And I Process It And I Want The Processed Data As Another Dataset, Is Parent The Right Approach?

Parent makes sense if you are changing the data of the parent version but some of it is preserved, which lets the delta-based storage store only the diff.
If everything is different and you call sync, for example, then it will not reference any previous "snapshot", so there will be no redundancy in storage, but you still get a pointer to the "parent" version.
Make sense?
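As a rough sketch of that parent flow with the Dataset API (the project, dataset names and local folder are placeholders):

from clearml import Dataset

# The raw / parent version (placeholder project and dataset names)
parent = Dataset.get(dataset_project='examples', dataset_name='raw-data')

# Create a child version pointing at the parent, so unchanged files are stored only once
child = Dataset.create(
    dataset_project='examples',
    dataset_name='processed-data',
    parent_datasets=[parent.id],
)

# Sync the processed folder: files identical to the parent snapshot are referenced, not re-uploaded
child.sync_folder(local_path='./processed')
child.upload()
child.finalize()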

4 years ago
0 What Happens To File That Are Downloaded To A Remote_Execution Via Storagemanager? Are They Removed At The End Of The Run, Or Does It Continuously Increases Disk Space?

Hmm, so what I'm thinking is "extending" the capabilities of the "configuration" section (as it seems this is the right context): allowing you to upload a bunch of files (with the same mechanism as artifacts) as zip files, and in the editable configuration section keep the URL storing the zip together with the target folder. wdyt?
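Not that feature itself, but a hedged sketch of approximating the idea with the current API (the artifact name, folder and target path are made up):

from clearml import Task

task = Task.init(project_name='examples', task_name='bundled configuration files')

# Uploading a folder as an artifact packs it into a zip
task.upload_artifact(name='config_bundle', artifact_object='./config_files', wait_on_upload=True)

# Keep the zip URL and the intended target folder in the editable configuration section
task.connect_configuration(
    name='config_bundle',
    configuration={
        'url': task.artifacts['config_bundle'].url,
        'target_folder': '/opt/app/config',
    },
)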

3 years ago
0 Getting This Error At

No worries 🙂

4 years ago
0 Hello Everyone! I Have A Problem With Clearml. Could You Please Help Me? I Have 2 Little Projects With Total 31 Experiments. And Its 837Mb Metric Stored. Where Can I Find A Detail Information About This Memory Quota Spending? I Really Don'T Understand, Wh

Oh I see, yes the "metrics" include scalars, plots & console outputs.
I also think they are updated only once a day (or maybe twice a day?), so even if you delete them it will take a while for the quota to update.
(Archiving is not deleting; you then need to go to the archived view and delete from there.)
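If it is useful, a hedged sketch of deleting archived experiments programmatically - deleting (unlike archiving) is what actually removes the stored metrics; the project name is a placeholder and the filter format is my assumption:

from clearml import Task

# Placeholder project; the system_tags filter is assumed to match archived experiments
archived = Task.get_tasks(project_name='my-project', task_filter={'system_tags': ['archived']})

for t in archived:
    # delete() removes the task together with its logged scalars, plots and console output
    t.delete()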

one year ago
0 Good Morning, I'M Wondering If Someone Has Any Advice/Experience Configuring Clearml-Agent To Include Private Packages From Aws Codeartifact? So Far I Know I Have To Edit The

Is there a way to detect the repository when initialising a task?

SuperficialGrasshopper36 This should have happened automatically when you call Task.init()
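For reference, a minimal illustration of what "automatically" means here; nothing repository-related needs to be passed in:

from clearml import Task

# Run from inside the cloned repository: the repo URL, branch, commit and
# uncommitted diff are detected and stored on the task by Task.init() itself
task = Task.init(project_name='examples', task_name='repo detection')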

4 years ago
0 I Know I Can Run This Manually In Step By Step But Wondering If This Can Be Automated As Scheduled Tasks

DAG which gets scheduled at a given interval and

Yes, exactly, this will be part of the next iteration of the controller/service

An example achieving what I propose would be greatly helpful

Would this help?
from trains.automation import TrainsJob

job = TrainsJob(base_task_id='step1_task_id_here')
job.launch(queue_name='default')
job.wait()

job2 = TrainsJob(base_task_id='step2_task_id_here')
job2.launch(queue_name='default')
job2.wait()
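A hedged sketch of wrapping that two-step chain in a plain interval loop until the scheduling support lands (the interval and task IDs are placeholders):

import time
from trains.automation import TrainsJob

def run_dag():
    # Step 1, then step 2, each enqueued and waited on
    job = TrainsJob(base_task_id='step1_task_id_here')
    job.launch(queue_name='default')
    job.wait()

    job2 = TrainsJob(base_task_id='step2_task_id_here')
    job2.launch(queue_name='default')
    job2.wait()

# Re-run the chain once a day
while True:
    run_dag()
    time.sleep(24 * 60 * 60)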

5 years ago
0 What’S The Point Of Tracking Artifacts Dynamically?

Registering some metadata as a model doesn’t feel correct to me.

Yes I'm with you 🙂
BTW what kind of meta-data would need versions during the lifetime of a Task?

4 years ago
0 Hi All! Let'S Say I Have Two Functions Decorated With

Only those components that are imported in the script where the pipeline is defined would be included in the DAG plot, is that right?

Actually, the way it works currently (and we might change it if there is a better way) is that every time you call PipelineDecorator.component, a new component is stored on the Pipeline Task, which is later translated into a DAG graph and table (the next version will have a very nice UI to display / edit them).
The idea is first to have a representation of the p...
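To make that concrete, a minimal hedged sketch of two decorated components wired into a pipeline (the project, names and return values are made up):

from clearml.automation.controller import PipelineDecorator

# Every call to a @PipelineDecorator.component function registers a component on the Pipeline Task
@PipelineDecorator.component(return_values=['data'])
def prepare_data():
    return list(range(10))

@PipelineDecorator.component(return_values=['total'])
def sum_data(data):
    return sum(data)

@PipelineDecorator.pipeline(name='demo pipeline', project='examples', version='0.0.1')
def main():
    data = prepare_data()
    total = sum_data(data)
    print(total)

if __name__ == '__main__':
    # Run the components locally instead of enqueuing them on agents
    PipelineDecorator.run_locally()
    main()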

4 years ago
0 So I Bumped Onto This Comparison Shared By Dagshub. It Kinda Placed Clearml Is A Rather Bad Position Compared To Everything Else In The Industry.

Please feel free to do so (always better to get it from a user, not the team behind the product 😉)

4 years ago
0 Hi, I Try To Optimize My Hyperparamters With

Hmm ConvincingSwan15

WARNING - Could not find requested hyper-parameters ['Args/patch_size', 'Args/nb_conv', 'Args/nb_fmaps', 'Args/epochs'] on base task

Is this correct? Can you see these arguments on the original Task in the UI (i.e. the Args section, parameter epochs)?
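For reference, a hedged sketch of how those names line up: with argparse on the base task, each argument shows up in the Args section and is referenced as 'Args/<name>' by the optimizer (the task ID, metric names and range are placeholders):

from clearml import Task
from clearml.automation import HyperParameterOptimizer, UniformIntegerParameterRange

task = Task.init(project_name='examples', task_name='HPO', task_type=Task.TaskTypes.optimizer)

optimizer = HyperParameterOptimizer(
    base_task_id='BASE_TASK_ID',  # placeholder: the task whose Args section holds "epochs"
    hyper_parameters=[
        # 'Args/epochs' must match an argument visible on the base task in the UI
        UniformIntegerParameterRange('Args/epochs', min_value=5, max_value=50, step_size=5),
    ],
    objective_metric_title='validation',
    objective_metric_series='loss',
    objective_metric_sign='min',
    execution_queue='default',
)

optimizer.start()
optimizer.wait()
optimizer.stop()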

4 years ago
0 Hi. Is It Possible To Run Pipelines Clearml Using Yaml Manifests Like Kubeflow Style?

Regarding the YAML, how would you pass data? Like the pipeline-from-tasks example?

3 years ago
0 Quick Question About Concurrency And The Serving Pipeline, If I Have Request A Sent And Its Being Processed, And Then Send Request B While A Is Processing, Will The Serving Pipeline Start Processing (I.E. Run

Hi @TimelyRabbit96
It should process the new request B as well (this is a multi-threaded / async implementation)
Is this consistent with what you are seeing?

one year ago
0 With

So, what I am referring to is the ability of a system to allow some rigor and robustness of tracking of experiments, and also enforcing some thoughts on how things might be deployed, early on in the development process, whilst not being overly prescriptive and cumbersome

I cannot agree more!!
VivaciousPenguin66 We are working on trying to better understand how to solve this very delicate act of balance and offer some sort of "JIRA" for ML.
If this is okay with you, once product pe...

4 years ago
0 What Is Being Stored Exactly In

Ohh... I would not delete them then ... 😞
Maybe some kind of heuristic (files created more than a week ago can be deleted?!)

3 years ago
0 Hi! I Have Local Minio Setup, Via Minio Browser I Can Upload 50-100 Mb Per Second As Its Local. But When I Try To Use Task.Upload_Artifact It Uploads 500 Kb Per Second. Does Anyone Have An Idea About This?

Thanks MuddyCrab47 !!!
I found it!
It turns out the artifact upload will always upload from a stream (i.e. no multipart upload). I will make sure we fix it in the next RC (I think the plan is to have it out this weekend)

5 years ago
0 I Have Used Aws S3 And Minio As Storage For Clearml Artifacts. But Has Anyone Used Nexus As A Storage ?

DeliciousBluewhale87 basically any solution that is compliant with the S3 protocol will work. An example:
output_uri="s3://HOST:PORT/bucket/folder"
Are you sure Nexus supports this protocol?
I "think" Nexus sits on top of a storage solution (like an object storage), meaning we can use the same storage solution Nexus is using.
Just to clarify, we do not support the artifactory protocol Nexus provides for storing models/artifacts. But we do support it as a source for python packages used by the a...

4 years ago