ShaggySwan64
Moderator
5 Questions, 15 Answers
Active since 16 March 2023
Last activity 6 months ago

Reputation: 0

Badges: 1 (15 × Eureka!)
0 Votes 8 Answers 557 Views
10 months ago
0 Votes 6 Answers 501 Views
Hi everyone, I noticed a weird issue with pipeline tasks. I have a task in my project with configuration that includes several parameters of different types ...
6 months ago
0 Votes 4 Answers 773 Views
Hello, are there any plans to add support for the pdm package manager? It's what we use since the poetry dependency solver is quite slow and it would be neat to have d...
11 months ago
0 Votes 1 Answer 1K Views
one year ago
0 Votes 6 Answers 716 Views
11 months ago
0 Hi, I'm experiencing some fairly slow uploads of a new dataset version. I'm running a local server and I'm uploading a ~20GB update to a ~30GB dataset consisting of a few hundred files, each up to several hundred MB. It seems that compressing and upload i...

On the original 30GB dataset it took just a few seconds to go from uploading the last chunk of data to "File compression and upload completed", so I find it weird that the upload of the update hangs indefinitely while processing, without utilizing the disk at all.
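For context, the update flow is roughly the standard dataset-versioning pattern; the dataset names and paths below are placeholders, not the actual project:

```python
from clearml import Dataset

# New version on top of the existing ~30GB dataset (names/paths are placeholders)
parent = Dataset.get(dataset_name="my-dataset", dataset_project="my-project")
child = Dataset.create(
    dataset_name="my-dataset",
    dataset_project="my-project",
    parent_datasets=[parent.id],
)
child.add_files(path="/data/update")  # the ~20GB of new/changed files
child.upload(show_progress=True)      # hangs after the last chunk is sent
child.finalize()
```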

11 months ago
0 Hi! Trying to orchestrate a pipeline on multiple agents for the first time. I am setting up the pipeline from functions and I need the individual function steps to run from a cloned repository due to local imports. The pipeline task detects and clones the...

I know I can specify the repo manually in the add_function_step call, but I would like to keep the execution context from the parent pipeline task, including uncommitted changes etc.
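For reference, the explicit-repo workaround looks roughly like this; the repo URL, branch, and step function are placeholders:

```python
from clearml import PipelineController

def step_one(x: int) -> int:
    # local modules from the cloned repo would be imported here
    return x * 2

pipe = PipelineController(name="my-pipeline", project="my-project", version="1.0")

# Pinning the repo per step works, but the step then no longer inherits the
# parent pipeline task's execution context (e.g. uncommitted changes)
pipe.add_function_step(
    name="step_one",
    function=step_one,
    function_kwargs={"x": 1},
    function_return=["doubled"],
    repo="https://github.com/example/private-repo.git",  # placeholder
    repo_branch="main",
)
pipe.start()
```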

10 months ago
0 Hi everyone, I noticed a weird issue with pipeline tasks. I have a task in my project with configuration that includes several parameters of different types (strings, ints, bools). As soon as I use this task to create a pipeline task using...

To clarify, the parameters are typed correctly inside the pipeline task process itself, but they are logged as strings, so they need to be cast manually if I forward some parameters using get_parameter.
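To illustrate the manual casting; the parameter names below are hypothetical:

```python
from clearml import Task

task = Task.current_task()

# Forwarded parameters come back as strings, so cast them explicitly
# (parameter names here are hypothetical)
epochs = int(task.get_parameter("General/epochs"))
lr = float(task.get_parameter("General/learning_rate"))
# note: bool("False") is True in Python, so compare the string instead
debug = task.get_parameter("General/debug") == "True"
```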

6 months ago
0 Hi! Trying to orchestrate a pipeline on multiple agents for the first time. I am setting up the pipeline from functions and I need the individual function steps to run from a cloned repository due to local imports. The pipeline task detects and clones the...

Yes, they are. It's basically the same script as the pipeline_from_functions.py example on the clearml GitHub, but I need to import local modules from a private repository inside the steps.

10 months ago
0 Hi! Trying to orchestrate a pipeline on multiple agents for the first time. I am setting up the pipeline from functions and I need the individual function steps to run from a cloned repository due to local imports. The pipeline task detects and clones the...

Yes, I only have a single repository. The pipeline and the individual steps are implemented in the same folder, but while the controller task runs fine and the repo is cloned on its agent, the agents running the function steps only pull the single .py file.

10 months ago
0 Hello, are there any plans to add support for the pdm package manager? It's what we use since the poetry dependency solver is quite slow and it would be neat to have direct support in clearml-agents as well. Thanks!

@<1523701205467926528:profile|AgitatedDove14> I'm not really sure about a "drop-in replacement", but pdm does support importing poetry projects. It's also extensible with plug-ins for controlling the dependency resolution and install process, so you could probably find more use cases.

11 months ago
0 Hi, I'm experiencing some fairly slow uploads of a new dataset version. I'm running a local server and I'm uploading a ~20GB update to a ~30GB dataset consisting of a few hundred files, each up to several hundred MB. It seems that compressing and upload i...

It does feel like the server is struggling, since the web UI is also having trouble loading debug sample artifacts during the upload, but I'm not sure why that would be the case. The client console hangs after "uploading dataset changes", and I can see the fileserver.py process putting load on the server CPU, but I don't see any files being added or changed in the local fileserver folder. Is there a way to check what the fileserver is doing? I don't see anything suspicious in the logs.

11 months ago
0 Hello, are there any plans to add support for the pdm package manager? It's what we use since the poetry dependency solver is quite slow and it would be neat to have direct support in clearml-agents as well. Thanks!

Sure, you can check their GitHub or this somewhat objective comparison to other packaging tools. I personally like that it's more PEP-compliant, and it's definitely faster in my experience, especially with large packages like torch. It also allows using custom install scripts, but I don't have enough experience with other managers to really compare that feature. Definitely not yet as po...

11 months ago
0 Hi, I'm experiencing some fairly slow uploads of a new dataset version. I'm running a local server and I'm uploading a ~20GB update to a ~30GB dataset consisting of a few hundred files, each up to several hundred MB. It seems that compressing and upload i...

Huh. So it looks like this was an issue of spawning too many upload workers, which overwhelmed the fileserver limited to a single core...? When I limited max_workers in upload() on the client side, it went smoothly with no hanging. Funny thing is, I had no issues with this using sync_folder(), which I used for the original data upload, hence the perceived difference in performance despite similar file sizes.
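For anyone hitting the same thing, the working version is roughly the following; the dataset names, paths, and the value max_workers=4 are placeholders / arbitrary choices:

```python
from clearml import Dataset

# Re-run the update with a capped upload worker pool (names/paths are placeholders)
child = Dataset.create(
    dataset_name="my-dataset",
    dataset_project="my-project",
    parent_datasets=["<parent-dataset-id>"],
)
child.add_files(path="/data/update")
child.upload(max_workers=4)  # capping the worker count avoided the fileserver hang
child.finalize()
```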

11 months ago