AgitatedDove14
Moderator
49 Questions, 8094 Answers
Active since 10 January 2023
Last activity 10 months ago

0 Hi, I'm trying to set StorageManager to use our internal MinIO installation, but I ran into this issue with this testing code:

is an implementation of this kind interesting for you, or do you suggest forking?

You mean adding a config map storing a default trains.conf for the agent?

4 years ago
0 Hi. Question about dataset upload errors: when uploading a

Hi PanickyMoth78, an RC with a fix is out, let me know if it works (notice you can now set max_workers from the CLI or the Dataset functions): pip install clearml==1.8.1rc1
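For reference, a minimal sketch of setting max_workers through the Dataset functions (dataset name, project and path below are hypothetical):

from clearml import Dataset

# Create a dataset version and cap the number of parallel upload threads
ds = Dataset.create(dataset_name="my_dataset", dataset_project="examples")
ds.add_files("./data")      # local folder to version
ds.upload(max_workers=4)    # max_workers exposed as of clearml 1.8.1rc1
ds.finalize()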

2 years ago
0 Hi. Question about dataset upload errors: when uploading a

PanickyMoth78, quick update: the fix is already being tested, I'm hoping for an RC tomorrow 🙂

2 years ago
0 I am trying to use

No that's okay

4 years ago
0 I am trying to use

Yes, it's the JWT issue

4 years ago
0 Bug?

Yep, the automagic only kicks in with Task.init... The main difference, and the advantage of using a Dataset object, is that the underlying Task resides in a specific structure that is used when searching by project/name/version; other than that, it should just work
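As a sketch of that structure (names below are hypothetical), creating a Dataset and later finding it again by project/name could look like:

from clearml import Dataset

# Create: the underlying Task is stored in the dataset-specific structure
ds = Dataset.create(dataset_name="images", dataset_project="examples")
ds.add_files("./images")
ds.upload()
ds.finalize()

# Retrieve later, searching by project/name (a specific version can also be requested)
ds = Dataset.get(dataset_project="examples", dataset_name="images")
local_copy = ds.get_local_copy()  # cached, read-only copy of the files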

2 years ago
0 So, here's a question. Does ClearML automatically save everything necessary to continue training a PyTorch language model? Specifically, I've been looking at the checkpoint folders created when I'm training a Hugging Face RobertaForMaskedLM. I checked what

Could I use "register artifact"

I think this is somewhat deprecated and we should probably replace it with something similar to what you mentioned (i.e. watch a file for changes).
Right now the easiest way would be to manually upload the trainer_state.json on every checkpoint:
Task.current_task().upload_artifact(name='state', artifact_object='trainer_state.json')
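One way to automate that (a sketch, assuming a Hugging Face Trainer and the standard checkpoint-&lt;step&gt; folder layout) is a TrainerCallback that re-uploads the file after every save:

import os
from clearml import Task
from transformers import TrainerCallback

class UploadTrainerState(TrainerCallback):
    # Called by the Trainer right after each checkpoint is written
    def on_save(self, args, state, control, **kwargs):
        ckpt = os.path.join(args.output_dir,
                            f"checkpoint-{state.global_step}",
                            "trainer_state.json")
        if os.path.exists(ckpt):
            Task.current_task().upload_artifact(name="state", artifact_object=ckpt)

Register it with trainer.add_callback(UploadTrainerState()) before calling trainer.train().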

3 years ago
0 Bug?

I think your use case is the original idea behind the "use_current_task" option; it was basically designed to connect the code that creates the Dataset together with the dataset itself.
I think the only caveat in the current implementation is that it should "move" the current Task into the dataset project / set the name. wdyt?
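A minimal sketch of that intent (project/name below are hypothetical): the code that builds the data and the Dataset share one underlying Task:

from clearml import Task, Dataset

task = Task.init(project_name="examples", task_name="build_dataset")
# Reuse the running Task as the dataset's underlying Task
ds = Dataset.create(dataset_name="build_dataset", dataset_project="examples",
                    use_current_task=True)
ds.add_files("./data")
ds.upload()
ds.finalize()

Passing a dataset_name that matches the task name sidesteps the naming caveat discussed later in this thread.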

2 years ago
0 Hi. I am experimenting with

PanickyMoth78

LockException: [Errno 11] Resource temporarily unavailable

I'm not sure I understand how you got to this error (obviously creating datasets and getting them back works); what is unique in the setup/flow itself?

2 years ago
0 Hello all, I'm trying to adapt ClearML to my workflow. I installed a server on my server, with workers attached to it. I'm trying to execute a task from my local machine within one of my workers, using Docker mode and a custom image. I also have a local

However, this one should be a feature to work on, and should be fairly easy to implement.

Feel free to add it as a GitHub issue 🙂
The main challenge is understanding what needs to be added as "uncommitted changes"

2 years ago
0 Another quick question about fileservers and clearml-agent: clearml-agent seems to ignore the output destination set in the task config

Debug samples can only be controlled via api.files_server (or programmatically)
Models/Artifacts: see above

This has no effect. I am not able to change the files_server, e.g. I cannot change from

You are Not changing the files_server, just where your Task uploads Models/Artifacts; these are two different things (and again, this only applies to Artifacts/Models)
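To make the distinction concrete, a sketch (bucket URLs are hypothetical): output_uri on Task.init controls Models/Artifacts only, while debug samples follow api.files_server from clearml.conf, which can also be overridden in code:

from clearml import Task, Logger

task = Task.init(
    project_name="examples", task_name="demo",
    output_uri="s3://my-bucket/models",  # destination for Models/Artifacts only
)
# Debug samples follow api.files_server; programmatically the
# upload destination can be overridden like this:
Logger.current_logger().set_default_upload_destination("s3://my-bucket/debug")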

one year ago
0 Bug?

I just think that the create function should expect dataset_name to be None in the case of use_current_task=True (or allow the dataset name to differ from the task name)

I think you are correct, at least we should output a warning that it is ignored ... I'll make sure we do 🙂

2 years ago
0 Can one compare experiments/tasks from different projects? Edit: I mean, I can manually navigate to some

I meant even just a link to a blank comparison and one can then add the experiments from that view

Just making sure you are aware: once you are in a comparison you can always add Tasks (any Task).
Notice you can press "Add experiments", then select any experiment (including from all projects, via the filters).
Notice you need to remove all filters (the red x on the filter icon, right side).

3 years ago
0 Bug?

I have a task where I create a dataset, but I also create a set of matplotlib figures, some numeric statistics and a pandas table that describe the data, which I wish to have associated with the dataset and viewable from the ClearML web page for the dataset.

Oh sure, use https://clear.ml/docs/latest/docs/references/sdk/dataset#get_logger ; they will be visible on the Dataset page for the version in question
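A short sketch of what that could look like (names and data below are hypothetical): report a matplotlib figure and a pandas table through the dataset's logger before finalizing:

import matplotlib.pyplot as plt
import pandas as pd
from clearml import Dataset

ds = Dataset.create(dataset_name="stats_demo", dataset_project="examples")
ds.add_files("./data")

# Attach a figure and a summary table to this dataset version
fig = plt.figure()
plt.hist([1, 2, 2, 3, 3, 3])
ds.get_logger().report_matplotlib_figure(
    title="distribution", series="hist", figure=fig, iteration=0)
ds.get_logger().report_table(
    title="summary", series="stats", iteration=0,
    table_plot=pd.DataFrame({"mean": [2.33], "count": [6]}))

ds.upload()
ds.finalize()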

2 years ago
0 Hi, I would like to check what would be the recommended hardware specs for the server hosting ClearML Server. I had one configured with 32 CPU cores, 64GB RAM, and I noticed that if we have a surge in remote task creation, the following delays occur.

If the only issue is this line:
task.execute_remotely(..., exit_process=True)
It has to finish the static analysis of the entire repository (which usually happens in the background, but now we have to wait for it). If the repo is large this could actually take 20 seconds (depending on the CPU/drive of the machine itself)
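For context, a minimal sketch of the pattern being discussed (project and queue names are hypothetical):

from clearml import Task

task = Task.init(project_name="examples", task_name="remote_run")
# exit_process=True terminates the local run once the task is enqueued,
# so the static code analysis must complete first (it can no longer
# finish in the background); this is the wait described above.
task.execute_remotely(queue_name="default", exit_process=True)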

3 years ago
0 What is the method for package exploration when using conda? Agent is set to 'conda' mode. We upload a task from a local conda env that (obviously) has some pip packages as well. When we enqueue the task to run remotely, not all conda packages are instal

Let me try to add some color to this package analysis process.
Basically, clearml will try to statically analyze the code (i.e. look for import/from statements).
Then it will list the packages in pip requirements.txt format under installed packages.
When running inside a conda environment, it will check which packages were installed via "conda install" (instead of pip install) and mark them internally. This ensures that when the clearml-agent is running with the conda package manager, it "knows" which...
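If the static analysis misses a package, one possible workaround (a sketch; the package name and version below are placeholders) is to list it explicitly before Task.init so it appears under installed packages:

from clearml import Task

# Must be called before Task.init() to affect the recorded requirements
Task.add_requirements("scikit-learn", "1.3.0")
task = Task.init(project_name="examples", task_name="conda_run")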

3 years ago