AgitatedDove14
Moderator · 48 Questions, 8049 Answers
Active since 10 January 2023 · Last activity 6 months ago
Reputation: 0 · Badges: 25 × Eureka!
0 Hi, I Would Like To Bring Awareness

I am not sure what switching back will solve; here the wheel should have been correct, it's just the architecture of the card that is incompatible.

So I tested the "old" code that did the parsing and matching, and it did resolve to the correct wheel (i.e. it found that there is no 117, only 115, and installed that one).
I think we should switch back, and have a configuration option to control which mechanism the agent uses, wdyt?

one year ago
0 Hi, Anyone Seen This Issue?

MelancholyElk85 notice there is the pipeline controller queue (i.e. which agent will run the logic of the pipeline), and the default queue for the pipeline steps (i.e. the actual steps of the pipeline).
The default queue for the pipeline logic itself is services. You can change it (pipeline.start(..., queue='another_q')).
Makes sense?
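
As a rough sketch of what that could look like (the project, pipeline and queue names here are placeholders, not taken from the thread):

from clearml import PipelineController

pipe = PipelineController(name="my_pipeline", project="examples", version="1.0")
# ... pipe.add_step(...) calls define the actual pipeline steps ...

# run the pipeline *logic* on a specific queue instead of the default "services" queue
pipe.start(queue="another_q")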

3 years ago
0 Hi, How Can I Remove A Tag From A Task Via Code In A Non-Barbaric Way?

SmarmySeaurchin8
updated_tags = task.tags          # get the current list of tags
updated_tags.remove(tag)          # drop the unwanted tag
task.tags = updated_tags          # assign back so the change is stored on the task

3 years ago
0 I'm On The Machine With Clearml Server Hosted. Is There Any Way To See Datasets Uploaded To Clearml Data Without Downloading Them Using Clearml Data?

Is there any way to see datasets uploaded to ClearML Data without downloading them using ClearML Data?

Hi VexedCat68
Currently, when you create datasets with clearml-data it has to repackage your files, i.e. upload them. That said, we have received numerous requests on "registering data", and we are looking into it.
Here are the main technical hurdles we are facing, and I would love to get your perspective:
If the data is not available locally, we cannot calculate the hash of the conten...

2 years ago
0 Hey Community! I Have A Question Regarding The Optuna Optimizer With Clearml. I'm Using A Config Yaml File That I'm Connecting Via

Hi @<1547390438648844288:profile|ScaryJellyfish75>

These hyperparameters are now in the "Args" section of my ClearML task

Sure, that would probably mean:

UniformParameterRange(
    "Args/training/optimizer/lr",
    min_value=0.00025,
    max_value=0.01,
    step_size=0.00025,
),

assuming your Task has training/optimizer/lr in its Args section (under the configuration tab), makes sense?
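
For context, a minimal sketch of how such a range could be plugged into an optimizer (the base task id, metric names and queue are assumptions, not taken from the thread):

from clearml import Task
from clearml.automation import HyperParameterOptimizer, UniformParameterRange
from clearml.automation.optuna import OptimizerOptuna

task = Task.init(project_name="examples", task_name="hpo controller",
                 task_type=Task.TaskTypes.optimization)

optimizer = HyperParameterOptimizer(
    base_task_id="<your_training_task_id>",   # placeholder: the task that has the Args section
    hyper_parameters=[
        UniformParameterRange("Args/training/optimizer/lr",
                              min_value=0.00025, max_value=0.01, step_size=0.00025),
    ],
    objective_metric_title="validation",      # assumed metric title/series
    objective_metric_series="loss",
    objective_metric_sign="min",
    optimizer_class=OptimizerOptuna,
    execution_queue="default",                # assumed queue for the trial tasks
)
optimizer.start()
optimizer.wait()
optimizer.stop()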

one year ago
0 "Clearml-Data Sync --Folder ." Doesn'T Work

However, once I extract the zips (or download the dataset through Python API or CLI) not all the files are there.

and all the files are registered in the metadata? Could you add --verbose to the sync command to see what it is doing?

"clearml-data add --folder ./*" seems to fix this issue though it doesn't preserve my directory structure

This is also odd, it should Not flatten the folder structure. What is your OS / Python / clearml version?
Is this reproducible? If so, how ...

3 months ago
0 Any Chance Storagemanager Could Re-Download Files Only If Their Size Is Different From File In Cache (As An Option)?

Yes, that sounds like a good start. DilapidatedDucks58 can you open a GitHub issue with the feature request?
I want to make sure we do not forget.

2 years ago
0 Hey, I'M Running A Pipeline, And 1 Stage Passed - But The Next One Failed. I Fixed The Bug For The Second One - Is There Any Way To Retry The Pipeline From The Failure?

The pipeline stores the state of its previous run, specifically the executed steps.
In our case the executed step was reset (I assume), so it cannot find the output model you are referring to, hence the crash.
CleanPigeon16 makes sense?

3 years ago
0 Hey, I'M Running A Pipeline, And 1 Stage Passed - But The Next One Failed. I Fixed The Bug For The Second One - Is There Any Way To Retry The Pipeline From The Failure?

CleanPigeon16 Coming very soon, we are adding a few features for the pipeline, and this one will also be included :)

3 years ago
0 Hey All, We Are Trying To Clone A Task That Uses Custom Pip Installed Packages And Run It Via An Agent. When Running Locally, We Simply “

@<1523701079223570432:profile|ReassuredOwl55> did you try adding it manually?

./path/to/package

You can also do that from code:

from clearml import Task

Task.add_requirements("./path/to/package")
# notice you need to call Task.add_requirements before Task.init
task = Task.init(...)

one year ago
0 Hi I Have A Most Probably A Beginner Question About Loading The Data In Pycharm And Later On In Google Colab From A Dataset From Clearml. I Used From Page:

Hi @<1651395720067944448:profile|GiddyHedgehong81>

However, I need the data.yaml file for a yolov8 (object detection with around 20k jpgs and .txt files):

Just add the entire folder with your files to a dataset, then get it in your code.
Add files (you can do that from the CLI, for example):

clearml-data add --files my_folder_with_files

Then from code: [Non...
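
As a rough sketch of the "from code" part (dataset name and project are placeholders, not from the thread):

from clearml import Dataset

# fetch a local, cached copy of the dataset you created with clearml-data
dataset = Dataset.get(dataset_project="my_project", dataset_name="my_yolov8_data")
local_path = dataset.get_local_copy()
print(local_path)  # folder containing the images, labels and data.yaml you added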

9 months ago
0 Hi I Have A Most Probably A Beginner Question About Loading The Data In Pycharm And Later On In Google Colab From A Dataset From Clearml. I Used From Page:

@<1651395720067944448:profile|GiddyHedgehong81> just to be clear, Dataset.get_local_copy returns a path to your files.
You have to Manually add the additional path to the specific files you need to use. It does Not know that in advance.
That was the initial issue you had, and I assume it is the same one here. Does that make sense?
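
A minimal sketch of what "manually add the additional path" could look like (the file name and dataset identifiers are assumptions):

import os
from clearml import Dataset

local_path = Dataset.get(dataset_project="my_project", dataset_name="my_yolov8_data").get_local_copy()
# get_local_copy() only returns the root folder; point to the specific file yourself
data_yaml = os.path.join(local_path, "data.yaml")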

9 months ago
0 "Clearml-Data Sync --Folder ." Doesn'T Work

Hi @<1631102016807768064:profile|ZanySealion18>
sorry missed that one

The cache doesn't work, it attempts to download the dataset every time.

Just making sure: does the dataset itself contain all the files?

Once I used clearml-data add --folder * CLI everything works correctly (though all files recursively ended up in the root, I had luck all were named differently).

Not sure I follow here, is the problem the creation of the dataset or fetching it? Is this a single version or multi...

3 months ago
0 I Have A Local Folder A, And A Dataset B. A:

so moving b in to a won’t work if some subfolders are already there

I thought that if they are already there you would merge / overwrite, isn't that what you need?
a/b/c/2.txt seems like the result of moving b from dataset B into folder b in Dataset A, what am I missing?
(My assumption is that you have both datasets locally on the same machine and that you can just copy the files from b of Dataset B into the b folder of Dataset A)

2 years ago
0 Hi. Help

Hi PanickyMoth78

I had several pipeline components getting it and uploading files to it concurrently.

Should not be a problem

I've attached its log file which only mentions skipping one file (a warning)

So what exactly is the error you are getting?

2 years ago
0 Hi. Help

at least you did not change the permissions of your K8s etcd folder 😄

2 years ago
0 Hi. Help

No worries πŸ™‚

2 years ago
0 Fatal: Could Not Read From Remote Repository. Please Make Sure You Have The Correct Access Rights And The Repository Exists.

in order to work with ssh cloning, one has to manually install openssh-client in the docker image, looks like that

Correct, you have to have SSH inside the container so that git can use it.
You can always install it with the following setting inside your agent's clearml.conf:
extra_docker_shell_script: ["apt-get install -y openssh-client", ]
https://github.com/allegroai/clearml-agent/blob/73625bf00fc7b4506554c1df9abd393b49b2a8ed/docs/clearml.conf#L145

2 years ago
0 Happy Friday Everyone

Hi RobustRat47
The easiest way to reproduce the entire environment on your local machine:
clearml-agent build --id <task_id> --target ~/debug-full-env/
This will install an entire venv, including the code, and apply the git changes.
You can also create a container with everything:
https://clear.ml/docs/latest/docs/clearml_agent#task-container

2 years ago
0 Any Chance Storagemanager Could Re-Download Files Only If Their Size Is Different From File In Cache (As An Option)?

any chance StorageManager could re-download files only if their size is different from file in cache (as an option)?

I think there is a force argument to force the download.
I think the main issue is getting the size from the different backends (i.e. s3 / https / etc.).
Maybe we should add it as a GitHub feature request issue?
The main limitation is that the driver "list()" does not return the file size.
For example, it might be an issue with the default http files-server.
wdyt?
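
For reference, a minimal sketch of forcing a re-download with the current API (assuming the force_download flag is available in your clearml version; the URL is a placeholder):

from clearml import StorageManager

# always re-fetch, ignoring whatever is already in the local cache
local_copy = StorageManager.get_local_copy(
    remote_url="s3://my-bucket/models/model.pt",
    force_download=True,
)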

2 years ago
0 Hi All, I'M Starting To Use Clearml, For Experiment Management On This Step. I'M Using Voxel51 (

Correct πŸ™‚
btw: my_dict_with_conf_for_data can be any object, not just dict. It will list all the properties of the object (as long as they do not start with _)
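
A small sketch of connecting a plain object instead of a dict (the class and field names are made up for illustration):

from clearml import Task

class DataConfig:
    def __init__(self):
        self.batch_size = 32           # logged
        self.augment = True            # logged
        self._cache_dir = "/tmp/data"  # starts with "_", so it is skipped

task = Task.init(project_name="examples", task_name="connect object")
config = DataConfig()
task.connect(config)  # properties show up under the task's configuration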

2 years ago
0 Hi Everyone!

You mean one machine with multiple clearml-agents ?
(worker is a unique ID of an agent, so you cannot have two agents with the exact same worker name)
Or do you mean two agents pulling from the same queue ? (that is supported)

one year ago
0 Heyo, After Building Some Custom Pipelining Functionality On Mlflow, I Started Looking For Better Software That Can Beat What I Created - With A Similar Amount Of Effort. Problem Has Been That Up Till Now, All I Found Could Make Things Way Better But Al

Thanks ContemplativePuppy11 !

How would you pass data/args from one step of the pipeline to another?
Or are you saying the pipeline class itself stores all the components?

one year ago
0 Our Mac Users Are Having Some Issues. They Have Their Respective ~/Clearml.Conf, And Yet They Get: Clearml 1.1.5

Are they expanded in the "api_server"? (I verified on a Linux machine, same error; the env in the api_server is not being resolved)

2 years ago
0 Hello! Thank You All For Your Work! I Have A Question (Which Is Probably Not Clearml Related At All). I Am Using Clearml-Agent Running In Docker Mode On Several Machines With Gpu In Our Local Network And Get Different Behaviour Depending On How I Logged I

BurlyRaccoon64 by default, if .ssh exists in the host user folder it should mount it to the container (actually mount a copy of it). Do you have logs of two tasks from two different machines, one failing and one passing? Because this is quite odd (assuming the setup itself is identical)

2 years ago