AgitatedDove14
Moderator
48 Questions, 8049 Answers
Active since 10 January 2023
Last activity 5 months ago

Reputation: 0
Badges (1): 25 × Eureka!
0 Votes 0 Answers 1K Views
https://m.facebook.com/story.php?story_fbid=2484620658505570&id=1620822758218702&refid=52&tn=-R
4 years ago

0 Votes 0 Answers 969 Views
Gals, Guys & :robot_face: If you want to get some inspiration on building DL Continuous Integration pipelines, I suggest this post (obviously built on top of...
4 years ago

0 Votes 0 Answers 968 Views
4 years ago

0 Votes 0 Answers 1K Views
apparently everyone can ...
4 years ago

0 Votes 3 Answers 972 Views
Hi, v0.15 is out, 🎉 🚀 Your feedback had a major influence on the features we added 🙂 thank you! A selected list of features: Column resizing / ordering /...
4 years ago

0 Votes 0 Answers 1K Views
YummyWhale40 awesome thanks!
4 years ago

0 Votes 0 Answers 941 Views
4 years ago

0 Votes 0 Answers 1K Views
YEY!!!! Download as CSV 🤯
2 years ago

0 Votes 0 Answers 1K Views
4 years ago

0 Votes 0 Answers 987 Views
3 years ago

0 Votes 0 Answers 1K Views
Is it a one time thing? or recurring?
4 years ago

0 Votes 2 Answers 949 Views
Hi! trains 0.16.2 is finally out with the new pipelines interface! Check out the new example https://github.com/allegroai/trains/blob/master/examples/pipeli...
3 years ago

0 Votes 0 Answers 966 Views
important notice: it seems Nvidia broke some of their PPA's security 😕, causing apt-get updates to fail inside containers. This in turn will cause clearml...
2 years ago

0 Votes 0 Answers 1K Views
I would guess connectivity issues, the TLS is probably python inaccurate response (I mean in a way, it is also a TLS error, but I would imagine this has more...
4 years ago

0 Votes 1 Answer 1K Views
This is usually due to enterprise level issued https certificates not part of the local installation (basically any python generated SSL request will fail)
4 years ago

0 Votes 0 Answers 1K Views
New RC for trains-agent is out: pip install trains-agent==0.13.2rc1
4 years ago

0 Votes 3 Answers 369 Views
@<1523703325881536512:profile|ConvolutedSealion94> these are xgboost internal metrics that are automatically picked up by clearml
one year ago

0 Votes 0 Answers 1K Views
you set it 🙂
4 years ago
0 I've Been Trying To Use The

Hi @<1610808279263350784:profile|FriendlyShrimp96>

Is there a way to get a list of variants given a metric, or even just a full list of metrics and variants for a given task id?

Try this:

from clearml.backend_api.session.client import APIClient

# create a client using the locally configured ClearML credentials
c = APIClient()
# fetch the metrics/variants reported by the task for the given event type
metrics = c.events.get_task_metrics(tasks=["TASK_ID_HERE"], event_type="training_debug_image")
print(metrics)

I think API ...

one year ago
0 Hi All, I Use .get_local_copy() To Get A Local Copy For Each Of My Artifacts Logged In A Task. I Currently Have 160 Files Which I Want To Get A Local Copy. Each Artifact Is A Numpy Array (.npz file) Uploaded Using .upload_artifact() Before. When I Run .Ge

Hi ScatteredClams84

Is there any parameter that adjusts the "number of files that can be stored in the cache"? I am using clearml python version 1.0.3 to upload artifacts and get the artifacts back from a task. (edited)

Yes you are correct, the default value is 100 entries.
You can configure it in the clearml.conf, just add:
sdk.storage.cache.default_cache_manager_size = 1000
or from code:
from clearml.storage.cache import CacheManager
CacheManager.get_cache_manager(cache_file_...
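For reference, a minimal sketch of the in-code variant. I'm assuming the truncated call above refers to a cache_file_limit argument, so please verify the argument name against your clearml version:

from clearml.storage.cache import CacheManager

# assumed argument name (cache_file_limit): raise the local artifact cache
# to 1000 entries for the default cache context
CacheManager.get_cache_manager(cache_file_limit=1000)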

3 years ago
0 Hello, I Am Trying To Run Some Algorithm In My Docker Container With Clearml Task . But The Algorithm Uses Ros, So I Need Somehow To Setup Environment Before Run It And Launch

LazyFish41 just making sure, you built a container from the docker file and used it as the base docker image for the Task, is that correct?
Also notice the clearml-agent will not change the entry point of the docker, meaning if the entry point does not end with plain bash, it will not actually run anything
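If it helps, a minimal sketch of pointing the Task at that pre-built image. The project, task, image and queue names are hypothetical, and this assumes a clearml SDK version where set_base_docker accepts keyword arguments:

from clearml import Task

task = Task.init(project_name="ros-project", task_name="ros-algorithm")  # hypothetical names
# use the container you built from the Dockerfile as the base image for the agent run
task.set_base_docker(docker_image="my-registry/ros-base:latest")  # hypothetical image
task.execute_remotely(queue_name="default")  # hypothetical queue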

3 years ago
0 On A Separate Note, That Embed Seems To Be Slightly Off With Regards To Where The Link Is Actually Pointing To, Which Is

that embed seems to be slightly off with regards to where the link is actually pointing to

I think this is the Slack preview... 😞

10 days ago
0 Hey, Is There A Shortcut On The Dataset Sdk To Directly Get The Latest Version Of A Dataset ?

Hi FierceHamster54
Sure, just do:
from clearml import Dataset
dataset = Dataset.get(dataset_project="project", dataset_name="name")
This will fetch the latest version by default

one year ago
0 I Have 5 Unarchived Pipeline Runs That Were Defined With This Decorator:
  • Maybe we should add an option, archive components as well ...
2 years ago
0 Hi, We’re Deploying Clearml On The EKS And Have An Issue With Authenticating The Server With The S3 Bucket. The Connection To S3 Bucket Is Not Working. Our Current Diagnosis: Clearml Internally Uses AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. But We A

The pod has an annotation with a AWS role which has write access to the s3 bucket.

So assuming the boto environment variables are configured to use the IAM role, it should be transparent, no? (I can't remember what the exact envs are, but Google will probably solve it 🙂)

AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN. I was expecting clearml to pick them by default from the environment.

Yes it should, the OS env will always override the configuration file sect...
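Purely as an illustration, a tiny sketch for checking which of those variables the process inside the pod actually sees (note that a role injected via IRSA may expose role/token variables instead of static keys):

import os

# the standard boto3 credential variables; if the credentials are injected correctly
# these (or the role-based metadata/token mechanism) should be visible to the process
for key in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_SESSION_TOKEN"):
    print(key, "is set" if os.environ.get(key) else "is NOT set")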

2 years ago
0 Hi, I Noted That Clearml-Serving Does Not Support Spacy Models Out Of The Box And That Clearml-Serving Only Supports Following;

Hi SubstantialElk6

noted that clearml-serving does not support Spacy models out of the box and

So this is a good point.

To add any missing packages to the preprocessing docker you can just add them in the following environment variable here: https://github.com/allegroai/clearml-serving/blob/d15bfcade54c7bdd8f3765408adc480d5ceb4b45/docker/docker-compose.yml#L83
EXTRA_PYTHON_PACKAGES="spacy>1"
Regarding a custom engine, basically this is supported with --engine custom
you c...

2 years ago
0 Hi Everyone, I Have Questions Related To Clearml-Serving.

I have timeseries dataset with dimension 1,60,1 which the first dimension is number of data, the second one is timestep

I think it should be --input-size 1 60 if the last dimension is the batch size?
(BTW: this goes directly to Triton configuration, it is the information Triton needs in order to run the model itself)

2 years ago
0 Why Does My Task Execution Freeze After Pip Installation (Running Agent In Foreground Mode)? The Task Is:

AdventurousButterfly15 this one is quite self contained:
https://github.com/allegroai/clearml/blob/master/examples/reporting/scalar_reporting.py

So I guess pip install finished working
But the task is evidently not being executed.

This is very odd ... you can run the agent with debugging with --debug --foreground to see all the outputs and logs

one year ago
0 Hello Everyone I’m New To Clearml And I’ve Faced One Issue With Models Auto Logging. Tried To Google An Answer Online, But Failed.. Generally I Use The Most Simple

Hi GreasyRaven35
You should set the output_uri in Task.init, it will auto upload the model and register the remote location URL
task = Task.init(..., output_uri=True)
You can also specify a target bucket, if you configured credentials (e.g. output_uri="s3://bucket")
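A minimal sketch of both variants (the project, task and bucket names are hypothetical):

from clearml import Task

# upload model snapshots to the default files server
task = Task.init(project_name="examples", task_name="auto-log models", output_uri=True)

# or, with credentials configured in clearml.conf, upload them to a specific bucket:
# task = Task.init(project_name="examples", task_name="auto-log models", output_uri="s3://my-bucket/models")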

2 years ago
0 Okay, 3rd Question In A Row Here, You Guys Are So Helpful Here!! Okay So Is There Some Kind Of Script That Launches When Say You "Publish" An Experiment So That You Can Get The

so that you can get the latest artifacts of that experiment

what do you mean by "the latest artifacts"? do you have multiple artifacts on the same Task or is it the latest Task holding a specific artifact?

2 years ago
0 Regarding The “Classic” Datasets (Not Hyper Datasets): Is There An Option To Do Something Equivalent To Dvc’S “

you can run md5 on the file as stored in the remote storage (NFS or S3)

S3 is implementation specific (i.e. MinIO, Weka, Wasabi, etc. might not support it) and I'm actually not sure regarding NFS (I mean you can run it, but it actually means you are reading the data; that said, NFS by definition I'm assuming is relatively fast access)
wdyt?

2 years ago
0 Hello Everyone! The Question About Dataset.squash(). The Squash Operation Copies All The Data And Is No Longer Linked To Previous Commits? I Thought This Operation Is Like Git Squash But It Seems To Me That Clearml Dataset.squash() Create Just A Copy Of S

Hi

The Squash operation copies all the data and is no longer linked to previous commits?

Yes, basically the idea is that if you have a data version that relies on many parents that need to be merged, the squash will create a merged copy and push it all as a single version, and then yes, the parent versions are no longer needed

I thought this operation is like git squash but it seems to me

yeah... we did not want to actually delete the parents because unlike git, the operation is done ...
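A minimal sketch of the squash call itself (the dataset name and version IDs below are hypothetical):

from clearml import Dataset

# merge multiple parent versions into a single standalone copy;
# the parent versions are left in place, they are simply no longer needed
squashed = Dataset.squash(
    dataset_name="my-dataset-squashed",
    dataset_ids=["parent_version_id_1", "parent_version_id_2"],
)
print(squashed.id)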

6 months ago
0 Hi! I Deployed Clearml With Pre-Built Docker Images. Everything Works Perfectly Until I Put Another Nginx On Top Of The Whole Thing So That I Can Expose It To Internet Through HTTPS. I Turned On The Sub Path In docker-compose.yml, So The WebUI Is Exposed

Hi @<1684735407637401600:profile|WonderfulJellyfish65>

BTW, the training script connects to apiserver via the internal IP address

That is a big issue, because as you noticed the links to data generated by the code will have the internal IP ...

You basically need every component to use the same address (url)

6 months ago
0 Hi, What Happens Exactly When I Execute The Following Command:

NVIDIA_VISIBLE_DEVICES=0,1
Basically it is used "as is" and the Nvidia drivers do the rest
Same goes for all or 0-3 etc.

4 years ago
0 I Would Like To Understand The Limitations Of

My question is what happens if I launch in parallel multiple doit commands that create new Tasks.

Should work out of the box.

I would like to confirm that current_task ...

Correct.

3 years ago
0 Hey! I Would Like To Connect To Same Task From Multiple Consumers And Upload Debug Image. Is It Possible? It Seems Like I Can Connect To The Task, Get The Logger, But Nothing Is Uploaded.

Should work out of the box, as long as the task was started. You can forcefully start the task with:
task.mark_started()
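A minimal sketch of a second consumer attaching to the same task and reporting a debug image (the task ID and image path are hypothetical):

from clearml import Task

# attach to the already-created task from another process
task = Task.get_task(task_id="TASK_ID_HERE")
task.mark_started(force=True)  # make sure the task is in a running state
task.get_logger().report_image(
    title="debug samples", series="consumer-2", iteration=0,
    local_path="/tmp/sample.png",  # hypothetical image path
)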

4 years ago
0 Greetings Everyone, In The Course Of My Work, I Utilize A Particular Library That Necessitates More Than Just A Simple Clone And Dependency Installation Procedure. It Also Requires The Cloning Of An Additional Repository, Along With Its Installation, And

Oh if this is the case, then by all means push it into your Task's docker_setup_bash_script
It does not seem to have to be done after the git clone; the only part that I can see is setting the PYTHONPATH to the additional repo you are pulling, and that should work.
The main hurdle might be passing credentials to git, but if you are using SSH it should be transparent
wdyt?
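A rough sketch of what that could look like (the repo URL, paths and image are hypothetical, and this assumes a clearml SDK version where set_base_docker accepts a docker_setup_bash_script argument):

from clearml import Task

task = Task.init(project_name="my-project", task_name="needs-extra-repo")  # hypothetical names
task.set_base_docker(
    docker_image="python:3.10",  # hypothetical base image
    # commands executed inside the container before the task environment is set up
    docker_setup_bash_script=[
        "git clone git@github.com:org/extra-repo.git /opt/extra-repo",  # hypothetical repo
        "pip install -r /opt/extra-repo/requirements.txt",
        # assumes the setup script shares the shell that later launches the task
        "export PYTHONPATH=/opt/extra-repo:$PYTHONPATH",
    ],
)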

one year ago