AgitatedDove14
Moderator
48 Questions, 8048 Answers
  Active since 10 January 2023
  Last activity 5 months ago

0 I Am Running Trains==0.16.4, Python==3.7.5, And Notice That The "Log" Page Sometimes Didn't Capture The Console Log From My Program. Is This A Known Issue? Has Anyone Experienced Similar Behavior?

This works.
great!

So it is still in master and should be included in 1.0.5?

correct, RC will be released soon with this fix included

3 years ago
one year ago
0 Hi. Question About Dataset Upload Errors: When Uploading A

Hi PanickyMoth78
Yes, I think you are correct, this looks like GS throttling your connection. You can control the number of concurrent uploads with max_workers=1
https://github.com/allegroai/clearml/blob/cf7361e134554f4effd939ca67e8ecb2345bebff/clearml/datasets/dataset.py#L604
Let me know if it works
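For reference, a minimal sketch of limiting upload concurrency (assuming the max_workers argument on Dataset.upload() available in recent clearml SDK versions; dataset, project, and bucket names are placeholders):

    from clearml import Dataset

    # placeholders: dataset/project names and the GS bucket are illustrative
    ds = Dataset.create(dataset_name="my_dataset", dataset_project="my_project")
    ds.add_files(path="./data")
    # max_workers=1 limits concurrent uploads, which helps when GS throttles the connection
    ds.upload(output_url="gs://my-bucket/datasets", max_workers=1)
    ds.finalize()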

one year ago
0 Is There Any Way To Get Just One Dataset Folder Of A Dataset? E.G. Only "Train" Or Only "Dev"?

Lately I've heard of groups that do slices of datasets for distributed training, or who "stream" data.

Hmm, so maybe a glob-like parameter, e.g. get_local_copy(select_filter='subfolder/*') ?

3 years ago
0 Hello! When Trying To Use Clearml Datasets With Google Cloud Storage With The Authorized User Credentials It Will Fail And Say Some Fields Are Missing From The Json. This Isn't An Issue If The User Is Using A Service Account Json Key, Is A Service Account

Hi ShortElephant92

This isn't an issue if the user is using a Service Account JSON Key,

Are you saying that when you use the GS Python SDK directly it works?

For context, the Google Cloud Storage SDK allows authorized user credentials.

ClearML actually uses the Google Python SDK; the JSON is just a way to pass the credentials to the Google SDK. I'm not sure it points to "service account"? Where did that requirement come from?
is it from here ` Service account info was n...

one year ago
0 Hi All, I Have A Broad Question On How A

OutrageousGrasshopper93
tensorflow-gpu is not needed, it will convert tensorflow to tensorflow-gpu based on the detected CUDA version (you can see it in the summary configuration when the experiment spins up inside the docker)

How can I set the base python version for the newly created conda env?

You mean inside the docker ?

3 years ago
0 Hello Community, I Had A Query Regarding Clearml-Data, Can The Dataset Be Queried Against Some Metadata Using UI And/Or CLI?

HarebrainedBear62 this is what I have.
clearml-data will store all the files for you, and version the entire thing, making it a breeze to abstract the dataset from the code. Querying data is available using Apache Drill (though currently it is still not built into the platform, but we are planning to get there soon). Since this is image-based data/meta-data, I know the paid tier of ClearML has an additional dedicated data management solution specifically for images, with full ability to query m...

3 years ago
0 Hi All, I Have An Issue With The Way Hyper Parameters Are Logged Under Configuration, The Values That Are Stored Seem To Add Unnecessary Escape Characters To The Original Values.. Is It A Known Issue? Is There A Way To Change It? Thanks

Hmm DepressedChimpanzee34 my bad, it seems the loading is done via a YAML loader, but the dumping is straightforward str casting...
https://github.com/allegroai/clearml/blob/6e6271fb91f2aeb2aa7a13c6d07d4e635baaa670/clearml/backend_interface/task/task.py#L934
What would you expect to get? (BTW, "value\blah" will not give you the text you expect in Python, since \b inside a regular string literal is an escape sequence; it should be written "value\\blah", which translates into the text value\blah)
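For illustration, this is plain Python string-escaping behavior (not ClearML-specific):

    # "\b" inside a normal string literal is the backspace escape, not backslash + 'b'
    s1 = "value\blah"    # contains a backspace control character
    s2 = "value\\blah"   # escaped backslash -> the literal text value\blah
    s3 = r"value\blah"   # raw string, same text as s2
    assert s2 == s3
    print(s2)            # prints: value\blah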

3 years ago
0 Where Is The Stdout Of

Usually in the /tmp folder under a temp filename (it is generated automatically when the process is spun up)
In case of the services, this will be inside the docker itself

3 years ago
0 Is There A Way To Access A Dataframe Logged Using report_table From A Task Instance Instantiated Using Task.get_task(id='.....')? I Have: t = Task.get_task(id='....') And I Am Looking For Something Along The Lines Of: df = t.get_table('table name')

You should have a download button when you hover over the table, I guess that would be the easiest.
If needed I can send SDK code, but unfortunately there is no single call for that
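For reference, a rough sketch of pulling a reported table back through the SDK. This assumes Task.get_reported_plots() and that report_table stores the table as a plotly payload under 'plot_str'; the exact payload layout may differ between versions, and the task id and table title are placeholders:

    import json
    import pandas as pd
    from clearml import Task

    t = Task.get_task(task_id="<task-id>")          # placeholder task id
    for p in t.get_reported_plots():
        if p.get("metric") == "table name":          # the title passed to report_table
            plot = json.loads(p["plot_str"])         # assumption: plotly figure as JSON
            header = plot["data"][0]["header"]["values"]  # column names
            cells = plot["data"][0]["cells"]["values"]    # one list per column
            df = pd.DataFrame({str(h): col for h, col in zip(header, cells)})
            print(df)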

3 years ago
0 Can Anyone Help Me About This?

Hi @<1523708901155934208:profile|SubstantialBaldeagle49>
If you report on the same iteration with the same title/series you are essentially overwriting the data (as expected)
Regarding the plotly report size.
Two options:

  • Round down the numbers (by default it will store all the digits; usually anything after the fourth digit is quite useless, and rounding will drastically decrease the plot size)
  • Use logger.report_scatter2d , it is more efficient and has a mechanism to subsample extremely large graphs (a minimal sketch follows below).
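For illustration, a minimal report_scatter2d sketch (the project/task names and the random data are placeholders):

    import numpy as np
    from clearml import Task

    task = Task.init(project_name="examples", task_name="scatter demo")
    logger = task.get_logger()
    # round the values to a few digits so the stored plot stays small
    xy = np.round(np.random.random((10000, 2)), 4)
    logger.report_scatter2d(
        title="my scatter", series="series A", iteration=0,
        scatter=xy, xaxis="x", yaxis="y", mode="markers",
    )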
4 years ago
0 Hello! I Don't Know If It Is The Right Place To Ask About It, But Maybe Someone Else Has Faced The Same Problem. I Created Task "My_Task" From Branch "My_Branch" With "My_Commit_Id", Then I Merged "Another_Branch" Into "Master". After Merging Clearml-Agent Can

Hi RobustHippopotamus53
The way "latest from branch" works:
On the Task you specify the branch name (e.g. "master", no need to add the origin/ prefix). The agent then pulls the latest commit from that branch and updates the Task back to the current commit ID (the latest on the branch at the time of execution). This process ensures reproducibility and traceability, as we can always be certain of the exact commit that was executed. Could it be that you force-pushed a commit/squash, hence the "origina...

3 years ago
0 I Wanted To Ask About K8S + Clearml-Agent Integration. Details In The Thread.

K8s + clearml-agent integration.

Hmm is this an on-prem k8s cluster?

2 years ago
0 Hi

Not sure I follow, you mean to launch it on the Kubernetes cluster from the ClearML UI?
(like the clearml-k8s-glue ?)

2 years ago
0 Hi. Is There A Way To Make Hyperparameters/Any Part Of The Form Become A Dropdown List When In Draft Mode On The ClearML UI? Like We Want To Set It Using The UI But With Limited Options In A Dropdown List.

Hmm, so currently you can provide help, so users know what they can choose from, but there is no way to limit it.
I know the Enterprise version has something similar that allows users to create a custom "application" from a Task; there you can define a drop-down and such, but that might be overkill here, wdyt?

one year ago
0 I Am Starting To Use Clearml-Data, And I Have A Feature Request - Add A Progress Bar For The Upload Phase / Log Which Files Are Uploaded / Add Upload Speed. Currently When Uploading Large Amounts Of Data, We Get An Obscure Message Of

The issue is upload reporting for HTTP uploads (object storage will report upload progress). Basically the HTTP upload is a POST with urllib, which does not support upload callbacks for progress reporting. If you have an idea here, we will gladly add it (as you mentioned, it can be quite annoying to have to open a network manager to verify the upload is progressing)

3 years ago
0 Hello. I'm Interested In The Dynamic GPU Feature, But I Can't Find Any Information On How It Works. Can You Help Me With It? Is It Possible To Try It Somewhere?

ItchyJellyfish73
Unfortunately this needs backend support, and is only available in the enterprise version. What is your use case for it? (It was designed to allow out-of-the-box bare-metal multi-GPU dynamic allocation; think a DGX with 8 GPUs where, instead of spinning down agents when you want to change the queue->num-gpu mapping, you can do it on the fly.)

3 years ago
0 I Have Setup A

Q. Would someone mind outlining what the steps are to configure the default storage locations, such that any artefacts or data which are pushed to the server are stored by default on the Azure Blob Store?

Hi VivaciousPenguin66
See my reply here on configuring the default output uri on the agent: https://clearml.slack.com/archives/CTK20V944/p1621603564139700?thread_ts=1621600028.135500&cid=CTK20V944
Regarding permission setup:
You need to make sure you have the Azure blob credenti...
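For reference, the per-task equivalent of a default output URI is a one-liner. The account and container names below are placeholders, and the matching Azure credentials still have to be configured in clearml.conf on both the client and the agent:

    from clearml import Task

    task = Task.init(
        project_name="examples",
        task_name="azure output demo",
        # placeholder URI: artifacts and models the task uploads go to this container
        output_uri="azure://<account-name>/<container-name>",
    )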

3 years ago
0 I Have A Little Bit Of Code That Goes Like:

Are you seeing the argparse arguments in the UI (when running locally) ?

3 years ago
0 I Have Setup A

I assume the account name and key refer to the storage account credentials that you can get from Azure Storage Explorer?

correct

3 years ago
0 I Have A Little Bit Of Code That Goes Like:

ElegantCoyote26
    parser = get_parser()
    args_ = vars(parser.parse_args())
    task.connect(args_)
There is no need to connect args_ , Task.init will automatically catch the argparser.
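For illustration, a minimal sketch of the automatic argparse capture (project/task names and the argument are placeholders):

    import argparse
    from clearml import Task

    parser = argparse.ArgumentParser()
    parser.add_argument("--lr", type=float, default=0.001)

    # Task.init() hooks argparse, so parse_args() is logged without an explicit task.connect()
    task = Task.init(project_name="examples", task_name="argparse demo")
    args = parser.parse_args()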

3 years ago
0 Hello All, We’Re Trying To Use

Is there still an issue? Could it be the browser cannot access the file server directly?

one year ago
0 Hello There! I Was Trying To Update The Url For Debug Samples After Migration Of The Server To A New Domain And Was Following The Steps From Here:

Hi @<1684010629741940736:profile|NonsensicalSparrow35>

But the provided command is missing the url target for the curl so it is not complete.

Not sure I followed. Did you specify "NEW_ADDRESS"?
Or is it that in both cases the URL is localhost?

5 months ago
0 Hi Community! I Have Difficulty Using Clearml Pipeline. I Am Writing The Code Using The Pipeline Decorator, But The Pipeline Does Not Work With The Following Error When Specifying The Docker Image As An Argument Of The Decorator. How Should I Solve It?

Thanks @<1634001106403069952:profile|DefeatedMole42>
A follow up: (1) how are you spinning up the agent? (2) could it be that the docker image "ultralytics/yolov5" does not have Bash as its entry point?
you can force that with

@PipelineDecorator.component(return_values=['int'], cache=False,
                             task_type='training',
                             docker="ultralytics/yolov5",
                             docker_args="--entrypoint /bin/bash",
                             pa...
10 months ago
0 Hi Everyone, I'm Running Into A Weird Error When Trying To Clone And Run A Task That Has Completed Successfully. I Have A Test Task That Loads A Dummy Dataset And Trains A Toy Model With Pytorch. When Running Remotely, I Use My Own Docker Image That Has

@<1533620191232004096:profile|NuttyLobster9> I think we found the issue: when you pass a direct link to the python venv, the agent fails to detect the python version, and since the python version is required for fetching the correct torch it fails to install it. This is why passing CLEARML_AGENT_PACKAGE_PYTORCH_RESOLVE=none helped, because it skips resolving the torch / cuda version (which requires parsing the python version).

5 months ago
0 Hello, I Am Trying To Run Some Algorithm In My Docker Container With A ClearML Task. But The Algorithm Uses ROS, So I Somehow Need To Set Up The Environment Before Running It And Launch
  • but the

pytorch/main.py

file doesn't run.

What do you have on the Task itself? Is this the correct script?
Any chance you can send a full log ? (you can DM it if it helps)

one year ago