SlipperySheep79
Moderator
16 Questions, 35 Answers
  Active since 19 May 2023
  Last activity 5 months ago

Reputation: 0
Badges (1): 35 × Eureka!
0 Votes
5 Answers
560 Views
Hi all, I was trying to reduce the amount of logs shown in the console produced by tqdm, so I set console_cr_flush_period: 30 . It works correctly when I run...
11 months ago
0 Votes
16 Answers
727 Views
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.
one year ago
0 Votes
3 Answers
704 Views
one year ago
0 Votes
3 Answers
645 Views
Is it possible to get the list of running agents in my machine? I’m starting multiple agents using python -m clearml_agent daemon --queue XX --detach , but i...
11 months ago
0 Votes
4 Answers
636 Views
Hi, is there a way to query some tasks ordered by a scalar metric? I tried with: Task.get_tasks( project_name='project', task_name='task', task_filter={'orde...
one year ago
0 Votes
5 Answers
616 Views
one year ago
0 Votes
0 Answers
544 Views
Hey all, any suggestion for this issue?
one year ago
0 Votes
2 Answers
682 Views
Hi! I'm registering a pandas dataframe using register_artifact . I can see the preview under the ARTIFACTS tab, but when I try to download the csv file from ...
one year ago
0 Votes
2 Answers
283 Views
5 months ago
0 Votes
2 Answers
693 Views
When I run a clearml_agent in docker mode, it automatically mounts the local agent’s config into the docker image as volume: in the logs I see these options ...
11 months ago
0 Votes
10 Answers
303 Views
Hi everyone! Is there a way to specify the working directory in a pipeline component? I’m using pipelines from decorators, I can set the repo url just fine, ...
5 months ago
0 Votes
4 Answers
721 Views
Is there a way to tag a task/dataset "folder" with the SDK? In the UI there is an option to add a tag to a certain project folder, but in python it seems tha...
one year ago
0 Votes
4 Answers
695 Views
one year ago
0 Votes
11 Answers
655 Views
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipel...
one year ago
0 Votes
2 Answers
638 Views
Are nested pipeline components supported? E.g. having a component call other pipeline components? I tried to run this snippet: from clearml import PipelineDec...
one year ago
0 Votes
2 Answers
638 Views
Hi all, is it possible to define some kind of conditional retry logic when using pipelines from decorator? I’m using the retry_on_failure option on the @Pipe...
12 months ago
Hi everyone! Is there a way to specify the working directory in a pipeline component? I'm using pipelines from decorators, I can set the repo url just fine, but I'm running everything from a subfolder, and the working dir is set to

In the meantime, any suggestion on how to set the working_dir in any other way? We are moving to this new code structure and I’d like to have clearml up and running

5 months ago
Hi, is there a way to query some tasks ordered by a scalar metric? I tried with:

Hi @<1523701087100473344:profile|SuccessfulKoala55> , thanks for the answer, I'll try that. Would you suggest any other simpler way to achieve the same result? I just want to get the best model according to a logged metric.
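A simpler route, until server-side ordering is sorted out, is to fetch the candidate tasks and sort them client-side. The sketch below is a hedged suggestion, not an official recipe: `best_by_metric` is a hypothetical helper, and it assumes metrics shaped like the nested dict that `task.get_last_scalar_metrics()` returns ({title: {series: {"last": value}}}).

```python
# Sketch: pick the best task by a logged scalar, sorting client-side.
# The nested dict shape below mirrors what ClearML's
# task.get_last_scalar_metrics() returns: {title: {series: {"last": value}}}.

def best_by_metric(tasks_metrics, title, series, maximize=True):
    """tasks_metrics: dict mapping task_id -> nested metrics dict.

    Returns the task_id with the best 'last' value for title/series,
    skipping tasks that never logged that scalar."""
    def last_value(task_id):
        return tasks_metrics[task_id][title][series]["last"]

    candidates = [t for t in tasks_metrics
                  if series in tasks_metrics[t].get(title, {})]
    return max(candidates, key=last_value) if maximize else min(candidates, key=last_value)

# With the real SDK this would be fed roughly like (untested sketch):
#   tasks = Task.get_tasks(project_name='project', task_name='task')
#   tasks_metrics = {t.id: t.get_last_scalar_metrics() for t in tasks}
#   best_id = best_by_metric(tasks_metrics, 'validation', 'accuracy')
```

This trades a few extra API calls for not having to build the hashed metric keys that server-side ordering expects.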

one year ago
Hi everyone! Is there a way to specify the working directory in a pipeline component? I'm using pipelines from decorators, I can set the repo url just fine, but I'm running everything from a subfolder, and the working dir is set to

Hi @<1523701205467926528:profile|AgitatedDove14> , in my case all the code is in a subfolder, like projects/main , so if I run from the git root it can’t find the local modules

5 months ago
Is it possible to get the list of running agents in my machine? I'm starting multiple agents using

Hi @<1523701070390366208:profile|CostlyOstrich36> , thanks, but in this case I'd also like to get the ids of the running workers, so that I can selectively stop some of them. Is that possible somehow?
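One option is server-side, via `APIClient().workers.get_all()`, which returns the registered workers with their ids. A purely local fallback is to parse the machine's process list; the helper below is a hypothetical sketch of that fallback (the function name and the use of a POSIX `ps` are assumptions, not ClearML API).

```python
import re

def agent_pids(ps_output):
    """Parse `ps -eo pid,command` style output and return the PIDs of
    clearml_agent daemon processes (local fallback; hypothetical helper)."""
    pids = []
    for line in ps_output.splitlines():
        m = re.match(r"\s*(\d+)\s+(.*)", line)
        if m and "clearml_agent daemon" in m.group(2):
            pids.append(int(m.group(1)))
    return pids

# Usage (assumes a POSIX `ps` is available):
#   import subprocess
#   out = subprocess.run(["ps", "-eo", "pid,command"],
#                        capture_output=True, text=True).stdout
#   agent_pids(out)  # PIDs you can then stop selectively with `kill`
```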

11 months ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

Hi @<1523701087100473344:profile|SuccessfulKoala55> , I'm uploading some debug images, but they are around 300KB each, and less than 10 per experiment. Also, aren't debug images counted as artifacts for the quota?

one year ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

I have some git diffs logged but they are very small. For the configurations I saw that the datasets tasks have a fairly large "Dataset Content" config (~2MB), but I only have 5 dataset tasks

one year ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

Hi @<1523701205467926528:profile|AgitatedDove14> , I already tried to check manually in the web UI for some anomalous file, i.e. by downloading the log files or exporting the metrics plots, but I couldn't find anything that takes more than 100KB, and I'm already at 300MB of usage with just 15 tasks. Is it possible to get more info using some Python APIs?

one year ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

I deleted a few experiments, but they had the same kind of plots and metrics, so I don't think deleting them would free up much space

one year ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

So the longest experiment I have takes ~800KB in logs. I have tens of plotly plots logged manually; how are they stored internally? I tried to export them to JSON and they don't take more than 50KB each, but maybe they take more space internally?
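One way to audit this locally is to serialize each plot the way it would be reported and measure the payload. This only gives a lower-bound estimate, since how the server stores plots internally is an assumption here; `payload_kb` is a hypothetical helper, not a ClearML API.

```python
import json

def payload_kb(obj):
    """Rough size, in KB, of an object once JSON-serialized, which should
    be close to what gets sent (and counted) when a plot is reported.
    Server-side overhead is not included, so treat this as a lower bound."""
    return len(json.dumps(obj).encode("utf-8")) / 1024

# Usage with plotly (untested sketch): fig.to_plotly_json() gives the
# dict that would be serialized, so payload_kb(fig.to_plotly_json())
# estimates that figure's contribution to the metrics quota.
```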

one year ago
Are nested pipeline components supported? E.g. having a component call other pipeline components? I tried to run this snippet:

Basically I want to run a function in parallel, and have that function create multiple tasks. So I was thinking of setting up a pipeline with the hierarchy main -> parallelized_function -> init_task_function . But I guess I could also just call Task.create in init_task_function and achieve the same result

one year ago
Hi everyone! Is there a way to specify the working directory in a pipeline component? I'm using pipelines from decorators, I can set the repo url just fine, but I'm running everything from a subfolder, and the working dir is set to

This would work to load the local modules, but I’m also using poetry and the pyproject.toml is in the subdirectory, so the agent won’t install any dependency if I don’t set the work_dir

5 months ago
Another question about pipelines: is it possible to run a component on the same machine as the main pipeline controller? I have a function that is rather fast to execute, so I don't want to start a separate container just for it, but I'd like to track i

What is not clear to me is how you would use the callbacks to run the step locally. Are there some properties that need to be set in the task? I see that there is a start_controller_locally option for the main @PipelineDecorator.pipeline , but I don't see it for @PipelineDecorator.component

one year ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

@<1523701435869433856:profile|SmugDolphin23> then the issue is that config is not set. I also tried with:

import yaml
import argparse
from my_pipeline.pipeline import run_pipeline
from clearml import Task

parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)

if __name__ == '__main__':
    if Task.running_locally():
        args = parser.parse_args()
        with open(args.config) as f:
            config = yaml.load(f, yaml.FullLoader)
    else:
        ...
one year ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

Also: what's the purpose of storing the pipeline arguments as artifacts then? When it runs remotely it still runs the main script as entrypoint and not the pipeline function directly, so all the arguments will be replaced by what is passed to the function during the remote execution, right?

one year ago
Is there a way to specify the project and name of a pipeline defined with decorators at runtime? I tried to change the properties of the current task, the name is updated correctly but when I try to move the task to another project it disappears from the

I think I found a solution using pipeline_task.move_to_project(new_project_name=f'{config.project_id}/.pipelines/{config.run_name}', system_tags=['hidden', 'pipeline'])

one year ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

Hi @<1523701087100473344:profile|SuccessfulKoala55> , I think the issue is where to put the connect_configuration call. I can't put it inside run_pipeline because it's only running remotely and it doesn't have access to the file, and I can't put it in the script before the call to run_pipeline since the task has not been initialized yet.

one year ago
Hi everyone! I have a question regarding a specific use case for tasks. To run hyperparam optimization I have a function that evaluates a model on a bunch of videos and outputs a metric. I would like to log somewhere the results, so that I can then easil

So the issue is that I would like to keep the list of hyperparams and metrics; if I clean them up then I would lose them. But I agree that I might be overthinking it

one year ago
Is there a way to tag a task/dataset "folder" with the SDK? In the UI there is an option to add a tag to a certain project folder, but in python it seems that it can be done only on single task runs.

Hi @<1523701087100473344:profile|SuccessfulKoala55> , thanks, and how can I get the "id" to use with update for the dataset folder case?

one year ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

Would just having some python API be an option? It would be more than enough to check what is causing this, and it would be called infrequently

one year ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

For instance, I have in my_pipeline/__main__.py :

import yaml
import argparse
from my_pipeline.pipeline import run_pipeline

parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)

if __name__ == '__main__':
    args = parser.parse_args()
    with open(args.config) as f:
        config = yaml.load(f, yaml.FullLoader)
    run_pipeline(config)

and in my_pipeline/pipeline.py :

@PipelineDecorator.pipeline(
    name='Main',
    project=...
one year ago
Is there a way to specify the project and name of a pipeline defined with decorators at runtime? I tried to change the properties of the current task, the name is updated correctly but when I try to move the task to another project it disappears from the

Yes these are the only actions. The task is moved correctly though, I can see it under f'{config.project_id}/.pipelines' in the UI; the issue is that it's not visible under PIPELINES . I haven't tried with tasks or functions pipelines yet.

one year ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

Hi @<1523701435869433856:profile|SmugDolphin23> , I just tried it but Task.current_task() returns None even when running in remotely

one year ago
Hi all, I was trying to reduce the amount of logs shown in the console produced by tqdm, so I set

I just have some for loops in some pipeline components, when processing some files. I know it increases the flush interval, and it works when run locally: I only see a new line from tqdm every ~30s. It's just that when I run the same script in docker using the agent I get a new line every ~5s
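For context, the behavior described matches a simple time-based throttle on carriage-return flushes. Below is a minimal sketch of such a throttle, with an injectable clock so the logic is easy to verify; the class name and structure are hypothetical, not ClearML's actual implementation.

```python
import time

class ThrottledProgress:
    """Emit at most one progress line every `period` seconds,
    mimicking a console_cr_flush_period-style throttle (sketch only)."""

    def __init__(self, period, clock=time.monotonic, write=print):
        self.period = period
        self.clock = clock        # injectable for testing
        self.write = write
        self._last = float("-inf")

    def update(self, message):
        """Write `message` if enough time has passed; return whether it was written."""
        now = self.clock()
        if now - self._last >= self.period:
            self.write(message)
            self._last = now
            return True
        return False
```

With period=30 this emits at most one line every 30 seconds, which matches the local behavior described above; the open question is why the agent's docker run behaves as if a much shorter period were in effect.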

11 months ago