SlipperySheep79
Moderator
16 Questions, 35 Answers
Active since 19 May 2023
Last activity one month ago

Reputation: 0
Badges: 1
35 × Eureka!
0 Votes 2 Answers 519 Views
When I run a clearml_agent in docker mode, it automatically mounts the local agent’s config into the docker image as volume: in the logs I see these options ...
8 months ago
0 Votes 11 Answers 490 Views
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipel...
9 months ago
0 Votes 4 Answers 497 Views
Hi, is there a way to query some tasks ordered by a scalar metric? I tried with: Task.get_tasks( project_name='project', task_name='task', task_filter={'orde...
10 months ago
0 Votes 5 Answers 472 Views
10 months ago
0 Votes 10 Answers 131 Views
Hi everyone! Is there a way to specify the working directory in a pipeline component? I’m using pipelines from decorators, I can set the repo url just fine, ...
2 months ago
0 Votes 2 Answers 122 Views
one month ago
0 Votes 0 Answers 416 Views
Hey all, any suggestion for this issue?
9 months ago
0 Votes 3 Answers 473 Views
Is it possible to get the list of running agents in my machine? I’m starting multiple agents using python -m clearml_agent daemon --queue XX --detach , but i...
8 months ago
0 Votes 5 Answers 428 Views
Hi all, I was trying to reduce the amount of logs shown in the console produced by tqdm, so I set console_cr_flush_period: 30. It works correctly when I run...
8 months ago
0 Votes 2 Answers 476 Views
Hi all, is it possible to define some kind of conditional retry logic when using pipelines from decorator? I’m using the retry_on_failure option on the @Pipe...
8 months ago
0 Votes 4 Answers 538 Views
9 months ago
0 Votes 2 Answers 512 Views
Hi! I'm registering a pandas dataframe using register_artifact . I can see the preview under the ARTIFACTS tab, but when I try to download the csv file from ...
9 months ago
0 Votes 2 Answers 486 Views
Are nested pipeline component supported? e.g. having a component call other pipeline components? I tried to run this snippet: from clearml import PipelineDec...
9 months ago
0 Votes 3 Answers 541 Views
10 months ago
0 Votes 16 Answers 549 Views
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.
10 months ago
0 Votes 4 Answers 551 Views
Is there a way to tag a task/dataset "folder" with the SDK? In the UI there is an option to add a tag to a certain project folder, but in python it seems tha...
9 months ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

Also: what's the purpose of storing the pipeline arguments as artifacts then? When it runs remotely it still runs the main script as entrypoint and not the pipeline function directly, so all the arguments will be replaced by what is passed to the function during the remote execution, right?

9 months ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

Hi @<1523701435869433856:profile|SmugDolphin23> , I just tried it but Task.current_task() returns None even when running remotely

9 months ago
Another question about pipelines: is it possible to run a component on the same machine as the main pipeline controller? I have a function that's rather fast to execute, so I don't want to start a separate container just for it, but I'd like to track i

What is not clear to me is how you would use the callbacks to run the step locally. Are there some properties that need to be set in the task? I see that there is a start_controller_locally option for the main @PipelineDecorator.pipeline , but I don't see it for @PipelineDecorator.component

9 months ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

Hi @<1523701087100473344:profile|SuccessfulKoala55> , I'm uploading some debug images but they are around 300KB each, and less than 10 per experiment. Also, aren't debug images counted as artifacts for the quota?

10 months ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

I have some git diffs logged but they are very small. For the configurations I saw that the datasets tasks have a fairly large "Dataset Content" config (~2MB), but I only have 5 dataset tasks

10 months ago
Is there a way to specify the project and name of a pipeline defined with decorators at runtime? I tried to change the properties of the current task, the name is updated correctly but when I try to move the task to another project it disappears from the

Yes these are the only actions. The task is moved correctly though, I can see it under f'{config.project_id}/.pipelines' in the UI, the issue is that it's not visible under PIPELINES . I haven't tried with tasks or functions pipelines yet.

10 months ago
Is there a way to tag a task/dataset "folder" with the SDK? In the UI there is an option to add a tag to a certain project folder, but in python it seems that it can be done only on single task runs.

Hi @<1523701087100473344:profile|SuccessfulKoala55> , thanks, and how can I get the "id" to use with update for the dataset folder case?
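
For reference, a hedged sketch of how this might look through the generic backend APIClient (the project name and tag below are hypothetical, and using the projects.get_all / projects.update services for this is my assumption, not something confirmed in the thread):

from clearml.backend_api.session.client import APIClient

client = APIClient()
# look up the project "folder" by name to obtain its id (hypothetical name)
project = client.projects.get_all(name='my_project/my_dataset')[0]
# update the project's tags using that id
client.projects.update(project=project.id, tags=['my-tag'])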

9 months ago
Hi everyone! Is there a way to specify the working directory in a pipeline component? I’m using pipelines from decorators, I can set the repo url just fine, but I’m running everything from a subfolder, and the working dir is set to

Hi @<1523701205467926528:profile|AgitatedDove14> , in my case all the code is in a subfolder, like projects/main , so if I run from the git root it can’t find the local modules

2 months ago
Hi everyone! Is there a way to specify the working directory in a pipeline component? I’m using pipelines from decorators, I can set the repo url just fine, but I’m running everything from a subfolder, and the working dir is set to

In the meantime, any suggestion on how to set the working_dir in any other way? We are moving to this new code structure and I’d like to have clearml up and running

2 months ago
Hi everyone! Is there a way to specify the working directory in a pipeline component? I’m using pipelines from decorators, I can set the repo url just fine, but I’m running everything from a subfolder, and the working dir is set to

This would work to load the local modules, but I’m also using poetry and the pyproject.toml is in the subdirectory, so the agent won’t install any dependency if I don’t set the work_dir
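
One possible direction, as a hedged sketch only (whether Task.set_script is available in the installed SDK version and honored for pipeline components is an assumption; 'projects/main' is the subfolder mentioned in this thread):

from clearml import Task

# run inside the pipeline/component code to patch the recorded script info
task = Task.current_task()
task.set_script(working_dir='projects/main')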

one month ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

Hi @<1523701205467926528:profile|AgitatedDove14> , I already tried to check manually in the web UI for some anomalous file, i.e. by downloading the log files or exporting the metrics plots, but I couldn't find anything that takes more than 100KB, and I'm already at 300MB of usage with just 15 tasks. Isn't it possible to get more info using some Python APIs?
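
In the absence of an official per-task quota breakdown, a hedged sketch of estimating each task's metrics footprint from the SDK (the project name is hypothetical; Task.get_tasks and get_reported_scalars are standard SDK calls, but the size estimate is only a rough lower bound):

import json
from clearml import Task

for task in Task.get_tasks(project_name='project'):  # hypothetical project
    scalars = task.get_reported_scalars()
    # serializing to JSON gives a rough lower bound on the stored scalar size
    size_kb = len(json.dumps(scalars)) / 1024
    print(f'{task.name}: ~{size_kb:.0f} KB of scalar data')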

10 months ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

Hi @<1523701087100473344:profile|SuccessfulKoala55> , I think the issue is where to put the connect_configuration call. I can't put it inside run_pipeline because it's only running remotely and it doesn't have access to the file, and I can't put it in the script before the call to run_pipeline since the task has not been initialized yet.

9 months ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

So the longest experiment I have takes ~800KB in logs. I have tens of plotly plots logged manually; how are they stored internally? I tried to export them to json and they don't take more than 50KB each, but maybe they take more space internally?

10 months ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

For instance, I have in my_pipeline/__main__.py:

import yaml
import argparse
from my_pipeline.pipeline import run_pipeline

parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)

if __name__ == '__main__':
    args = parser.parse_args()
    with open(args.config) as f:
        config = yaml.load(f, yaml.FullLoader)
    run_pipeline(config)

and in my_pipeline/pipeline.py:

@PipelineDecorator.pipeline(
    name='Main',
    project=...
9 months ago
Hi! I'm registering a pandas dataframe using

Yes I can read it using this. I was just wondering if there is a way to read the file downloaded directly from the UI
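
For reference, a minimal sketch of the SDK route presumably meant by "this" (the task id and artifact name are hypothetical):

from clearml import Task

task = Task.get_task(task_id='aabbccdd11223344')  # hypothetical task id
# .get() deserializes the registered artifact back into a pandas DataFrame
df = task.artifacts['data'].get()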

9 months ago
Hi, I have an issue when running a pipeline controller remotely in docker. Basically I have a module that reads a config file into a dict and calls the pipeline controller, like

@<1523701435869433856:profile|SmugDolphin23> then the issue is that config is not set. I also tried with:

import yaml
import argparse
from my_pipeline.pipeline import run_pipeline
from clearml import Task

parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)

if __name__ == '__main__':
    if Task.running_locally():
        args = parser.parse_args()
        with open(args.config) as f:
            config = yaml.load(f, yaml.FullLoader)
    else:
        ...
9 months ago
Is there a way to specify the project and name of a pipeline defined with decorators at runtime? I tried to change the properties of the current task, the name is updated correctly but when I try to move the task to another project it disappears from the

I think I found a solution using pipeline_task.move_to_project(new_project_name=f'{config.project_id}/.pipelines/{config.run_name}', system_tags=['hidden', 'pipeline'])
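
Spelled out, the workaround looks roughly like this (config is the config object from this thread; fetching the pipeline's task via Task.current_task() inside the pipeline function is an assumption):

from clearml import Task

pipeline_task = Task.current_task()
# move the controller task into the hidden .pipelines subproject
pipeline_task.move_to_project(
    new_project_name=f'{config.project_id}/.pipelines/{config.run_name}',
    system_tags=['hidden', 'pipeline'],
)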

10 months ago
Hi everyone! I have a question regarding a specific use case for tasks. To run hyperparam optimization I have a function that evaluates a model on a bunch of videos and outputs a metric. I would like to log somewhere the results, so that I can then easil

So the issue is that I would like to keep the list of hyperparams and metrics; if I clean them up then I would lose them. But I agree that I might be overthinking it

10 months ago
Hi all, I was trying to reduce the amount of logs shown in the console produced by tqdm, so I set

I just have some for loops in some pipeline components, when processing some files. I know it increases the flush interval, and it's working when run locally: I only see a new line from tqdm every ~30s. It's just that when I run the same script in docker using the agent, I get a new line every ~5s

8 months ago
Is it possible to get the list of running agents in my machine? I’m starting multiple agents using

Hi @<1523701070390366208:profile|CostlyOstrich36> , thanks but in this case I’d like to get also the ids of the running workers, so that I can selectively stop some of them. Is it possible somehow?
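
A hedged sketch of listing worker ids programmatically through the generic APIClient workers service (filtering by hostname is my assumption, not something confirmed in the thread):

import socket
from clearml.backend_api.session.client import APIClient

client = APIClient()
for worker in client.workers.get_all():
    # worker ids typically embed the machine name, e.g. 'myhost:0'
    if socket.gethostname() in worker.id:
        print(worker.id)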

8 months ago
Hi everyone! Is there a way to force an agent to clear the clearml cache folder every time it’s done with a task? I have a machine running multiple agents for training, but the disk space gets filled up quickly since every new task uses a new dataset. I s

Hi @<1523701070390366208:profile|CostlyOstrich36> , yes it's specifically with datasets. Probably the option I need is size.max_used_bytes but it looks like it's available only for the enterprise plan? Is there any other way to clean the cache after each task?
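
Absent a built-in option, a hedged sketch of purging the local cache between tasks (assumes the default cache location ~/.clearml/cache; adjust if clearml.conf overrides it):

import shutil
from pathlib import Path

cache_dir = Path.home() / '.clearml' / 'cache'
if cache_dir.exists():
    # remove the whole local cache; it is re-populated on next use
    shutil.rmtree(cache_dir)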

one month ago
Hi, is there a way to get the quota used by each task? My "metrics" quota is filling up very quickly and I would like to understand what's causing it.

Would just having some python API be an option? It would be more than enough to check what is causing this, and it would be called infrequently

10 months ago