FierceHamster54
Moderator
29 Questions, 178 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0

Badges (1): 131 × Eureka!
0 Votes 9 Answers 1K Views
Hello everyone! I set up a GCP autoscaler on my Pro SaaS deployment, but I keep getting this error: clearml_agent: ERROR: Server does not support --use-owner-t...
2 years ago
0 Votes 25 Answers 1K Views
Hey, trying to figure out how to create an https://clear.ml/docs/latest/docs/clearml_sdk/model_sdk#output-models , the doc says it needs a TaskId but my trai...
2 years ago
0 Votes 1 Answer 1K Views
Hey is ClearML MLFlow based ? Is it exposed to CVE-2023-1176 and CVE-2023-1177 ?
one year ago
0 Votes 13 Answers 1K Views
2 years ago
0 Votes 1 Answer 1K Views
Hey is .get_local_copy() thread-safe ? I mean can I concurrently download several datasets without breaking the cache and StorageManager ?
2 years ago
0 Votes 2 Answers 1K Views
Hey, it is said in the pipeline decorator example that the requirements for executing a pipeline component are inferred from the imports inside the component func...
2 years ago
0 Votes 10 Answers 1K Views
2 years ago
0 Votes 20 Answers 1K Views
Hey currently trying to run a pipeline locally to test a pipeline component with PipelineDecorator.run_locally() , first try returned a random pandas error, ...
2 years ago
0 Votes 12 Answers 1K Views
Hey has anyone managed to capture Darts logging with ClearML when using the temporal fusion transformers ? Even when overriding their trainer with a custom P...
one year ago
0 Votes 7 Answers 1K Views
Hey, I hope everyone is having a good day. Two quick questions about datasets: Does squashing two datasets delete the two original datasets? Is it possible ...
2 years ago
0 Votes 4 Answers 1K Views
2 years ago
0 Votes 1 Answer 1K Views
Heya, good day to everyone, I'm finding myself facing this random error with a very opaque backtrace when attempting to squash two distinct versions of the s...
2 years ago
0 Votes 9 Answers 1K Views
Hey just wanting to know: what is the recommended best practice to write ClearML Pipelines between controller and decorators ?
2 years ago
0 Votes 8 Answers 1K Views
2 years ago
0 Votes 8 Answers 842 Views
Hey everyone, I am having some difficulties passing environment variables to my pipeline components running on agents (1.6.1) without docker mode: - I set...
one year ago
0 Votes 5 Answers 1K Views
Hey, would it be possible to add a way to edit autoscaler configs without having to clone them ? This is really frustrating especially when you reached the q...
2 years ago
0 Votes 4 Answers 1K Views
2 years ago
0 Votes 6 Answers 932 Views
one year ago
0 Votes 1 Answer 1K Views
2 years ago
0 Votes 15 Answers 1K Views
Hey having an issue passing parameters to a component in a pipeline, the parameters appear to be None inside the component function: @PipelineDecorator.compo...
2 years ago
0 Votes 8 Answers 1K Views
Hey, is there a shortcut on the Dataset SDK to directly get the latest version of a dataset ?
2 years ago
0 Votes 7 Answers 1K Views
Hey, is there a way to pass docker args for the execution of a pipeline controller defined through decorator @PipelineDecorator.pipeline the same way we can ...
2 years ago
0 Votes 13 Answers 2K Views
Hey all, hope you're having a great day, having an unexpected behavior with a training task of a YOLOv5 model on my pipeline, I specified a task in my traini...
2 years ago
0 Votes 7 Answers 1K Views
Heya, is there any plan for ClearML to leverage the new https://developer.nvidia.com/blog/getting-the-most-out-of-the-a100-gpu-with-multi-instance-gpu/ tech ...
2 years ago
0 Votes 7 Answers 946 Views
Hey everyone, As a Pro-tier SaaS user, I'm experiencing a very high latency when finalizing a dataset, it is attached in a big dataset version hierarchy and ...
one year ago
0 Votes 15 Answers 1K Views
Heya, I hope you're all well in this beautiful day, my GCP Autoscaler just died with that strange but short backtrace, wondered if it rang a bell to any of y...
2 years ago
0 Votes 6 Answers 1K Views
Hey guys, I hope you all have a nice day. I had to use the Task method .setup_aws_upload(bucket=...,region=...) to overcome an incorrect region specified for...
2 years ago
0 Votes 4 Answers 1K Views
2 years ago
0 Votes 15 Answers 1K Views
Heya, trying to set up a GCP autoscaler for general-purpose CPU instances (e2-standard-4), but I run into this error: googleapiclient.errors.HttpError: And I...
2 years ago
0 Hey Just Wanting To Know: What Is The Recommended Best Practice To Write Clearml Pipelines Between Controller And Decorators ?

As opposed to the Controller/Task approach, where add_step() only allows executing them sequentially

2 years ago
0 Hey, Is There A Way To Pass Docker Args For The Execution Of A Pipeline Controller Defined Through Decorator

Talking about that decorator, which should also have a docker_arg param since it is executed as an "orchestration component", but the param is missing: https://clear.ml/docs/latest/docs/references/sdk/automation_controller_pipelinecontroller/#pipelinedecoratorpipeline
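
For reference, here is roughly what the per-component docker options being contrasted with look like. This is only a sketch, assuming the docker / docker_args / execution_queue parameters of PipelineDecorator.component; the image, args and queue names below are placeholders:

# Sketch only (assumed parameters, placeholder values): per-component docker
# settings, which is what the pipeline-level decorator was missing at the time.
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    docker="nvidia/cuda:11.6.2-runtime-ubuntu20.04",  # placeholder image
    docker_args="--shm-size=8g",                      # placeholder docker args
    execution_queue="default",                        # placeholder queue name
)
def train_step(epochs: int = 10):
    # The component body runs inside the container configured above.
    return epochs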

2 years ago
0 Hey Just Wanting To Know: What Is The Recommended Best Practice To Write Clearml Pipelines Between Controller And Decorators ?

Btw AgitatedDove14, is there a way to define parallel tasks and use the pipeline as an acyclic compute graph instead of simply sequential tasks?

2 years ago
0 Hey Currently Trying To Run A Pipeline Locally To Test A Pipeline Component With

I already deleted ~/.clearml/cache but I'll try deleting the entire folder

2 years ago
0 Hey, Trying To Figure Out How To Create An

My bad, the specified file did not exist since I forgot to raise an exception if the export command failed >< Well, I guess this is the reason; will test that on Monday

2 years ago
0 Hi, Is There Any Way To Upload Data To A Clearml Dataset Without Compression At All? I Have Very Small Text Files That Make Up A Dataset And Compression Seems To Take Most Of The Upload Time And It Provide Almost No Benefits W.R.T Size

The default compression parameter value is ZIP_MINIMAL_COMPRESSION; I guess you could check whether there is a tarball-only option, but most of the CPU time taken by the upload process is the generation of the hashes of the file entries anyway
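
For illustration, a minimal sketch of skipping compression on upload, assuming Dataset.upload() accepts the standard zipfile constants for its compression argument; the project and dataset names are placeholders:

# Minimal sketch (assumed compression argument, placeholder names):
# upload the dataset entries in a ZIP container without deflating them.
import zipfile

from clearml import Dataset

ds = Dataset.create(dataset_project="my_project", dataset_name="small_text_files")
ds.add_files(path="./data")                # many small text files
ds.upload(compression=zipfile.ZIP_STORED)  # store entries, no compression
ds.finalize()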

2 years ago
0 Hey Just Wanting To Know: What Is The Recommended Best Practice To Write Clearml Pipelines Between Controller And Decorators ?

Sure, but the same pattern can be achieved by explicitly using the PipelineController class and defining steps with .add_step() pointing to ClearML Task objects, right?

The decorators simply abstract away the controller, but both methods (decorators or controller/tasks) allow you to decouple your pipelines into steps, each having an independent compute target, right?

So basically, choosing one method or the other is only a question of best practice or style?
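
To make the comparison concrete, a rough sketch of that explicit Controller/Task pattern, assuming the referenced Tasks already exist; project, task and queue names are placeholders:

# Rough sketch (placeholder names): the same pipeline built explicitly with
# PipelineController.add_step() pointing at pre-existing Tasks.
from clearml import PipelineController

pipe = PipelineController(name="retrain-pipeline", project="examples", version="0.0.1")
pipe.add_step(
    name="prepare_data",
    base_task_project="examples",
    base_task_name="data_preparation",
)
pipe.add_step(
    name="train_model",
    parents=["prepare_data"],  # declares the dependency between the two steps
    base_task_project="examples",
    base_task_name="training",
)
pipe.start(queue="services")   # enqueue the controller itself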

2 years ago
0 Hey Having An Issue Passing Parameters To A Component In A Pipeline, The Parameters Appear To Be

Hey SuccessfulKoala55, I'm currently using clearml package version 1.7.1 and my server is a Pro SaaS deployment

2 years ago
0 Hey Having An Issue Passing Parameters To A Component In A Pipeline, The Parameters Appear To Be

Okay, I confirm that having default parameters fixes the issue, but it's kinda sad to have lost 3 days on that super weird behavior
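
For anyone hitting the same thing, a minimal sketch of the workaround described above, i.e. giving the component arguments default values; the function, queue and dates here are hypothetical:

# Minimal sketch (hypothetical names and values): component arguments with
# defaults, which avoided the injected parameters arriving as None in my case.
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["window"], execution_queue="default")
def generate_window(start_date: str = "2022-01-01", end_date: str = "2022-12-31"):
    # With defaults in place the pipeline-injected values resolve correctly.
    return f"{start_date}..{end_date}"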

2 years ago
0 Hi Folks! Is There A Way To Programmatically Create Users And Groups On A Server? If So, Can Someone Point Me To The Docs? Thanks

It doesn't seem so if you look at the REST API documentation; it might be available as an Enterprise plan feature

one year ago
0 Hey, Is There A Shortcut On The Dataset Sdk To Directly Get The Latest Version Of A Dataset ?

I would like, instead of having to:
- Fetch the latest dataset to get the current latest version
- Increment the version number
- Create and upload a new version of the dataset

To be able to:
- Select a dataset project by name
- Create a new version of the dataset by choosing what increment in the SEMVER standard I would like to add for this version number (major/minor/patch), and upload
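
As a stopgap, a minimal sketch of the closest thing I'm aware of today: Dataset.get() with only a project and name, which should resolve to the latest version; the names below are placeholders:

# Minimal sketch (placeholder names): fetch the latest version of a dataset,
# then create a child version from it explicitly.
from clearml import Dataset

latest = Dataset.get(dataset_project="my_project", dataset_name="my_dataset")
print(latest.id)

child = Dataset.create(
    dataset_project="my_project",
    dataset_name="my_dataset",
    parent_datasets=[latest.id],  # explicit parent; no SEMVER-style shortcut
)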

2 years ago
0 Hello Everyone! I Setup A Gcp Autoscaler On My Pro Saas Deployment But I Keep Getting This Error:

Okay thanks! Please keep me posted when the hotfix is out on the SaaS

2 years ago
0 Hey, About The Dependency Propagation Of Pipeline Components, If I Call A Vanilla Python Function From A Component Does The Dependencies Specified In The Internal Imports Propagated To This Function Call Too ? And Additionally If That Function Is In Anoth

Okay, it looks like the call dependency resolver does not support cross-file calls and relies instead on the local repo cloning feature to handle multiple files, so Task.force_store_standalone_script() does not allow for a pipeline defined across multiple files (now that you think of it, it was kinda implied by the name). But what is interesting is that calling an auxiliary function in the SAME file from a component also raises a NameError: <function_name> is not defined , that's ki...
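
A possible workaround, if I'm reading the SDK right, is the helper_functions argument of the component decorator, which ships an auxiliary function from the same file alongside the component; the names here are illustrative only:

# Illustrative sketch (assumes helper_functions support): make an auxiliary
# function defined in the same file callable inside the component.
from clearml.automation.controller import PipelineDecorator

def normalize(values):
    top = max(values)
    return [v / top for v in values]

@PipelineDecorator.component(helper_functions=[normalize], execution_queue="default")
def preprocess(values=(1, 2, 3)):
    # Without helper_functions this call raised NameError when run as a Task.
    return normalize(list(values))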

2 years ago
0 Heya, I Hope You'Re All Well In This Beautiful Day, My Gcp Autoscaler Just Died With That Strange But Short Backtrace, Wondered If It Rang A Bell To Any Of You ?

Another crash on the same autoscaler instance:
2022-11-04 15:53:54
2022-11-04 14:53:50,393 - usage_reporter - INFO - Sending usage report for 60 usage seconds, 1 units
2022-11-04 14:53:51,092 - clearml.Auto-Scaler - INFO - 2415066998557416558 console log:
Nov 4 14:53:29 clearml-worker-9357f6985dcc4f3c9d44b32a9ac2e09b systemd[1]: var-lib-docker-overlay2-b04bca4c99cf94c31a3644236d70727aaa417fa4122e1b6c012e0ad908af24ef\x2dinit-merged.mount: Deactivated successfully.
Nov 4 14:53:29 clearml-w...

2 years ago
0 Heya, I Hope You'Re All Well In This Beautiful Day, My Gcp Autoscaler Just Died With That Strange But Short Backtrace, Wondered If It Rang A Bell To Any Of You ?

Hey CostlyOstrich36, I got another occurrence of the autoscaler crash with a similar backtrace, any updates on this issue?
2022-11-04 11:46:55
2022-11-04 10:46:51,644 - clearml.Auto-Scaler - INFO - 5839398111025911016 console log:
Starting Cleanup of Temporary Directories...
Nov 4 10:46:46 clearml-worker-deb01e0837bb4b00865e4e72c90586c4 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 4 10:46:46 clearml-worker-deb01e0837bb4b00865e4e72c90586c4 systemd[1]: systemd-tmpfiles...

2 years ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

I had the same issue on some of my components, and I had to specify them in packages=["package-1", "package-2", ...] in my @PipelineDecorator.component() decorator parameters
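
Roughly what that looks like, with illustrative package names and queue; the packages list should be whatever your component actually imports:

# Rough sketch (illustrative names): pin the packages a component needs
# explicitly instead of relying on import-based inference.
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    packages=["pandas>=1.4", "scikit-learn"],  # placeholder requirements
    execution_queue="default",
)
def featurize(csv_path: str):
    import pandas as pd  # imported inside so the agent environment resolves it
    return pd.read_csv(csv_path).shape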

2 years ago
0 Hey Currently Trying To Run A Pipeline Locally To Test A Pipeline Component With

So it seems to be an issue with the component parameter called in:
@PipelineDecorator.pipeline(
    name="VINZ Auto-Retrain",
    project="VINZ",
    version="0.0.1",
    pipeline_execution_queue="Quad_VCPU_16GB"
)
def executing_pipeline(start_date, end_date):
    print("Starting VINZ Auto-Retrain pipeline...")
    print(f"Start date: {start_date}")
    print(f"End date: {end_date}")

    window_dataset_id = generate_dataset(start_date, end_date)

if __name__ == '__main__':
    PipelineDec...

2 years ago
0 <no title>

Did you properly install Docker and the NVIDIA Docker toolkit? Here's the init script I'm using on my autoscaled workers:

#!/bin/sh

# Refresh package lists and install the prerequisites for adding an apt repo
sudo apt-get update -y
sudo apt-get install -y \
    ca-certificates \
    curl \
    gnupg \
    lsb-release

# Add Docker's GPG key to the apt keyring
sudo mkdir -p /etc/apt/keyrings
curl -fsSL  | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg

# Register the Docker apt repository for this distribution
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg]  \
  $(lsb_release -cs) stable" | s...
one year ago
0 Hey, Trying To Figure Out How To Create An

Well I uploaded datasets in the previous steps with the same credentials

2 years ago
0 Hey, Is There A Way To Set Pipeline Component Return Artifact Compression At A Pipeline Level ? It Would Allow To Make Big Dataframes Flow Across Component Without Having To Resort To Define Temporary Datasets, Currently It'S Generating Only Raw Pickles.

Thanks @<1523701435869433856:profile|SmugDolphin23>, though are you sure I don't need to override the deserialization function even if I pass multiple distinct objects as a tuple?

one year ago
0 Hey, Trying To Figure Out How To Create An

Oh okay, my initial implementation was not far off:
task = Task.init(project_name='VINZ', task_name=f'VINZ Retraining {datetime.now().strftime("%Y-%m-%d %H:%M:%S")}')
task.set_progress(0)

print("Training model...")
os.system(train_cmd)
print("✔️ Model trained!")

task.set_progress(75)

print("Converting model to ONNX...")
os.system(f"python export.py --weights {os.path.join(training_data_path, 'runs', 'train', 'yolov5s6_results', 'weights', 'best.pt')} --img...
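
For completeness, a hedged sketch of how the exported weights could then be attached to the same task as an OutputModel; the model name and ONNX path are placeholders:

# Hedged sketch (placeholder name and path): register the exported ONNX file
# as an output model on the task initialized above.
from clearml import OutputModel, Task

task = Task.current_task()  # the task created by Task.init() above
output_model = OutputModel(task=task, name="vinz-yolov5s6", framework="ONNX")
output_model.update_weights(weights_filename="best.onnx")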
2 years ago
0 I'Ve Been Using Clearml On On-Premise Machines And Would Now Like To Deploy Everything In Gcp (Deploy Everything New From Scratch, Don'T Care About Migrating The Data). I'Ve Followed Tutorial

It seems like it, since it's impossible to access an IP directly over HTTPS without a domain name and certificate; it will solve this immediate problem at least

one year ago