FierceHamster54
Moderator
29 Questions, 178 Answers
Active since 10 January 2023
Last activity 10 months ago

Reputation: 0
Badges (1): 131 × Eureka!
0 Votes 2 Answers 960 Views
Hey, it is said in the pipeline decorator example that requirements for executing a pipeline component are inferred from the imports inside the component func...
2 years ago
0 Votes 7 Answers 1K Views
Hey, is there a way to pass docker args for the execution of a pipeline controller defined through decorator @PipelineDecorator.pipeline the same way we can ...
2 years ago
0 Votes 8 Answers 952 Views
2 years ago
0 Votes 13 Answers 1K Views
2 years ago
0 Votes 8 Answers 1K Views
Hey, is there a shortcut on the Dataset SDK to directly get the latest version of a dataset?
2 years ago
0 Votes 4 Answers 938 Views
one year ago
0 Votes 9 Answers 977 Views
Hey, just wanting to know: what is the recommended best practice for writing ClearML Pipelines, between controller and decorators?
2 years ago
0 Votes 13 Answers 1K Views
Hey all, hope you're having a great day; having an unexpected behavior with a training task of a YOLOv5 model in my pipeline, I specified a task in my traini...
one year ago
0 Votes 5 Answers 995 Views
Hey, would it be possible to add a way to edit autoscaler configs without having to clone them? This is really frustrating, especially when you've reached the q...
2 years ago
0 Votes 4 Answers 940 Views
2 years ago
0 Votes 12 Answers 941 Views
Hey, has anyone managed to capture Darts logging with ClearML when using the temporal fusion transformers? Even when overriding their trainer with a custom P...
one year ago
0 Votes 25 Answers 995 Views
Hey, trying to figure out how to create an output model ( https://clear.ml/docs/latest/docs/clearml_sdk/model_sdk#output-models ); the doc says it needs a TaskId but my trai...
2 years ago
0 Votes 15 Answers 1K Views
Hey having an issue passing parameters to a component in a pipeline, the parameters appear to be None inside the component function: @PipelineDecorator.compo...
2 years ago
0 Votes 20 Answers 963 Views
Hey currently trying to run a pipeline locally to test a pipeline component with PipelineDecorator.run_locally() , first try returned a random pandas error, ...
2 years ago
0 Votes 6 Answers 940 Views
Hey guys, I hope you're all having a nice day. I had to use the Task method .setup_aws_upload(bucket=..., region=...) to overcome an incorrect region specified for...
one year ago
0 Votes 4 Answers 1K Views
2 years ago
0 Votes 15 Answers 1K Views
Heya, trying to set up a GCP autoscaler for general-purpose CPU instances ( e2-standard-4 ) but I run into this error: googleapiclient.errors.HttpError: And I...
2 years ago
0 Votes 1 Answer 930 Views
Hey, is ClearML MLflow-based? Is it exposed to CVE-2023-1176 and CVE-2023-1177?
one year ago
0 Votes 8 Answers 572 Views
Hey everyone, I am having some difficulties passing environment variables to my pipeline components running on agents ( 1.6.1 ) without docker mode: - I set...
11 months ago
0 Votes 1 Answer 1K Views
2 years ago
0 Votes 6 Answers 637 Views
10 months ago
0 Votes 7 Answers 986 Views
Hey, I hope everyone is having a good day. Two quick questions about datasets: Does squashing two datasets delete the two original datasets? Is it possible ...
2 years ago
0 Votes 15 Answers 1K Views
Heya, I hope you're all well on this beautiful day; my GCP Autoscaler just died with that strange but short backtrace, wondered if it rang a bell to any of y...
2 years ago
0 Votes 9 Answers 1K Views
Hello everyone! I set up a GCP autoscaler on my Pro SaaS deployment but I keep getting this error: clearml_agent: ERROR: Server does not support --use-owner-t...
2 years ago
0 Votes 10 Answers 1K Views
2 years ago
0 Votes 1 Answer 952 Views
Hey, is .get_local_copy() thread-safe? I mean, can I concurrently download several datasets without breaking the cache and StorageManager?
2 years ago
0 Votes 7 Answers 708 Views
Hey everyone, as a Pro-tier SaaS user I'm experiencing very high latency when finalizing a dataset; it is attached in a big dataset version hierarchy and ...
11 months ago
0 Votes 1 Answer 1K Views
Heya, good day to everyone, I'm finding myself facing this random error with a very opaque backtrace when attempting to squash two distinct versions of the s...
2 years ago
0 Votes 7 Answers 972 Views
Heya, is there any plan for ClearML to leverage the new Multi-Instance GPU ( https://developer.nvidia.com/blog/getting-the-most-out-of-the-a100-gpu-with-multi-instance-gpu/ ) tech ...
2 years ago
Hey, Is There A Way To Pass Docker Args For The Execution Of A Pipeline Controller Defined Through Decorator

Talking about that decorator, which should also have a docker_args param since it is executed as an "orchestration component", but the param is missing: https://clear.ml/docs/latest/docs/references/sdk/automation_controller_pipelinecontroller/#pipelinedecoratorpipeline
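
For reference, a minimal sketch of the component-level workaround, assuming the docker_args parameter of PipelineDecorator.component; the image, env var, and args below are illustrative, not from the thread:

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    return_values=['value'],
    docker='python:3.10-slim',                    # illustrative image
    docker_args='--env MY_VAR=1 --shm-size=8g',   # forwarded to docker run
)
def my_component():
    import os
    # The env var set through docker_args is visible inside the container
    return os.environ.get('MY_VAR')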

2 years ago
Heya, Trying To Set Up A GCP Autoscaler For General-Purpose CPU Instances (

This is funny because the auto-scaler on GPU instances is working fine, but as the backtrace suggests, it seems to be linked to this instance family

2 years ago
Hey, Is There A Way To Set Pipeline Component Return Artifact Compression At A Pipeline Level? It Would Allow Big Dataframes To Flow Across Components Without Having To Resort To Defining Temporary Datasets; Currently It's Generating Only Raw Pickles.

Thanks @<1523701435869433856:profile|SmugDolphin23>, though are you sure I don't need to override the deserialization function even if I pass multiple distinct objects as a tuple?
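
For reference, a minimal sketch of pipeline-level artifact (de)serialization, assuming the artifact_serialization_function and artifact_deserialization_function parameters of PipelineDecorator.pipeline in recent SDK versions; gzip is an illustrative compression choice:

import gzip
import pickle

from clearml.automation.controller import PipelineDecorator

def serialize(obj):
    # Compress pickled artifacts instead of shipping raw pickles
    return gzip.compress(pickle.dumps(obj))

def deserialize(blob):
    # Symmetric deserializer; a tuple of multiple objects pickle
    # round-trips as a whole, so no per-object handling is needed
    return pickle.loads(gzip.decompress(blob))

@PipelineDecorator.pipeline(
    name='example-pipeline', project='example', version='0.0.1',
    artifact_serialization_function=serialize,
    artifact_deserialization_function=deserialize,
)
def my_pipeline():
    ...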

10 months ago
<no title>

Did you properly install Docker and the NVIDIA Docker toolkit? Here's the init script I'm using on my autoscaled workers:

#!/bin/sh

sudo apt-get update -y

sudo apt-get install -y \
    ca-certificates \
    curl \
    gnupg \
    lsb-release

sudo mkdir -p /etc/apt/keyrings

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg

echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | s...
one year ago
[Datasets] Is It Possible To Get An Individual File From A Dataset? Example Would Be Accessing Only A Single Feature From A Feature Store Dataset When It Could Be Costly To Download The Entire Dataset

I suppose your worker is not persistent, so I might suggest having a very cheap instance as a persistent worker where you have your dataset persistently synced using sync_folder ( https://clear.ml/docs/latest/docs/references/sdk/dataset/#sync_folder ), then taking the subset of files that interests you and pushing it as a different dataset, marking it as a subset of your main dataset id using a tag
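
A rough sketch of that flow using get_local_copy() for the local copy; the project, dataset names, paths, and tag are hypothetical:

import os

from clearml import Dataset

# Grab a cached local copy of the full feature-store dataset
full = Dataset.get(dataset_project='my_project', dataset_name='feature_store')
local_root = full.get_local_copy()

# Push only the files of interest as a new dataset, tagged as a subset
subset = Dataset.create(
    dataset_project='my_project',
    dataset_name='feature_store_subset',
    parent_datasets=[full.id],
    dataset_tags=['subset-of:' + full.id],
)
subset.add_files(os.path.join(local_root, 'feature_x'))
subset.upload()
subset.finalize()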

one year ago
Heya, Trying To Set Up A GCP Autoscaler For General-Purpose CPU Instances (

AnxiousSeal95 Okay, it seems to work with a compute-optimized c2-standard-4 instance

2 years ago
Hey, Is There A Way To Pass Docker Args For The Execution Of A Pipeline Controller Defined Through Decorator

Well, I simply duplicated code across my components instead of centralizing the operations that needed that env variable in the controller

2 years ago
Hi, First Time Here

Nope, but it can manage your cloud VMs for you

2 years ago
Hi Folks! Is There A Way To Programmatically Create Users And Groups On A Server? If So, Can Someone Point Me To The Docs? Thanks

It doesn't seem so if you look at the REST API documentation; it might be available as an Enterprise plan feature

9 months ago
Heya, I Hope You're All Well On This Beautiful Day, My GCP Autoscaler Just Died With That Strange But Short Backtrace, Wondered If It Rang A Bell To Any Of You?

This is an instance that I launched like last week and was running fine until now; the version is v1.6.0-335

2 years ago
Hey, Don't Really Understand Why The ClearML Worker Needs To Pull The Repository Where My Pipeline (Defined With Decorators) Is Written, Since Apparently A Temporary Python File (Containing At Least The Code And Imports For The Executed Component) Seems

Well, aside from the obvious removal of the line PipelineDecorator.run_locally() on both our sides, the decorator arguments seem to be the same:

@PipelineDecorator.component(
    return_values=['dataset_id'],
    cache=True,
    task_type=TaskTypes.data_processing,
    execution_queue='Quad_VCPU_16GB',
    repo=False
)

And my pipeline controller:

@PipelineDecorator.pipeline(
    name="VINZ Auto-Retrain",
    project="VINZ",
    version="0.0.1",
    pipeline_execution_queue="Quad_V...

2 years ago
Hey Everyone, I Am Having Some Difficulties Passing Environment Variables To My Pipeline Components Running On Agents (

Oh, it's a little strange that the comment lines about it were in the agent section

11 months ago
Hi, Is There Any Way To Upload Data To A ClearML Dataset Without Compression At All? I Have Very Small Text Files That Make Up A Dataset And Compression Seems To Take Most Of The Upload Time And It Provides Almost No Benefits W.R.T. Size

The default compression parameter value is ZIP_MINIMAL_COMPRESSION; I guess you could try to check if there is a tarball-only option, but anyway, most of the CPU time taken by the upload process is the generation of the hashes of the file entries
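
For illustration, a minimal sketch that skips deflate entirely, assuming upload() accepts a zipfile compression constant (as the ZIP_* default suggests); the names and path are hypothetical:

import zipfile

from clearml import Dataset

ds = Dataset.create(dataset_project='my_project', dataset_name='small_text_files')
ds.add_files('/path/to/text/files')
# ZIP_STORED archives the files without compressing them
ds.upload(compression=zipfile.ZIP_STORED)
ds.finalize()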

2 years ago
Hey, About The Dependency Propagation Of Pipeline Components, If I Call A Vanilla Python Function From A Component, Are The Dependencies Specified In The Internal Imports Propagated To This Function Call Too? And Additionally, If That Function Is In Anoth

Well, given a file architecture looking like this:
|_ __init__.py
|_ my_pipeline.py
|_ my_utils.py
With the content of my_pipeline.py being:
from clearml.automation.controller import PipelineDecorator
from clearml import Task, TaskTypes

from my_utils import do_thing

Task.force_store_standalone_script()

@PipelineDecorator.component(...)
def my_component(dataset_id: str):
    import pandas as pd
    from clearml import Dataset

    dataset = Dataset.get(dataset_id=input_dataset_id...
2 years ago
Hey, About The Dependency Propagation Of Pipeline Components, If I Call A Vanilla Python Function From A Component, Are The Dependencies Specified In The Internal Imports Propagated To This Function Call Too? And Additionally, If That Function Is In Anoth

Okay, looks like the call dependency resolver does not support cross-file calls and relies instead on the local repo cloning feature to handle multiple files, so Task.force_store_standalone_script() does not allow for a pipeline defined across multiple files (now that you think of it, it was kinda implied by the name). But what is interesting is that calling an auxiliary function in the SAME file from a component also raises a NameError: <function_name> is not defined , that's ki...
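
A minimal sketch of a workaround, assuming the helper_functions parameter of PipelineDecorator.component, which packs the listed auxiliary functions into the component's standalone script; the helper below is illustrative:

from clearml.automation.controller import PipelineDecorator

def do_thing(df):
    # Module-level helper that the component wants to call
    return df.dropna()

@PipelineDecorator.component(
    return_values=['n_rows'],
    helper_functions=[do_thing],  # ship the helper with the standalone script
)
def my_component():
    import pandas as pd
    df = pd.DataFrame({'a': [1.0, None]})
    return len(do_thing(df))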

2 years ago
Hey, About The Dependency Propagation Of Pipeline Components, If I Call A Vanilla Python Function From A Component, Are The Dependencies Specified In The Internal Imports Propagated To This Function Call Too? And Additionally, If That Function Is In Anoth

Well, it is also failing within the same file if you read until the end, but as for the cross-file issue, it's mostly because my repo architecture is organized in a v1/v2 scheme and I didn't want to pull a lot of unused files and inject GitHub PATs that frankly lack granularity into the worker

2 years ago
Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

I would try not to run it locally but in your execution queues on a remote worker; if that doesn't fix it, it is likely a bug
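
For reference, a minimal sketch of launching remotely instead of locally; the queue name and pipeline function are hypothetical:

from clearml.automation.controller import PipelineDecorator

# Instead of PipelineDecorator.run_locally(), route components to a queue
PipelineDecorator.set_default_execution_queue('default')

if __name__ == '__main__':
    my_pipeline()  # the decorated pipeline function defined elsewhere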

one year ago