RoughTiger69
Moderator
28 Questions, 101 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0
Badges: 1
90 × Eureka!
0 Votes 8 Answers 2K Views
Hi, I am using PipelineDecorator to create tasks. is there a way to force it to use the entire git repo it is created from on the pythonpath? vs. just the de...
3 years ago
0 Votes 11 Answers 2K Views
I have a local folder a, and a dataset B. a: a a/.DS_Store a/1.txt a/b a/b/.DS_Store a/b/1.txt a/b/c a/b/c/1.txtDataset B: b b/2.txt b/c b/c/2.txtI want to “...
3 years ago
0 Votes 18 Answers 2K Views
Is there a case-study or ref. architecture for interacting with CI/CD i.e. exposing mature pipelines to be triggered upon code pushes (taking latest git hash...
4 years ago
0 Votes 5 Answers 2K Views
hi, I am running a pipeline from decorators. the pipeline runs fine. Then I try to clone it by clicking the (successful) run and launching. The pipeline fail...
3 years ago
0 Votes 0 Answers 2K Views
autoscaler 101 questions: What is the best practice for managing credentials so that they don’t get saved in clearml webapp? When the https://clear.ml/docs/l...
3 years ago
0 Votes 5 Answers 2K Views
Hi, I am trying to use the aws autoscaler to assign instance profiles to new machines. This is a better way than managing credentials. I added the configurat...
3 years ago
0 Votes 10 Answers 2K Views
3 years ago
0 Votes 6 Answers 2K Views
autoscaler from saas (pro version). I attempted to use the autoscaler “application” from clearml UI. here is what I get in the logs of the autoscaler screen ...
3 years ago
0 Votes 12 Answers 1K Views
Is there a reference implmentation for a task in a pipeline that awaits user input?
4 years ago
0 Votes 7 Answers 2K Views
3 years ago
0 Votes 14 Answers 2K Views
Two simple lineage related questions: Task B is a clone of Taks A. Does B store the information that it was cloned from A somewhere? Training task X loads Da...
4 years ago
0 Votes 4 Answers 2K Views
hi, I created a dataset with 20K files, total of 20GB, with storage pointing to S3. When I upload (or close) the dataset, during the compression phase, the c...
3 years ago
0 Votes 10 Answers 2K Views
4 years ago
0 Votes 5 Answers 2K Views
Avoiding http://Clear.ml glue code spaghetti - community best practices? Say I have training pipeline : Task 1 - data preprocessing -> create a dataset artif...
4 years ago
0 Votes 10 Answers 2K Views
hi folks, is there a way to force clear-ml agent with --docker to not create a virtualenv at all? And perhaps not even attempt to install requirements even? ...
3 years ago
0 Votes 3 Answers 2K Views
3 years ago
0 Votes 8 Answers 2K Views
3 years ago
0 Votes 3 Answers 2K Views
what’s a good ami to use for the clearml autoscaler on AWS? the defaults offered confidently by the various auto scaler installers don’t seem to exist…| e.g....
3 years ago
0 Votes 9 Answers 2K Views
3 years ago
0 Votes 0 Answers 2K Views
Hi, I am catching up with http://clear.ml for stuff beyond exp. tracking, and have a few questions. Will ask them separately to allow threading:
4 years ago
0 Votes 6 Answers 2K Views
Regarding the “classic” datasets (not hyper datasets): Is there an option to do something equivalent to dvc’s “ https://dvc.org/doc/user-guide/managing-exter...
3 years ago
0 Votes 0 Answers 2K Views
Did more digging, seems that you need to start the agent with CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=1
3 years ago
0 Votes 3 Answers 2K Views
FYI I am getting a lot of read timeouts from the community server: Retrying (Retry(total=235, connect=240, read=235, redirect=240, status=240)) after connect...
3 years ago
0 Votes 9 Answers 2K Views
3 years ago
0 Votes 5 Answers 2K Views
I have a logical task that I want to split to multiple workers. The task involves processing media files (not training). The optimal design for me would be: ...
3 years ago
0 Votes 4 Answers 2K Views
hi, When running a training script from pycharm, it seems that clearml logs only those packages that are explicitly imported by my .py files; it seems to not...
4 years ago
0 Votes 14 Answers 2K Views
question about pipeline and long-waiting tasks: Say I want to generate a dataset. The workflow I have requires query to a DB Creating a labeling assigment in...
3 years ago
0 Votes 3 Answers 2K Views
3 years ago
4 years ago
0 Cannot Upload A Dataset With A Parent - Seems Very Odd! Clearml Versions I Tried: 1.6.1, 1.6.2 Scenario: * Create Parent Dataset (With Storage On S3) * Upload Data * Close Dataset * Create Child Dataset (Tried With Storage On Both S3 Or On Clearml Serv

I tested it again with much smaller data and it seems to work.
I am not sure what the difference between the use-cases is. It seems like something specifically about the particular (big) parent doesn't agree with clearml…

3 years ago
0 Two Simple Lineage Related Questions:

I think that in principle, if you "intercept" the calls to Model.get() or Dataset.get() from within a task, you can collect the IDs and do various stuff with them. You can store and visualize them for lineage, or expose them as another hyperparameter, I suppose.

You’ll just need the user to name them as part of loading them in the code (in case they are loading multiple datasets/models).
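A minimal sketch of this interception idea, assuming the standard clearml calls Dataset.get, Task.current_task and Task.connect_configuration; the wrapper name and the way the collected IDs are reported are illustrative choices, not something clearml provides out of the box:

from clearml import Task, Dataset

_lineage = {}  # user-supplied name -> dataset id

def get_dataset(name_in_code, **kwargs):
    # Wrap Dataset.get() so every dataset a task loads is recorded.
    ds = Dataset.get(**kwargs)
    _lineage[name_in_code] = ds.id
    task = Task.current_task()
    if task is not None:
        # store the collected IDs on the task so they are visible/queryable later
        task.connect_configuration(_lineage, name="dataset_lineage")
    return ds

# usage inside a task, with hypothetical project/dataset names:
# train_ds = get_dataset("train", dataset_project="my_project", dataset_name="train_v2")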

4 years ago
4 years ago
0 Autoscaler From Saas (Pro Version). I Attempted To Use The Autoscaler “Application” From Clearml Ui. Here Is What I Get In The Logs Of The Autoscaler Screen Itself (Consistent):

CostlyOstrich36 from what I gather the UI creates a task in the background, in status “hidden”, and it has like 10 fields of json configurations…

3 years ago
0 Two Simple Lineage Related Questions:

Re. "which task did I clone from": to my understanding, the "parent" field is used for the "runtime parent", i.e. the task that started me.
This is not the same as "which task was I cloned from".

4 years ago
4 years ago
0 Is It Possible To Use In Clearml

As far as I know, storage can be https://clear.ml/docs/latest/docs/integrations/storage/#direct-access .
A typical EBS volume is limited to being mounted to one machine at a time, so in this sense it won't be too easy to create a solution where multiple machines consume datasets from this storage type.

PS: multi-attach ( https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volumes-multi.html ) is possible under some limitations.

3 years ago
0 What’S A Good Ami To Use For The Clearml Autoscaler On Aws? The Defaults Offered Confidently By The Various Auto Scaler Installers Don’T Seem To Exist…| E.G.

AgitatedDove14 thanks, it was late and I wasn't sure if I needed to use one of the clearml "certified" AMIs or just a vanilla one.

3 years ago
0 Hi, I Am Trying To Use The Aws Autoscaler To Assign Instance Profiles To New Machines. This Is A Better Way Than Managing Credentials. I Added The Configuration To The Autoscaler Config Like So:

Trust me, I had to add this field to this default dict just so that clearml doesn't delete it for me.
It does appear on the task in the UI; somehow it is just not repopulated in the remote run if it's not part of the default empty dict…

3 years ago
0 Question About Pipeline And Long-Waiting Tasks: Say I Want To Generate A Dataset. The Workflow I Have Requires

AgitatedDove14 thanks, good idea.

My main issue with this approach is that it breaks the workflow into an "async" set of tasks:

One task sends a list of images for labeling and terminates; an external webhook calls http://clear.ml and creates a dataset from the labels returned from the labeling task; a trigger wakes up the label post-processing/splitting logic.
It will be hard to understand where things stand from looking at the UI.

I was wondering if the “waiting” operator can actua...
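For the "a trigger wakes up the label post-processing" step, a hedged sketch using clearml's TriggerScheduler (clearml.automation) could look like the following; the project name is a placeholder and the exact argument names may differ between clearml versions:

from clearml.automation import TriggerScheduler

def on_labeled_dataset(dataset_id):
    # placeholder: kick off the label post-processing / splitting logic here
    print(f"new labeled dataset ready: {dataset_id}")

scheduler = TriggerScheduler(pooling_frequency_minutes=3)
scheduler.add_dataset_trigger(
    name="labels-ready",
    schedule_function=on_labeled_dataset,  # called with the new dataset id (assumed behaviour)
    trigger_project="labeling/returned",   # placeholder project to watch
)
scheduler.start()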

3 years ago
0 I Have A Pipeline With Tasks A->B->C. I Want To Be Able To Trigger It Manually, And Skip A Regardless Of It’S Cache Status. I Want To Pass B Value That Represents A’S Output If Needed. What’S A Good Way To Achieve This (Can Be Ui-Based, Or Pipeline-Gymnas

AgitatedDove14
Sort of.
I would go with something which is more like:
execution_plan = {'step_b': 'b_result', 'step_c': None, ...}

@PipelineDecorator.pipeline(...)
def pipeline(execution_plan):
    step_results = {}
    for step in pipeline.get_dag():
        if step.name in execution_plan:
            step_results[step.name] = execution_plan[step.name] or step(**step_results)

The 'execution plan' specifies the list of steps to run (keys) and, for each, whether we should use a u...
3 years ago
0 Hi, I Am Using Pipelinedecorator To Create Tasks. Is There A Way To Force It To Use The Entire Git Repo It Is Created From On The Pythonpath? Vs. Just The Decorated Function And Perhaps The Helper_Function=[Some_Function]?

sure CostlyOstrich36
I have something like the following:

@PipelineDecorator.component(...)
def my_task(...):
    from my_module1 import my_func1
    from my_module2 import ...

my_module1 and my_module2 are modules that are part of the same project source; they don't come as a separate package.

Now when I run this in clearml, these imports don’t work.

These functions may require transitive imports of course, so the following doesn’t work:
PipelineDecorator.component(helper_function=[my_fu...
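For reference, newer clearml versions appear to accept a repo argument on PipelineDecorator.component, which attaches a whole repository to the step so neighbouring project modules become importable; treat the argument names below as assumptions to verify against the installed clearml version, and the branch/queue values as placeholders:

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    repo=".",                   # assumed: attach the current git repo to the step
    repo_branch="main",         # placeholder branch
    execution_queue="default",  # placeholder queue
)
def my_task():
    from my_module1 import my_func1  # resolved from the attached repo
    return my_func1()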

3 years ago
0 Question About Pipeline And Long-Waiting Tasks: Say I Want To Generate A Dataset. The Workflow I Have Requires

AgitatedDove14 1.1.5.
Yes - first locally, then it aborts (while running locally presumably).
Then I re-enqueue it via the UI and it seems to run on the agent.

3 years ago
0 Question About Pipeline And Long-Waiting Tasks: Say I Want To Generate A Dataset. The Workflow I Have Requires

AgitatedDove14

What was important for me was that the user can define the entire workflow and that I can see its status as one ‘pipeline’ in the UI (vs. disparate tasks).

Perform query; process records into a labeling assignment; call the labeling system API; wait for an external hook when labels are ready; clean the labels; upload them to a dataset.
Do you know which specific API I need to call to signal "resume" after "abort"?
Not "reset", I presume?

3 years ago
0 Two Simple Lineage Related Questions:

Sure, but I was wondering if it has more of a "first-class citizen" status for tracking… e.g. something you can visualize in the UI or query via the API.

4 years ago
0 Hi, I Am Using Pipelinedecorator To Create Tasks. Is There A Way To Force It To Use The Entire Git Repo It Is Created From On The Pythonpath? Vs. Just The Decorated Function And Perhaps The Helper_Function=[Some_Function]?

AgitatedDove14 the emphasis is that the imports I am doing are not from external/pip packages; they are just modules neighbouring the function I am importing. Imports that rely on pip-installed packages work well.

3 years ago
0 2. Is There A Case-Study Or Ref. Architecture For Interacting With Ci/Cd I.E. Exposing Mature Pipelines To Be Triggered Upon Code Pushes (Taking Latest Git Hash) Or With Manual Ci Triggers?

IrritableGiraffe81 AgitatedDove14 there are multiple levels of what the CI/CD should automate/validate.
This one is the minimal option.
Another option is:
1. CI deploys (executes) the pipeline fresh, from the committed code.
2. CI waits and extracts the results (various artifacts, metrics, etc.).
3. CI compares them to the latest (published) pipeline or to absolute numbers.
4. CI decides whether to publish it or not (or at least tag it as RC).
Steps 2-4 can themselves be encapsulated in a clearml task ...
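A rough, hedged sketch of what steps 1-2 could look like from a CI job, using standard clearml Task calls; the template task id, queue name, and metric names are placeholders rather than anything from this thread:

from clearml import Task

template = Task.get_task(task_id="<pipeline-controller-template-id>")  # placeholder id
run = Task.clone(source_task=template, name="ci-run")                  # step 1: fresh copy of the committed pipeline
Task.enqueue(run, queue_name="services")                               # placeholder queue

run.wait_for_status()                      # step 2: block until the run finishes (raises if it failed)
metrics = run.get_last_scalar_metrics()    # e.g. metrics["eval"]["accuracy"]["last"]
# steps 3-4 (compare to the published baseline, then publish / tag as RC) would go here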

3 years ago
0 I Have A Local Folder A, And A Dataset B. A:

If the state is:

a:
a  a/.DS_Store  a/1.txt  a/b  a/b/.DS_Store  a/b/1.txt  a/b/c  a/b/c/1.txt

Dataset B:
b  b/2.txt  b/c  b/c/2.txt

then the command mv b a/ returns an error since a/ is not empty.
That’s exactly the issue…

As a result, I need to do something which copies the files (e.g. cp -r or StorageManager.upload_folder('b', 'a')), but this is expensive.

3 years ago