PanickyMoth78
Moderator
34 Questions, 167 Answers
  Active since 10 January 2023
  Last activity 4 months ago

Reputation: 0

Badges: 1 (166 × Eureka!)
0 Votes 8 Answers 1K Views
2 years ago
0 Votes 7 Answers 1K Views
Hi I'm looking into how clearml supports datasets and dataset versioning and I'm a bit confused. Is dataset versioning not supported at all in the non-enterp...
2 years ago
0 Votes 2 Answers 1K Views
I am using the AWS autoscaler and I wish to set my files server to be gs. I tried to do so by having this in the ADDITIONAL CLEARML CONFIGURATION window: api...
2 years ago
0 Votes 20 Answers 1K Views
task stuck at task.flush(wait_for_uploads=True): I've been running a model training task - a variation on this clearml dataset example: https://github.com/...
one year ago
0 Votes 22 Answers 1K Views
Hi. I'm encountering a problem with model.name, at least for models that were auto-magically uploaded. I see it in my own code but you can see it if you run...
one year ago
0 Votes 7 Answers 979 Views
Hi. I am experimenting with clearml.Dataset and encountering an error. LockException: [Errno 11] Resource temporarily unavailable In my experiment, I make a ...
2 years ago
0 Votes 2 Answers 985 Views
Hi. I'm using @PipelineDecorator.component to define a task from a function (to run in a pipeline) I'd like to get the task object within this function so th...
2 years ago
0 Votes 9 Answers 1K Views
Hi. Help 🥺 I have a clearml.Dataset which I can't get
2 years ago
0 Votes 8 Answers 1K Views
2 years ago
0 Votes 7 Answers 955 Views
I have 5 unarchived pipeline runs that were defined with this decorator: @PipelineDecorator.pipeline( name="fastai_image_classification_pipeline", project="l...
2 years ago
0 Votes 14 Answers 1K Views
Hi there. I'm trying to switch pipeline code from a local run using PipelineDecorator.run_locally() to a slightly-less-local run using PipelineDecorator.set_d...
2 years ago
0 Votes 7 Answers 990 Views
Hi. I have a problem accessing repo code in pipeline components running in an AWS autoscaler (first attempts at doing this) My local clearml.conf file has ag...
2 years ago
0 Votes 6 Answers 1K Views
Is there some built-in way in clearml to trigger further action on task fail (or pipeline fail)?
2 years ago
0 Votes 3 Answers 1K Views
2 years ago
0 Votes 4 Answers 432 Views
Hi. I'm using clearml agent 1.16.1 My code is running a multi-process pool with "spawn" (see here for why) from multiprocessing import get_context ... with g...
4 months ago
0 Votes 30 Answers 1K Views
Hi. I'd like to try the GCP autoscaler. What permissions does the service account that I provide to clearml need? (and what GCP API should I enable in the GC...
2 years ago
0 Votes 2 Answers 1K Views
I have a training task that auto-magically saves a model for me to GCS task = Task.init( project_name=project_name, task_name=f"Image classification training...
one year ago
0 Votes 16 Answers 1K Views
Hi. Question about Dataset upload errors: When uploading a clearml.Dataset created with output_uri="gs://lavi_test/datasets" after adding 20 files of size 50...
gcp
2 years ago
0 Votes 1 Answer 1K Views
suppose I use a pipeline decorator to define a pipeline: @PipelineDecorator.pipeline(name='my-pipeline', project='my-project', version='0.2') def my_pipeline...
2 years ago
0 Votes 11 Answers 1K Views
Hi. I have a few questions about the snippet attached re-running this code produces the same printouts... I chose 47 out of 100 in the pipeline ... I chose 8...
2 years ago
0 Votes 2 Answers 990 Views
Hi. I've noticed that my clearml.conf has both: agent.git_user="" agent.git_pass=""and agent { ... git_user: "" git_pass: "" ... }What's the difference? Shou...
2 years ago
0 Votes 27 Answers 1K Views
Hi. I'm running this little pipeline: from clearml.automation.controller import PipelineDecorator from clearml import TaskTypes @PipelineDecorator.component(...
2 years ago
0 Votes 1 Answer 1K Views
2 years ago
0 Votes 3 Answers 1K Views
Hi. Should this command succeed in the presence of project lavi-testing and absence of dataset tmp_dataset within it? from clearml import Dataset tmp_dataset ...
2 years ago
0 Votes 8 Answers 1K Views
Hi (again... sorry for asking so many questions) Question about using google cloud storage in a clearml agent running in AWS ec2 instance. my clearml.conf ha...
2 years ago
0 Votes 9 Answers 1K Views
Hi. I have a question about pipelines and their generated dependency graphs. I took the code of the clearml pipeline from decorator example: https://github.c...
2 years ago
0 Votes 3 Answers 1K Views
Hi. First time user here 👋 I have experienced a problem following the getting started documentation. I opened an account on https://app.clear.ml/ I then fol...
2 years ago
0 Votes 14 Answers 1K Views
Hi. I have a job that processes images and creates ~5 GB of processed image files (lots of small ones). At the end - it creates a clearml.Dataset and perform...
one year ago
0 Votes 3 Answers 939 Views
2 years ago
0 Votes 14 Answers 984 Views
Bug? dataset name is ignored if use_current_task=True
one year ago
0 Task Stuck At

no retry messages
CLEARML_FILES_HOST is gs
CLEARML_API_HOST is a self-hosted clearml server (in Google Compute Engine).

Note that earlier in the process the code uploads a dataset just fine

one year ago
0 Hi. Help

It seems to be doing ok on the app side:
I didn't realise Datasets had tasks associated with them but there is one and it seems to be doing ok.
I've attached its log file, which only mentions skipping one file (a warning).

2 years ago
0 Hi. I'M Encountering A Problem With

BTW:

If I try to find the right model in the task.models["output"] (this time there is just one but in my code there may be several), it appears with the … (see other attached screenshot).

What would make sense here? (I have to be honest, I'm not sure.)

If the model was saved with a file name (is that the trigger for auto-upload?), I think it makes sense for the model name to match the file name (not the task name), especially when there may be ...

one year ago
0 Hi (Again... Sorry For Asking So Many Questions) Question About Using Google Cloud Storage In A Clearml Agent Running In Aws Ec2 Instance. My

My local environment has clearml version 1.6.3rc0, and agents in AWS were started with the AWS Autoscaler, which has no explicit place for Google credentials.

I see a place for Additional ClearML Configuration in the AWS autoscaler UI which I suspect may help but I don't see how I can pass a secrets file along with my agent.

2 years ago
0 Hi (Again... Sorry For Asking So Many Questions) Question About Using Google Cloud Storage In A Clearml Agent Running In Aws Ec2 Instance. My

I now get this error:
2022-07-18 21:51:29,168 - clearml.storage - ERROR - Failed creating storage object Reason: [Errno 2] No such file or directory: '~/gs.cred'
To be clear, I replaced <this is your GCP storage credentials file> with the contents of that file, escaping every " with a \" and removing newlines.

2 years ago
0 Another Question On The Topic Of How A Remote Execution Of A Pipeline Kills The Calling Process (Previously Discussed

On the same topic: what if (I were able to iterate and) I wanted the pipeline calls to be blocking, so that the next pipeline executes only after the previous one completes?

2 years ago
0 Is There Some Built-In Way In Clearml To Trigger Further Action On Task Fail (Or Pipeline Fail)?

Yes.
Some mechanism that would allow for follow-up code execution, ideally in a way that would not be susceptible to the same things that may cause a task to fail.

2 years ago
0 Autoscaler Parallelization Issue: I Have An Aws Autoscaler Set Up With A Resource That Has A Max Of 3 Instances Assigned To The

sys.path.insert(0, "/src/clearml_evaluation/") is actually left-over code from when I was making things run locally (perhaps prior to connecting to the GitHub repo), but I think that adding a non-existent path to the system path would be benign.

2 years ago
0 Hi There. I'M Trying To Switch Pipeline Code From A Local Run Using

Actually, re-running pipeline_from_decorator.py a second time (and a third time) from the command line seems to have executed without that ValueError, so maybe that issue was a fluke.
Nevertheless, those runs exit prior to the line print('process completed'), and I would definitely prefer the command executing_pipeline to not kill the process that called it.
For example, maybe, having started the pipeline, I'd like my code to also report having started the pipeline to som...

2 years ago
0 Hi. I'M Using

For a component, task = Task.current_task() will get me the task object (right?).
This does not work for the pipeline. Is a pipeline a task?
Edit: the same works for the pipeline.
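A minimal sketch of the pattern discussed above, with hypothetical project/step names; it only illustrates calling Task.current_task() from both a decorated component and the pipeline body:

```python
from clearml import Task
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["value"])
def step_one():
    # Inside a running component, the step is backed by a regular ClearML Task,
    # so Task.current_task() returns that step's own task object.
    task = Task.current_task()
    task.get_logger().report_text(f"component task id: {task.id}")
    return 1

@PipelineDecorator.pipeline(name="demo-pipeline", project="demo-project", version="0.1")
def my_pipeline():
    # The pipeline controller is also backed by a Task, so the same call
    # works from the pipeline body (as the edit above notes).
    controller = Task.current_task()
    print(f"controller task id: {controller.id}")
    step_one()

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # run everything in the local process for a quick check
    my_pipeline()
```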

2 years ago
0 Hi. Help

sorry..

2 years ago
0 Hi. Help

I had several pipeline components getting it and uploading files to it concurrently.
Can Datasets handle that?
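For concreteness, a hypothetical sketch of the pattern being described here (whether ClearML handles this concurrently is exactly the question); the dataset id and folder names are placeholders:

```python
from clearml import Dataset
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component()
def add_chunk(dataset_id: str, local_folder: str):
    ds = Dataset.get(dataset_id=dataset_id)  # each component fetches the same dataset
    ds.add_files(path=local_folder)          # registers this component's files
    ds.upload()                              # uploads its own chunk
    return dataset_id
```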

2 years ago
0 Hi. I Have A Few Questions About The Snippet Attached

Perhaps anecdotal, but just calling random.seed() will set the seed using the system time for you:
https://docs.python.org/3/library/random.html#random.seed
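For reference, a tiny standalone illustration of that behaviour (nothing ClearML-specific):

```python
import random

random.seed()    # no argument: re-seed from system time / OS entropy
print(random.random())

random.seed(47)  # fixed seed: the draws below are reproducible
print(random.random())
```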

2 years ago
0 I Have 5 Unarchived Pipeline Runs That Were Defined With This Decorator:

Hi John, sort of. It seems that archiving pipelines does not also archive the tasks that they contain, so /projects/lavi-testing/.pipelines/fastai_image_classification_pipeline is a very long list...

2 years ago
0 Task Stuck At

there may have been some interaction between the training task and a preceding dataset creation task :shrug:

one year ago
0 Hi. I'M Encountering A Problem With

To be specific, there is "model name", which is not unique, and there is model-key, which is unique to the Task.

Not sure why the two fields don't simply match. I guess there may be situations where the file name (without the full path) may be used several times.

one year ago
0 Autoscaler Parallelization Issue: I Have An Aws Autoscaler Set Up With A Resource That Has A Max Of 3 Instances Assigned To The

Erm, this parallelization has led to the pipeline task issuing a bunch of:
model_path/run_2022_07_20T22_11_15.209_0.zip, err: [Errno 28] No space left on device
and quitting on me.
My train_image_classifier_component is programmed to save model files to a local path which is returned (and, thanks to clearml, the path's contents are zipped and uploaded to the files service).

I take it that these files are also brought onto the pipeline task's local disk?
Why is that? If that is indeed what...

2 years ago
0 Hi. I'D Like To Try The Gcp Autoscaler.

Hi TimelyPenguin76
Thanks for working on this. The clearml GCP autoscaler is a major feature for us to have. I can't really evaluate clearml without some means of instantiating multiple agents on GCP machines, and I'd really prefer not to have to set up a k8s cluster with agents and manage scaling it myself.

I tried the settings above with two resources, one for default queue and one for the services queue (making sure I use that image you suggested above for both).
The autoscaler started up...

2 years ago
0 Hi. I'D Like To Try The Gcp Autoscaler.

I'll try a more carefully checked run a bit later but I know it's getting a bit late in your time zone

2 years ago
0 Hi There. I'M Trying To Switch Pipeline Code From A Local Run Using

First, thanks for having these discussions. I appreciate this kind of support is an effort 🙏
Yes, I perfectly understand that once a pipeline job (or a task) is sent off in this manner, it executes separately (and, most likely, on a different machine) from the process that instantiated it.
I still feel strongly that such a command should not be thought of as a fire-and-exit operation. I can think of several scenarios where continued execution of the instantiating process is desired:
I ...

2 years ago
0 Hi. I'M Encountering A Problem With

I imagine that one workaround is to:
Disable automatic model uploads
Perform a manual model upload (with the correct name).
Can you point me to how to do these?
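A rough sketch of that workaround, under the assumption that framework auto-logging can be switched off per framework via auto_connect_frameworks and the model then registered by hand with an OutputModel (all names below are placeholders):

```python
from clearml import Task, OutputModel

task = Task.init(
    project_name="my-project",
    task_name="manual-model-upload",
    auto_connect_frameworks={"pytorch": False},  # step 1: disable automatic model uploads
)

# step 2: perform the model upload manually, with the name we actually want
model = OutputModel(task=task, name="my-model-name")
model.update_weights(weights_filename="model.pt")
```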

one year ago
0 Bug?

Hmm, this isn't supported though:
dataset_args = dataset.connect(dataset_args)

one year ago
0 Hi There. I'M Trying To Switch Pipeline Code From A Local Run Using

What I think would be preferable is that the pipeline be deployed and that the Python process that deployed it be allowed to continue on to whatever I had planned for it to do next (i.e. not exit).

2 years ago