HelpfulHare30
Moderator
10 Questions, 33 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0
Badges: 1 (31 × Eureka!)
0 Votes 6 Answers 950 Views
Is there some support of multi-machine training on ClearML level?
3 years ago
0 Votes 5 Answers 977 Views
Hi clearml. I'm trying to look at Datasets functionality (with the help of https://github.com/allegroai/clearml/blob/master/docs/datasets.md and https://clea...
3 years ago
0 Votes 5 Answers 935 Views
Task.init() takes by default output_uri from clearml.conf configuration file (S3 bucket in my case). But underlined task created with Dataset.create() ignore...
3 years ago
0 Votes 10 Answers 986 Views
Hi. I have a task executed on clearml-agent, configured with agent.package_manager.force_repo_requirements_txt = true. But requirements.txt is not taken int...
3 years ago
0 Votes 2 Answers 921 Views
Hi all. Is there possibility to download all the artifacts from some task?
3 years ago
0 Votes 9 Answers 865 Views
One more strange behaviour of clearml-agent. Two tasks with the same code (except usage of task.execute_remotely()) generate different structure of output ar...
3 years ago
0 Votes 8 Answers 904 Views
Hi all. If I understand right, Dataset and any other task is Aborted after some time of inactivity. Can I configure it on some level (ideally on task level)?...
3 years ago
0 Votes 6 Answers 1K Views
Hi here. Is it possible to run clearml-agent with a pre-ready pip virtual environment and avoid package installation from requirements.txt/INSTALLED PACKAGES?
2 years ago
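On the question above: clearml-agent exposes environment variables to skip virtual environment creation and reuse an existing interpreter. A hedged sketch (the variable name is from the clearml-agent documentation, but verify it and its expected value against your agent version; the paths are placeholders):

```shell
# Point the agent at an existing venv's python binary and skip
# pip installation of requirements.txt / INSTALLED PACKAGES.
export CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=/path/to/venv/bin/python
clearml-agent daemon --queue default
```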
0 Votes 6 Answers 987 Views
Different question. How can I pass the PYTHONPATH env variable to a task run by an agent (so Python can find classes inside my subdirectories)?
2 years ago
0 Votes 3 Answers 1K Views
Hi all. Is it possible to configure agent NOT to install requirements.txt/INSTALLED PACKAGES and to use some pre-ready environment (for example, virtualenv c...
3 years ago
0 Different Question. How Can I Pass Pythonpath Env Variable To A Task, Run By Agent (So Python Can Find Classes Inside M Subdirectories)?

AgitatedDove14, you are right. It was an invalid working directory. All works. Thank you
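For the PYTHONPATH question in this thread, one framework-agnostic workaround is to extend sys.path at the top of the task script itself, so the same imports resolve locally and under a clearml-agent run without relying on the environment. A sketch (the helper and the subdirectory names are hypothetical):

```python
import os
import sys


def add_to_path(repo_root, *subdirs):
    """Prepend repo subdirectories to sys.path so imports resolve
    the same way locally and when the script runs on an agent.
    Returns the list of paths that were actually added."""
    added = []
    for sub in subdirs:
        path = os.path.join(repo_root, sub)
        if path not in sys.path:
            sys.path.insert(0, path)
            added.append(path)
    return added
```

Called at the top of the entry script, e.g. `add_to_path(os.path.dirname(os.path.abspath(__file__)), "lib", "utils")`, before importing from those subdirectories.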

2 years ago
0 Hi Clearml. I'M Trying To Look At Datasets Functionality (With The Help Of

TimelyPenguin76, thank you for the explanation. 1) Great. 2) As you can see from my screenshot, the Data Processing task is created but I don't see the Datasets tab as shown in https://clear.ml/blog/construction-feat-tf2-object-detection-api/ 3) I see. So I need to specify it with every CLI command/SDK method call

3 years ago
0 Hi All. If I Understand Right, Dataset And Any Other Task Is Aborted After Some Time Of Inactivity. Can I Configure It On Some Level (Ideally On Task Level)? I Have A Pipeline That Is Expected To Run Several Days.

Do I understand right that I can avoid task (including dataset) termination if I update it somehow once in a while (say, by sending a log line)?

3 years ago
0 Is There Some Support Of Multi-Machine Training On Clearml Level?

SuccessfulKoala55 To be more specific, I mean situations where training is long and its parts can be parallelized in some way, like in Spark or Dask. I suspect that such functionality is framework-specific, and it's hard to believe it is a focus for ClearML, which is more or less framework-agnostic. On the other hand, ClearML has many integrations with concrete frameworks. So I'd like to understand whether there is any kind of support at the general ClearML level or as a part of integrations with fra...

3 years ago
0 Task.Init() Takes By Default Output_Uri From Clearml.Conf Configuration File (S3 Bucket In My Case). But Underlined Task Created With Dataset.Create() Ignores It And Uploads Files By Default To

Hi SuccessfulKoala55, here is the code snippet:
task = Task.init(project_name=PROJECT_NAME, task_name=section)
task.connect(params)
print('params', params)

dataset = Dataset.create(dataset_name=params['dataset'], dataset_project=PROJECT_NAME)
dataset_local_dir = dataset.get_local_copy()

dataset._task.output_uri = task.output_uri

KeywordProcessor(params['es_host'], params['es_port'], True, DOCS_ROOT)

dataset.add_files(DOCS_ROOT, wildcard='*.csv')
dataset.upload()

I add several files to a da...
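Rather than setting the private `dataset._task.output_uri` as in the snippet above, `Dataset.upload()` accepts an `output_url` argument in the public ClearML SDK (a later reply in this profile mentions the same idea; verify the parameter against your clearml version). A minimal sketch of a wrapper that keeps the call site explicit:

```python
def upload_dataset(dataset, output_url):
    """Upload a dataset's files to an explicit destination via the
    public API, instead of poking the private dataset._task attribute.
    `dataset` is a clearml Dataset (or anything exposing .upload)."""
    dataset.upload(output_url=output_url)
```

For example, `upload_dataset(dataset, "s3://my-bucket/datasets")`, where the bucket URL is a placeholder.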

3 years ago
0 One More Strange Behaviour Of Clearml-Agent. Two Tasks With The Same Code (Except Usage Of Task.Execute_Remotely()) Generate Different Structure Of Output Artifacts. The Code Is A Regular Training With Keras That Includes Lines:

TimelyPenguin76 , sorry I didn't see this comment. No. I mean that when I run task locally (from PyCharm and without task.execute_remotely()), model is uploaded and registered. But when I do the same with task.execute_remotely() and it runs on agent model cannot be found in the task after this. I speak about the same script I sent in the second thread

3 years ago
0 Hi. I Have A Task Executed On Clearml-Agent, Configured With

TimelyPenguin76 , any news regarding this?

3 years ago
0 Hi. I Have A Task Executed On Clearml-Agent, Configured With

TimelyPenguin76 , thank you. Trying...

3 years ago
0 Hi

Just started with it but I'd like to chat

3 years ago
0 Hi All. If I Understand Right, Dataset And Any Other Task Is Aborted After Some Time Of Inactivity. Can I Configure It On Some Level (Ideally On Task Level)? I Have A Pipeline That Is Expected To Run Several Days.

SuccessfulKoala55 , I have the following structure now (maybe it's not best practice and you can suggest a better one). There is a sequence of tasks, that are run manually or from pipeline. Every task at the end updates some dataset. The dataset should be closed only after all the sequence is finished (and some task in the sequence can take more than two days). The issue I want to avoid is aborting of the dataset task that these regular tasks update.

3 years ago
0 Is There Any Way To Stop All Clearml Agent Workers On A Machine Or Stop Workers From The Clearml Ui?

VexedCat68 I would try to find the process on the machine with something like 'ps aux | grep clearml' and kill it

2 years ago
0 Is There Some Support Of Multi-Machine Training On Clearml Level?

Hi AgitatedDove14. Thank you. Yes, pipelines and running clearml-agent on an environment with some parallelization framework are both options. I'll look in this direction

3 years ago
0 Hi Clearml. I'M Trying To Look At Datasets Functionality (With The Help Of

I didn't try yet but thought about dataset.upload(output_url=)

3 years ago
0 Hi. I Have A Task Executed On Clearml-Agent, Configured With

TimelyPenguin76 , thank you for willing to help. Here is a small project attached. load_mnist.py generates a dataset, model_train.py is the script in question (it uses the dataset generated by load_mnist.py)

3 years ago
0 Hi. I Have A Task Executed On Clearml-Agent, Configured With

Current configuration (clearml_agent v0.17.2, location: /home/olga/clearml.conf):

api.version = 1.5
api.verify_certificate = true
api.default_version = 1.5
api.http.max_req_size = 15728640
api.http.retries.total = 240
api.http.retries.connect = 240
api.http.retries.read = 240
api.http.retries.redirect = 240
api.http.retries.status = 240
api.http.retries.backoff_factor = 1.0
api.http.retries.backoff_max = 120.0
api.http.wait_on_maintenance_forever = true
api.http.pool_...

3 years ago