ShallowGoldfish8
Moderator
8 Questions, 41 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0

Badges (1): 33 × Eureka!
0 Votes 2 Answers 983 Views
When trying to run the server from the Docker image (docker-compose -f /opt/clearml/docker-compose.yml up -d, as instructed in the docs), I am getting an error ...
one year ago
0 Votes 15 Answers 1K Views
2 years ago
0 Votes 16 Answers 965 Views
Hi guys, I am having some trouble running some training scripts with the agent functionality: https://stackoverflow.com/questions/73279794/catboostclearml-er...
2 years ago
0 Votes 14 Answers 962 Views
Is there any simple way to orchestrate a batch to train a model with different features (in order to do feature selection, for example) through a single .py ...
2 years ago
0 Votes 5 Answers 1K Views
Is there a way to upload an artifact I forgot to upload during the task duration to that task after it is already complete?
one year ago
0 Votes 5 Answers 1K Views
2 years ago
0 Votes 2 Answers 1K Views
Is there a way to load only selected files and selected columns from a dataset (saved as multiple .parquet files) without having to download all of it?
2 years ago
0 Votes 3 Answers 1K Views
2 years ago
0 Is There Any Simple Way To Orchestrate A Batch To Train A Model With Different Features (In Order To Do Feature Selection, For Example) Through A Single .Py File? I Saw The Following Example

I was checking here, and apparently if I use a parameter as suggested, together with a Task.init(task_name=f'{task_name_in_this_loop}') for each of the loops, it should work, right? That would create different tasks in the server.
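A minimal sketch of what that loop could look like (hedged: feature_sets, the project name, and the training step are made-up placeholders, and each Task.init needs the previous task closed first):

from clearml import Task

feature_sets = {'set_a': ['f1', 'f2'], 'set_b': ['f1', 'f3']}  # hypothetical

for name, features in feature_sets.items():
    task = Task.init(project_name='feature-selection',
                     task_name=f'train-{name}')
    task.connect({'features': features})  # log the subset as parameters
    # ... your training code for this feature subset goes here ...
    task.close()  # close before the next Task.init in the loop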

2 years ago
0 Is There Any Simple Way To Orchestrate A Batch To Train A Model With Different Features (In Order To Do Feature Selection, For Example) Through A Single .Py File? I Saw The Following Example

Looks quite good indeed! Thanks! Is the experiment template used in this example available in the repository? I am just not fully sure how the parameters are used/connected in it. Could I just build it and log these parameters using task.set_parameters() so that I can call task.get_parameters() later?
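For what it's worth, a hedged sketch of that set/get round trip (the task and parameter names are illustrative; parameters live under a section, 'General' by default):

from clearml import Task

# when building the template task
task = Task.init(project_name='feature-selection', task_name='template')
task.set_parameters({'General/features': 'f1,f2,f3'})

# later, e.g. inside a clone executed by an agent
params = task.get_parameters()
features = params.get('General/features', '').split(',')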

2 years ago
0 Hi Guys, I Am Having Some Trouble Running Some Training Scripts With The Agent Functionality:

from importlib.machinery import EXTENSION_SUFFIXES
import catboost
from clearml import Task, Logger, Dataset

import lightgbm as lgb
import numpy as np
import pandas as pd
import dask.dataframe as dd
import matplotlib.pyplot as plt

MODELS = {
    'catboost': {
        'model_class': catboost.CatBoostClassifier,
        'file_extension': 'cbm'
    },
    'lgbm': {
        'model_class': lgb.LGBMClassifier,
        'file_extension': 'txt'
    }
}

class ModelTrainer():
    def __init__(sel...

2 years ago
0 Is There Any Simple Way To Orchestrate A Batch To Train A Model With Different Features (In Order To Do Feature Selection, For Example) Through A Single .Py File? I Saw The Following Example

yes, but is there a way to generate multiple tasks like I mentioned, using task.init at different points of a .py and running each of them as a separate remote execution? Didn't you just say that once I trigger the task.execute_remotely it will ignore the task.init?
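A short sketch of why that happens (hedged; 'default' is an assumed queue name): with the default exit_process=True, execute_remotely() enqueues the task and terminates the local process, so any Task.init calls after it never run locally.

from clearml import Task

task = Task.init(project_name='feature-selection', task_name='train-set-a')
task.execute_remotely(queue_name='default')  # local process exits here
# everything below only runs inside the agent, as the remote execution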

2 years ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

Apparently the error comes when I try to access the pipeline component load_model from get_model_and_features. It works if load_model is set not as a pipeline component but only as a helper function, provided it is declared before the component that calls it (I had already understood and fixed that; it differs from the code I sent above).

2 years ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

Additionally, I have the following error now:
2022-08-10 19:53:25,366 - clearml.Task - INFO - Waiting to finish uploads
2022-08-10 19:53:36,726 - clearml.Task - INFO - Finished uploading
Traceback (most recent call last):
File "/home/zanini/repo/RecSys/src/dataset/backtest.py", line 186, in <module>
backtest = run_backtest(
File "/home/zanini/repo/RecSys/.venv/lib/python3.9/site-packages/clearml/automation/controller.py", line 3329, in internal_decorator
a_pipeline.stop()
File...

2 years ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

I noticed that when a pipeline step returns an instance of a class, ClearML tries to pickle it. I am currently facing the issue of it not being able to pickle the output of the load_baseline_model function:
Traceback (most recent call last):
File "/tmp/tmpqr2zwiom.py", line 37, in <module>
task.upload_artifact(name=name, artifact_object=artifact)
File "/home/zanini/repo/RecSys/.venv/lib/python3.9/site-packages/clearml/task.py", line 1877, in upload_artifact
return self._artifacts_man...
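One possible workaround, sketched under the assumption the model has its own serializer (all names here are illustrative, not the thread's actual code): have the step return a trivially picklable value, a file path, and rebuild the model in the next step.

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=['model_path'])
def load_baseline_model(checkpoint='baseline.cbm'):
    import catboost
    model = catboost.CatBoostClassifier()
    model.load_model(checkpoint)           # load the real model
    model.save_model('/tmp/baseline.cbm')  # re-save to a known location
    return '/tmp/baseline.cbm'             # plain strings always pickle

@PipelineDecorator.component(return_values=['preds'])
def predict(model_path, data_path):
    import catboost
    import pandas as pd
    model = catboost.CatBoostClassifier()
    model.load_model(model_path)           # rebuild the model inside the step
    return model.predict(pd.read_parquet(data_path))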

2 years ago
0 Hi Guys, I Am Having Some Trouble Running Some Training Scripts With The Agent Functionality:

oooohhh.. you mean the key of the nested dict, that would make a lot of sense

2 years ago
0 Hi, Quickhelp With Pipelines: I Am Loading A Model During A State Of It And Them Passing This Model (Torch.Nn.Module Object) As Input Argument To A Pipeline Component. I Noticed The Model Inside The Pipeline Component Is An Object Of Class 'Pathlib2.Posix

Steps (pipeline components):
1. Load the model
2. Inference with the model

It is equivalent to:
model = Step1(*args)
preds = Step2(model, *args)

After step 1, I have the model loaded as a torch object, as expected. When this object is passed to step 2, inside of step 2 it is read as an object of class 'pathlib2.PosixPath'.

I assume that is because there is some kind of problem in the pickling/loading/dumping of the inputs from a step to another in the pipeline. Is it some kind of known issue or ...
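A hedged sketch of the same path-passing workaround for the torch case (MyModel and the checkpoint path are made up; the point is that only a string crosses the step boundary):

from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=['ckpt_path'])
def step1():
    import torch
    model = MyModel()  # hypothetical nn.Module subclass
    torch.save(model.state_dict(), '/tmp/model.pt')
    return '/tmp/model.pt'

@PipelineDecorator.component(return_values=['preds'])
def step2(ckpt_path, batch):
    import torch
    model = MyModel()  # rebuild, then load the weights
    model.load_state_dict(torch.load(ckpt_path))
    model.eval()
    with torch.no_grad():
        return model(batch)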

2 years ago
0 Hi There, I Am Intending To Work More Often With The Datasets, But Not Sure If There Is A Way To Retrieve Specific Files From A Uploaded Dataset. I Saw I Can Retrieve Chunks Of Data, But Not Sure How That Would Work With A Dataset Of Parquet Files. If I H

Could you point me to a reference for this "dataset containing other datasets" functionality? I might have skipped it when reading the documentation, but I do not recall seeing it.

2 years ago
0 Hi There, I Am Intending To Work More Often With The Datasets, But Not Sure If There Is A Way To Retrieve Specific Files From A Uploaded Dataset. I Saw I Can Retrieve Chunks Of Data, But Not Sure How That Would Work With A Dataset Of Parquet Files. If I H

Apparently I found a solution:

dataset_zip = dataset._task.artifacts['data'].get()

will return the path to the zip file containing all the files (it is downloaded to the local machine). After that, retrieve the names of the files:

import zipfile
zip_file = zipfile.ZipFile(dataset_zip, 'r')
files = zip_file.namelist()

Unzip using:

import os
os.system(f'unzip {dataset_zip}')  # in this case, to your script directory

and with the files list one can then open them selectively.
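A slightly tidier variant of the same idea, using zipfile directly so only the wanted members are extracted (no shelling out to unzip; the .parquet filter and data/ path are illustrative):

import zipfile

with zipfile.ZipFile(dataset_zip, 'r') as zf:
    wanted = [n for n in zf.namelist() if n.endswith('.parquet')]
    for name in wanted:
        zf.extract(name, path='data/')  # extract selectively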

2 years ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

That's the script that produces the error. You can also observe my struggle with importing the load_model function. (Any tips on best practices for structuring the pipeline are also gladly accepted.)

2 years ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

Should work as long as they are in the same file, you can however launch and wait any Task (see pipelines from tasks)

Do I call it as a function, normally, as in the other one, or do I need to import it? (My initial problem was actually that it was not finding the other function as a pipeline component, so I thought it was not able to import it as a secondary sub-component.)
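For reference, a sketch of the same-file layout being discussed (all names are illustrative). PipelineDecorator.component also accepts a helper_functions argument that packages plain helpers with the step, since each component runs standalone:

from clearml.automation.controller import PipelineDecorator

def load_model(path):
    # plain helper, declared before the component that uses it
    return f'loaded:{path}'

@PipelineDecorator.component(return_values=['model'],
                             helper_functions=[load_model])
def get_model(path):
    # called as a normal function; no import needed in the same file
    return load_model(path)

@PipelineDecorator.pipeline(name='backtest', project='RecSys', version='0.1')
def controller(path='model.pkl'):
    model = get_model(path)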

2 years ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

import importlib
import argparse

from datetime import datetime
import pandas as pd

from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes, Task

@PipelineDecorator.component(
    return_values=['model', 'features_to_build']
)
def get_model_and_features(task_id, model_type):
    from clearml import Task
    import sys
    sys.path.insert(0, '/home/zanini/repo/RecSys')
    from src.dataset.backtest import load_model

    task = Task.get_task(task_id=task_i...
2 years ago
0 Task Struck At

My code pretty much creates a dataset, uploads it, trains a model (that's where the current task starts), evaluates it and uploads all the artifacts and metrics. The artifacts and configurations upload fine, but the metrics and plots do not. As with Lavi, my code hangs on task.close(), where it seems to be waiting for the metrics etc. but never finishes. No retry message is shown either.
After a print I added for debugging right before task.close(), the only message I get in the consol...
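One thing worth trying when debugging a hang like this (hedged, not a confirmed fix from the thread): flush reports explicitly before closing, so the wait happens at a known point.

from clearml import Task

task = Task.current_task()
task.flush(wait_for_uploads=True)  # push pending metrics/artifacts now
task.close()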

one year ago
0 Task Struck At

Also, I was using tensorboard

one year ago
0 Task Struck At

After commenting out all the metric/plot reporting, we noticed the model was not uploading the artifacts to S3. A solution was to make task.upload_artifact() wait for the upload to complete.
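A sketch of that workaround; in the clearml SDK the keyword is wait_on_upload (True blocks until the artifact reaches the storage target), and the artifact name/path here are illustrative:

from clearml import Task

task = Task.current_task()
task.upload_artifact(name='model', artifact_object='model.cbm',
                     wait_on_upload=True)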

one year ago
0 Task Struck At

Hi Martin, I updated clearml but the problem persists

one year ago