ShallowGoldfish8
Moderator
8 Questions, 41 Answers
  Active since 10 January 2023
  Last activity 7 months ago

Reputation: 0
Badges (1): 33 × Eureka!
0 Votes 5 Answers 584 Views
Is there a way to upload an artifact I forgot to upload during the task duration to that task after it is already complete?
one year ago
0 Votes 2 Answers 586 Views
Is there a way to load only selected files and selected columns from a dataset (saved as multiple .parquet files) without having to download all of it?
one year ago
0 Votes 14 Answers 555 Views
Is there any simple way to orchestrate a batch to train a model with different features (in order to do feature selection, for example) through a single .py ...
one year ago
0 Votes 15 Answers 685 Views
one year ago
0 Votes 2 Answers 518 Views
When trying to run the server from the docker image ( docker-compose -f /opt/clearml/docker-compose.yml up -d as instructed in None ), I am getting an error ...
7 months ago
0 Votes 5 Answers 662 Views
one year ago
0 Votes 3 Answers 639 Views
one year ago
0 Votes 16 Answers 567 Views
Hi guys, I am having some trouble running some training scripts with the agent functionality: https://stackoverflow.com/questions/73279794/catboostclearml-er...
one year ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

Additionally, I have the following error now:
` 2022-08-10 19:53:25,366 - clearml.Task - INFO - Waiting to finish uploads
2022-08-10 19:53:36,726 - clearml.Task - INFO - Finished uploading
Traceback (most recent call last):
  File "/home/zanini/repo/RecSys/src/dataset/backtest.py", line 186, in <module>
    backtest = run_backtest(
  File "/home/zanini/repo/RecSys/.venv/lib/python3.9/site-packages/clearml/automation/controller.py", line 3329, in internal_decorator
    a_pipeline.stop()
  File...

one year ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

` import importlib
import argparse

from datetime import datetime
import pandas as pd

from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes, Task


@PipelineDecorator.component(
    return_values=['model', 'features_to_build']
)
def get_model_and_features(task_id, model_type):
    from clearml import Task
    import sys
    sys.path.insert(0, '/home/zanini/repo/RecSys')
    from src.dataset.backtest import load_model

    task = Task.get_task(task_id=task_i...
one year ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

Should work as long as they are in the same file; you can, however, launch and wait for any Task (see pipelines from tasks)

Do I call it as a function normally, as in the other case, or do I need to import it? (My initial problem was actually that it was not finding the other function as a pipeline component, so I thought it was not able to import it as a secondary sub-component.)
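For reference, a minimal sketch of the "launch and wait any Task" route mentioned in the quoted answer; the queue name and template task ID are hypothetical, and this assumes the generic Task.clone / Task.enqueue / wait_for_status flow rather than anything specific to this thread:

` from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=['child_id'])
def launch_child(template_task_id):
    # imports inside the component so it stays self-contained when run remotely
    from clearml import Task

    # clone an existing task to use as a template, enqueue it, and wait for it
    child = Task.clone(source_task=template_task_id)
    Task.enqueue(child, queue_name='default')  # hypothetical queue name
    child.wait_for_status()                    # blocks until the cloned task finishes
    return child.id
`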

one year ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

Apparently the error comes when I try to access the pipeline component load_model from get_model_and_features. If it is set not as a pipeline component but only as a helper function, it works, provided it is declared before the component that calls it (I already understood that and fixed it, differently from the code I sent above).
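A minimal sketch of the arrangement described above, with hypothetical names: load_model kept as a plain helper declared before the component that uses it, rather than being decorated as a component itself:

` from clearml.automation.controller import PipelineDecorator

def load_model(path):
    # plain helper function, declared before the component, not a pipeline component
    import pickle
    with open(path, 'rb') as f:
        return pickle.load(f)

@PipelineDecorator.component(return_values=['model'])
def get_model_and_features(model_path):
    # call the helper directly instead of another decorated component
    model = load_model(model_path)
    return model
`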

one year ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

That's the script that produces the error. You can also see the struggle with importing the load_model function. (Any tips on best practices for structuring the pipeline are also gladly accepted.)

one year ago
0 1St: Is It Possible To Make A Pipeline Component Call Another Pipeline Component (As A Substep)? Or Only The Controller Can Do It? 2Nd: I Am Trying To Call A Function Defined In The Same Script, But Unable To Import It. I Passing The Repo Parameter To The

I noticed that when a pipeline step returns an instance of a class, it tries to pickle it. I am currently facing the issue of it not being able to pickle the output of the "load_baseline_model" function:
` Traceback (most recent call last):
  File "/tmp/tmpqr2zwiom.py", line 37, in <module>
    task.upload_artifact(name=name, artifact_object=artifact)
  File "/home/zanini/repo/RecSys/.venv/lib/python3.9/site-packages/clearml/task.py", line 1877, in upload_artifact
    return self._artifacts_man...

one year ago
0 Hi There, I Am Intending To Work More Often With The Datasets, But Not Sure If There Is A Way To Retrieve Specific Files From A Uploaded Dataset. I Saw I Can Retrieve Chunks Of Data, But Not Sure How That Would Work With A Dataset Of Parquet Files. If I H

Apparently found a solution:
dataset_zip = dataset._task.artifacts['data'].get() will return the path to the zip file containing all the files (it will be downloaded to the local machine).
After that:
` import zipfile
zip_file = zipfile.ZipFile(dataset_zip, 'r')
files = zip_file.namelist()
`
retrieves the names of the files. Unzip using
` import os
os.system(f'unzip {dataset_zip}')  # in this case to your script directory
`
and using the files list one can then open them selectively.
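Putting those pieces together, a minimal end-to-end sketch; the project and dataset names are hypothetical, and it relies on the same internal dataset._task.artifacts['data'] access used above:

` import zipfile
from clearml import Dataset

dataset = Dataset.get(dataset_project='RecSys', dataset_name='parquet_files')  # hypothetical names
dataset_zip = dataset._task.artifacts['data'].get()  # local path to the downloaded zip, as above

with zipfile.ZipFile(dataset_zip, 'r') as zf:
    files = zf.namelist()                                  # list files without extracting everything
    wanted = [f for f in files if f.endswith('.parquet')]  # pick only the files you need
    zf.extractall(members=wanted)                          # extract only the selected files
`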

one year ago
0 Hi There, I Am Intending To Work More Often With The Datasets, But Not Sure If There Is A Way To Retrieve Specific Files From A Uploaded Dataset. I Saw I Can Retrieve Chunks Of Data, But Not Sure How That Would Work With A Dataset Of Parquet Files. If I H

Could you supply a reference for this "dataset containing other datasets"? I might have skipped it when reading the documentation, but I do not recall seeing this functionality.

one year ago
0 Hi Guys, I Am Having Some Trouble Running Some Training Scripts With The Agent Functionality:

` from importlib.machinery import EXTENSION_SUFFIXES
import catboost
from clearml import Task, Logger, Dataset

import lightgbm as lgb
import numpy as np
import pandas as pd
import dask.dataframe as dd
import matplotlib.pyplot as plt

MODELS = {
    'catboost': {
        'model_class': catboost.CatBoostClassifier,
        'file_extension': 'cbm'
    },
    'lgbm': {
        'model_class': lgb.LGBMClassifier,
        'file_extension': 'txt'
    }
}

class ModelTrainer():
    def __init__(sel...

one year ago
0 Hi Guys, I Am Having Some Trouble Running Some Training Scripts With The Agent Functionality:

Simplified a little bit and with private parameters removed, but that's pretty much the code. We did not try with toy examples, since that was already done with the example pipelines when we implemented it, and the model training itself is already quite basic there (only a few hyperparameters set).

one year ago
0 Hi Guys, I Am Having Some Trouble Running Some Training Scripts With The Agent Functionality:

That would make sense, although clearml, at least in the UI, shows the deeper level of the nested dict as an int, as one would expect.

one year ago
0 Hi Guys, I Am Having Some Trouble Running Some Training Scripts With The Agent Functionality:

oooohhh.. you mean the key of the nested dict, that would make a lot of sense
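A minimal sketch of the cast being discussed, with hypothetical project and parameter names; whether the keys actually come back as strings depends on how the agent re-creates the connected dict, so treat this as an illustration rather than a guarantee:

` from clearml import Task

task = Task.init(project_name='demo', task_name='nested-dict-keys')  # hypothetical names
config = {'class_weights': {0: 1.0, 1: 5.0}}  # integer keys in the nested dict
config = task.connect(config)
# when an agent rebuilds this dict from the stored hyperparameters, the nested
# keys 0 and 1 may come back as the strings '0' and '1', breaking int lookups
`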

one year ago
0 Hi Guys, I Am Having Some Trouble Running Some Training Scripts With The Agent Functionality:

Martin, if you want, feel free to add your answer on the Stack Overflow question so that I can mark it as the solution.

one year ago
0 Is There Any Simple Way To Orchestrate A Batch To Train A Model With Different Features (In Order To Do Feature Selection, For Example) Through A Single .Py File? I Saw The Following Example

I was checking here, and apparently if I use a parameter as suggested, together with a Task.init(task_name=f'{task name in this loop}') for each of the loops, it should work, right? Creating different tasks on the server.
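A minimal sketch of that loop, with hypothetical project and feature names; it assumes each iteration closes its task before the next Task.init so that a fresh task is created on the server:

` from clearml import Task

feature_sets = [['age', 'income'], ['age', 'clicks'], ['income', 'clicks']]  # hypothetical feature subsets

for i, features in enumerate(feature_sets):
    task = Task.init(project_name='feature-selection', task_name=f'train-subset-{i}')
    task.connect({'features': ','.join(features)})
    # ... train and evaluate the model on this feature subset ...
    task.close()  # close so the next Task.init starts a new task
`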

one year ago
0 Is There Any Simple Way To Orchestrate A Batch To Train A Model With Different Features (In Order To Do Feature Selection, For Example) Through A Single .Py File? I Saw The Following Example

Looks quite good indeed, thanks! Is the experiment template used in this example available in the repository? I'm just not fully sure how the parameters are used/connected in it. Could I just build it and log these parameters using task.set_parameters() so that I can call task.get_parameters() later?
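A minimal sketch of that set/get round trip, with hypothetical parameter names; it assumes the template is an ordinary task whose parameters are later read back in the executed copy:

` from clearml import Task

# when building the template experiment
task = Task.init(project_name='feature-selection', task_name='template')
task.set_parameters({'General/features': 'age,income', 'General/model': 'lgbm'})

# later, e.g. inside a clone of the template executed by the agent
params = Task.current_task().get_parameters()
features = params.get('General/features', '').split(',')
`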

one year ago
0 Is There Any Simple Way To Orchestrate A Batch To Train A Model With Different Features (In Order To Do Feature Selection, For Example) Through A Single .Py File? I Saw The Following Example

Yes, but is there a way to generate multiple tasks like I mentioned, using task.init at different points of a .py, and run each of them as a separate remote execution? Didn't you just say that once I trigger task.execute_remotely it will ignore the task.init?

one year ago