
I noticed that when a pipeline step returns an instance of a class, it tries to pickle it. I am currently facing an issue where it is not able to pickle the output of the load_baseline_model function:
` Traceback (most recent call last):
File "/tmp/tmpqr2zwiom.py", line 37, in <module>
task.upload_artifact(name=name, artifact_object=artifact)
File "/home/zanini/repo/RecSys/.venv/lib/python3.9/site-packages/clearml/task.py", line 1877, in upload_artifact
return self._artifacts_man...
` import importlib
import argparse
from datetime import datetime
import pandas as pd
from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes, Task
@PipelineDecorator.component(
    return_values=['model', 'features_to_build']
)
def get_model_and_features(task_id, model_type):
    from clearml import Task
    import sys
    sys.path.insert(0, '/home/zanini/repo/RecSys')
    from src.dataset.backtest import load_model
    task = Task.get_task(task_id=task_i...
I did manage to get it working, but only by hardcoding the path of the repository using sys.path.append() with the absolute repository path on my machine.
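For a more portable version of that workaround, I'm thinking of resolving the repository root relative to the script instead of hardcoding it (a sketch only; the parents[...] depth depends on where the file actually sits in the repo):
`
import sys
from pathlib import Path

# Resolve the repo root relative to this file instead of hardcoding
# /home/zanini/repo/RecSys; adjust parents[...] to the real depth.
REPO_ROOT = Path(__file__).resolve().parents[2]
sys.path.insert(0, str(REPO_ROOT))

from src.dataset.backtest import load_model  # now importable on any machine
`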
The error comes up after the execution of the backtest_prod component.
It is an instance of a custom class.
That's the script that produces the error. You can also see the struggle with importing the load_model function. (Any tips on best practices for structuring the pipeline are also gladly accepted.)
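One workaround I'm considering (a sketch only; the save call is a hypothetical serialization method on the custom class, not the real API): have the component return a file path instead of the class instance, so the pipeline only has to pickle a string.
`
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=['model_path'])
def load_baseline_model_to_file(task_id):
    # Persist the model inside the component and return the path; the next
    # step re-loads it, so the custom class is never pickled by the pipeline.
    import sys
    sys.path.insert(0, '/home/zanini/repo/RecSys')
    from src.dataset.backtest import load_model

    model = load_model(task_id)             # hypothetical call, returns the custom class instance
    model_path = '/tmp/baseline_model.bin'
    model.save(model_path)                  # hypothetical: whatever serialization the class supports
    return model_path
`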
I saw the part about chunks, but it is not clear how one can retrieve the dataset file by file.
Apparently found a solution:
dataset_zip = dataset._task.artifacts['data'].get()
will return the path to the zip file containing all the files (it will be downloaded to the local machine).
After that:
import zipfile
zip_file = zipfile.ZipFile(dataset_zip, 'r')
files = zip_file.namelist()
retrieves the names of the files.
Unzip using:
import os
os.system(f'unzip {dataset_zip}')  # in this case to your script directory
and using the files list one can then open them selectively.
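Putting it together, a minimal sketch of the same idea using zipfile.extractall instead of shelling out to unzip (the dataset project/name and the .parquet filter are placeholders):
`
import zipfile
from pathlib import Path
from clearml import Dataset

dataset = Dataset.get(dataset_project='RecSys', dataset_name='my_dataset')  # placeholder names
dataset_zip = dataset._task.artifacts['data'].get()  # path to the downloaded zip, as above

with zipfile.ZipFile(dataset_zip, 'r') as zf:
    files = zf.namelist()                                   # list files without extracting
    wanted = [f for f in files if f.endswith('.parquet')]   # pick only the files you need
    zf.extractall(path=Path('.'), members=wanted)           # extract just that subset
`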
Sorted it by using the command below before the docker-compose call:
export DOCKER_DEFAULT_PLATFORM=linux/amd64
But how do I link it to the specific task so that it is listed as an artifact?
yes, variations of the data, using only a subset of the features
It worked!
That would make sense, although ClearML, at least in the UI, shows the deeper level of the nested dict as an int, as one would expect.
Additionally, I have the following error now:
` 2022-08-10 19:53:25,366 - clearml.Task - INFO - Waiting to finish uploads
2022-08-10 19:53:36,726 - clearml.Task - INFO - Finished uploading
Traceback (most recent call last):
File "/home/zanini/repo/RecSys/src/dataset/backtest.py", line 186, in <module>
backtest = run_backtest(
File "/home/zanini/repo/RecSys/.venv/lib/python3.9/site-packages/clearml/automation/controller.py", line 3329, in internal_decorator
a_pipeline.stop()
File...
Yes, but is there a way to generate multiple tasks like I mentioned, using task.init at different points of a .py, and run each of them as a separate remote execution? Didn't you just say that once I trigger task.execute_remotely it will ignore the task.init?
Could you supply any reference for this "dataset containing other datasets"? I might have skipped it when reading the documentation, but I do not recall seeing this functionality.
It should work as long as they are in the same file; you can, however, launch and wait on any Task (see pipelines from tasks).
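Roughly something like this for launching and waiting on a Task (project/task/queue names and the parameter key are placeholders):
`
from clearml import Task

# Clone a template task, override a parameter, enqueue it and wait for it.
template = Task.get_task(project_name='RecSys', task_name='train_template')  # placeholder names
cloned = Task.clone(source_task=template, name='train_variant_a')
cloned.set_parameters({'General/feature_subset': 'group_a'})  # placeholder parameter

Task.enqueue(cloned, queue_name='default')  # send it to an agent queue
cloned.wait_for_status()                    # block until it completes
print(cloned.get_status())
`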
Do I call it as a function normally, as in the other one, or do I need to import it? (My initial problem was actually that it was not finding the other function as a pipeline component, so I thought it was not able to import it as a secondary sub-component.)
` from importlib.machinery import EXTENSION_SUFFIXES
import catboost
from clearml import Task, Logger, Dataset
import lightgbm as lgb
import numpy as np
import pandas as pd
import dask.dataframe as dd
import matplotlib.pyplot as plt

MODELS = {
    'catboost': {
        'model_class': catboost.CatBoostClassifier,
        'file_extension': 'cbm'
    },
    'lgbm': {
        'model_class': lgb.LGBMClassifier,
        'file_extension': 'txt'
    }
}


class ModelTrainer():
    def __init__(sel...
Regarding (2), if I use run_remote, does it also ignore the init?
Considering something along the lines of
https://github.com/allegroai/clearml/blob/master/examples/advanced/execute_remotely_example.py
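For reference, the pattern in that example is roughly this (project/task/queue names and the train() call are placeholders):
`
from clearml import Task

task = Task.init(project_name='RecSys', task_name='remote_experiment')   # placeholder names
params = task.connect({'learning_rate': 0.1, 'epochs': 10})              # placeholder params

# Everything above runs locally; execute_remotely() stops the local run and
# re-launches the same script on an agent listening to the given queue.
task.execute_remotely(queue_name='default', exit_process=True)

# From here on, the code only runs on the agent.
train(params)  # hypothetical training function
`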
oooohhh.. you mean the key of the nested dict, that would make a lot of sense
Martin, if you want, feel free to add your answer in the stackoverflow so that I can mark it as a solution.
Apparently the error comes when I try to access the pipeline component load_model from within get_model_and_features. It seems fine if load_model is not set as a pipeline component but only as a helper function (provided it is declared before the component that calls it; I already understood that and fixed it, unlike in the code I sent above).
I will try the suggested edit here
Looks quite good indeed! Thanks! Is the experiment template used in this example available in the repository? I'm just not fully sure how the parameters are used/connected in it. Could I just build it and log these parameters using task.set_parameters() so that I can call task.get_parameters() later?
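Something along these lines is what I have in mind (project/task names and parameter keys are placeholders):
`
from clearml import Task

# When building the experiment template: log the parameters on the task.
task = Task.init(project_name='RecSys', task_name='experiment_template')  # placeholder names
task.set_parameters({'General/model_type': 'catboost', 'General/n_estimators': 500})

# Later (e.g. in a cloned copy of the template): read them back.
params = task.get_parameters()               # flat dict of 'Section/name' -> string values
model_type = params.get('General/model_type', 'catboost')
print(model_type)
`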
Steps (pipeline components):
Load the model
Inference with the model
It's equivalent to:
model = Step1(*args)
preds = Step2(model, *args)
After step 1, I have the model loaded as a torch object, as expected. When this object is passed to step 2, inside of step 2, it is read as an object of class 'pathlib2.PosixPath'.
I assume that is because there is some kind of problem in the pickling/loading/dumping of the inputs passed from one step to another in the pipeline. Is it some kind of known issue or ...
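For context, the shape of the pipeline is roughly this (a stripped-down sketch, not the real code; paths and the inference call are placeholders):
`
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=['model'])
def step1_load_model(model_path):
    import torch
    # Returns a torch object; the pipeline pickles it to pass it to step 2.
    return torch.load(model_path)

@PipelineDecorator.component(return_values=['preds'])
def step2_inference(model, data_path):
    import torch
    import pandas as pd
    data = pd.read_csv(data_path)
    # Inside this step `model` should be the torch object from step 1,
    # but in our run it arrives as a pathlib2.PosixPath instead.
    preds = model(torch.tensor(data.values).float())  # placeholder inference call
    return preds.detach().numpy()

@PipelineDecorator.pipeline(name='backtest', project='RecSys', version='0.0.1')
def run_pipeline(model_path, data_path):
    model = step1_load_model(model_path)
    preds = step2_inference(model, data_path)
    return preds

if __name__ == '__main__':
    PipelineDecorator.run_locally()                   # run steps in the local process for debugging
    run_pipeline('/tmp/model.pt', '/tmp/data.csv')    # placeholder paths
`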
Simplified a little bit and removed private parameters, but that's pretty much the code. We did not try with toy examples, since that was already done with the example pipelines when we implemented this, and the model training itself is already quite simple/basic there (only a few hyperparameters set).