Hi @<1610083503607648256:profile|DiminutiveToad80> ! You need to somehow serialize the object. Note that we try several serialization methods and default to pickle if none of them work; if pickle also fails, the artifact can't be uploaded by default. There is a way around it, though: serialize the object yourself. The recommended way to do this is the `serialization_function` argument of `upload_artifact`. You could try something like `dill`, which can serialize more object types than pickle.
Note that you then need to pass a matching `deserialization_function` when pulling the artifact.
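As a sketch of that approach (assuming `clearml` and `dill` are installed; the artifact name and task id here are just example placeholders):

```python
import dill  # third-party: pip install dill

def dill_serialize(obj):
    # dill can serialize many objects pickle cannot (lambdas, nested classes, ...)
    return dill.dumps(obj)

def dill_deserialize(blob):
    return dill.loads(blob)

# Uploading (sketch - requires a live ClearML task):
# task.upload_artifact(
#     name="Dataset",
#     artifact_object=dataset,
#     serialization_function=dill_serialize,
# )

# Pulling the artifact back on the other side (sketch):
# artifact = Task.get_task(task_id="<task-id>").artifacts["Dataset"]
# dataset = artifact.get(deserialization_function=dill_deserialize)
```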
Hi @<1610083503607648256:profile|DiminutiveToad80> , I'd suggest using the Datasets feature. You can, of course, also upload it as an artifact.
Where are you trying to upload it? Can you provide the full log? Also, a code snippet would help.
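For reference, a minimal sketch of the Datasets route (the function name, dataset name, and paths here are hypothetical; `clearml` is imported lazily so the sketch can be defined without a server connection):

```python
def upload_as_clearml_dataset(folder, name, project):
    """Upload a local folder as a versioned ClearML Dataset (sketch)."""
    from clearml import Dataset  # lazy import; requires `pip install clearml`

    ds = Dataset.create(dataset_name=name, dataset_project=project)
    ds.add_files(path=folder)   # register the files
    ds.upload()                 # push them to storage
    ds.finalize()               # lock this version
    return ds.id

# Consumers can then fetch a local copy by id:
# Dataset.get(dataset_id="<dataset-id>").get_local_copy()
```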
dataset = fo.Dataset.from_dir(
    labels_path=labels_path,
    dataset_type=fo.types.COCODetectionDataset,
    label_field="ground_truth",
    use_polylines=True,
)

task.upload_artifact(
    name="Dataset",
    artifact_object=dataset,
)
File "/opt/conda/envs/bumlo/lib/python3.10/site-packages/clearml/binding/artifacts.py", line 745, in upload_artifact
pickle.dump(artifact_object, f)
_pickle.PicklingError: Can't pickle <class 'mongoengine.base.metaclasses.samples.6627e5ecc60879fe5e49cee6'>: attribute lookup samples.6627e5ecc60879fe5e49cee6 on mongoengine.base.metaclasses failed
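The error above means pickle hit a dynamically generated mongoengine class inside the FiftyOne dataset object, which pickle cannot reconstruct. Besides a custom `serialization_function`, another workaround is to export the dataset to disk first and upload the folder (ClearML packs a folder artifact as a zip). A sketch, with `upload_exported_dataset` as a hypothetical helper:

```python
def upload_exported_dataset(task, dataset, export_dir, dataset_type):
    # Write the FiftyOne dataset to disk in a portable format,
    # then upload the folder instead of the in-memory object.
    dataset.export(export_dir=export_dir, dataset_type=dataset_type)
    task.upload_artifact(name="Dataset", artifact_object=export_dir)

# Usage sketch:
# upload_exported_dataset(task, dataset, "/tmp/coco_export",
#                         fo.types.COCODetectionDataset)
```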
@<1523701070390366208:profile|CostlyOstrich36>