Hi @<1547028031053238272:profile|MassiveGoldfish6> , I think you can disable the automatic logging of Lightning artifacts using the auto_connect_frameworks parameter of the component.
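Something along these lines (untested sketch; it assumes the clearml version you are on lets PipelineDecorator.component take auto_connect_frameworks with the same semantics as Task.init, and the step body is just a placeholder):
```python
from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes

@PipelineDecorator.component(
    task_type=TaskTypes.data_processing,
    cache=False,
    # turn off the PyTorch/Lightning framework binding for this step, so
    # checkpoints / tensors created here are not uploaded automatically
    auto_connect_frameworks={"pytorch": False},
)
def data_prepping(dataset_id: str):
    ...
```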
Hello,
I am running into an issue with ClearML pipelines
I have a data_prepping step which contains a LightningDataModule. In it, I load the data and prep it. My function then returns an initialized datamodule, which I give to the training function. The step is decorated with PipelineDecorator.component(task_type=TaskTypes.data_processing, cache=False). When I am done training, the pipeline saves my entire dataset (64 GB) as an artifact and I am not sure why. Would you happen to know what I am doing wrong? Would you have an example of how the pipeline decorator is used with a PyTorch Lightning ML pipeline?
I should add that I already have a ClearML dataset, so my Lightning datamodule pulls the data and formats it. Since the formatting is deterministic, there is no need for the data to be pushed again.
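Inside the datamodule it is basically just this (sketch; class name and attributes are placeholders):
```python
import pytorch_lightning as pl
from clearml import Dataset

class PreppedDataModule(pl.LightningDataModule):  # placeholder name
    def __init__(self, dataset_id: str):
        super().__init__()
        self.dataset_id = dataset_id
        self.data_dir = None

    def prepare_data(self):
        # pull a local copy of the dataset already registered in ClearML;
        # the deterministic formatting runs on this local copy, so nothing
        # should need to be uploaded back
        self.data_dir = Dataset.get(dataset_id=self.dataset_id).get_local_copy()
```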