Let me know if this still doesn't work, and I'll try to reproduce your issue
I'm running 1.7.0 (the latest Docker image available).
Your example did work for me, but I'm gonna try the flush() method now
Hey GrotesqueDog77
A few things: first, you can call _logger.flush(), which should solve the issue you're seeing (we are working on adding auto-flushing when tasks end).
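For reference, a minimal sketch of where the flush() call would go, using the same report_table step from your snippet:
` def process_data(inputs):
    import pandas as pd
    from clearml import PipelineController
    _logger = PipelineController.get_logger()
    df = pd.DataFrame(inputs)
    _logger.report_table('Awesome', 'Details', table_plot=df)
    _logger.flush()  # push any pending reports before the step process exits `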
Second, I ran this code and it works for me without a sleep; does it also work for you?
` from clearml import PipelineController

def process_data(inputs):
    import pandas as pd
    from clearml import PipelineController
    data = {'Name': ['Tom', 'nick', 'krish', 'jack'],
            'Age': [20, 21, 19, 18]}
    _logger = PipelineController.get_logger()
    df = pd.DataFrame(data)
    _logger.report_table('Awesome', 'Details', table_plot=df)

pipeline = PipelineController(name='erez', project='erez', version='0.1')
pipeline.add_function_step(name='process_data', function=process_data,
                           cache_executed_step=True)
pipeline.start_locally(run_pipeline_steps_locally=True) `
What SDK version are you using? I'm using V1.7.1. I also didn't pass the data as input, so that might have an effect, but I'd be happy if you can give it a try.
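If you're not sure which SDK version is installed, a quick way to check from Python:
` import clearml
print(clearml.__version__)  # prints the installed clearml SDK version `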
CostlyOstrich36 do you maybe have an idea why this code might not work for me?
When I add a sleep to process_data, it works if there was enough time to upload the data:
` def process_data(inputs):
    import time
    import pandas as pd
    from clearml import PipelineController
    _logger = PipelineController.get_logger()
    df = pd.DataFrame(inputs)
    _logger.report_table('Awesome', 'Details', table_plot=df)
    time.sleep(10)  # give the reporter time to upload before the step exits `
Hi CostlyOstrich36
Here is the code example that does not work for me:
` def process_data(inputs):
    import pandas as pd
    from clearml import PipelineController
    _logger = PipelineController.get_logger()
    df = pd.DataFrame(inputs)
    _logger.report_table('Awesome', 'Details', table_plot=df)

pipeline = PipelineController(name='best_pipeline', project='test')
pipeline.add_function_step(name='process_data', function=process_data,
                           function_kwargs=dict(inputs=some_data),
                           cache_executed_step=True)
pipeline.add_function_step(name='next', function=next,
                           function_kwargs=dict(something="${process_data.output}"))
pipeline.start_locally() `
Hi GrotesqueDog77 ,
Can you please add a small code snippet that reproduces this behavior? Is there a reason you're not reporting this from within the task itself or to the controller?
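For example, reporting from within the step's own task would look something like this (just a sketch, assuming the step runs as its own task so Task.current_task() resolves to it):
` def process_data(inputs):
    import pandas as pd
    from clearml import Task
    df = pd.DataFrame(inputs)
    # report through the step's own task instead of going through the controller
    Task.current_task().get_logger().report_table('Awesome', 'Details', table_plot=df) `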