CostlyOstrich36 TensorFlow reports it, ClearML captures it, and I get it with that function.
If I'm not mistaken, Task.get_last_iteration()
https://clear.ml/docs/latest/docs/references/sdk/task#get_last_iteration
returns the last iteration that was reported. However, something has to report that iteration: either you report it manually yourself in the script, OR something else like TensorFlow/TensorBoard does the reporting and ClearML captures it.
Does it make sense?
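For example, here is a rough sketch of the second route (the project/task names, log directory, and values below are just placeholders): ClearML hooks tf.summary when Task.init() is called, so every step you write to TensorBoard becomes an iteration the task knows about.
import tensorflow as tf
from clearml import Task

# placeholder project/task names, just for illustration
task = Task.init(project_name='examples', task_name='tensorboard scalar capture')

# anything written through tf.summary is picked up by ClearML automatically,
# and each `step` value becomes a reported iteration
writer = tf.summary.create_file_writer('./tb_logs')  # placeholder log directory
with writer.as_default():
    for step in range(100):
        loss = 1.0 / (step + 1)  # dummy value standing in for a real metric
        tf.summary.scalar('loss', loss, step=step)

# once the scalars have been reported, this should return the last step
print(task.get_last_iteration())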
I have nowhere else to get the iteration number from...
CostlyOstrich36 Not sure I understood; the current iteration comes from the function
task.get_last_iteration() ...
Yeah, but how are iterations marked in the script?
CostlyOstrich36 More precisely, my function only calculates the accuracy as I defined it.
I pass the accuracy to logger.report_scalar with:
logger.report_scalar(title='evaluate', series='score', value=my_acc, iteration=task.get_last_iteration())
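Simplified, the relevant part of my script looks roughly like this (my_accuracy below is just a stand-in for my real evaluation function, and the project/task names are placeholders):
from clearml import Task

task = Task.init(project_name='examples', task_name='custom accuracy')  # placeholder names

def my_accuracy():
    # stand-in for my real evaluation logic
    return 0.9

logger = task.get_logger()
my_acc = my_accuracy()
logger.report_scalar(title='evaluate', series='score', value=my_acc, iteration=task.get_last_iteration())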
CostlyOstrich36 I get the last iteration with task.get_last_iteration()
I want to report on each iteration...
Is your function taking into account iterations? How are iterations moved along? Do you attempt this scalar report on every iteration or only once in the script?
CostlyOstrich36 When I do
logger = task.get_logger()
logger.report_scalar(title='evaluate', series='score', value=5, iteration=task.get_last_iteration())
train(model_dir=trained_model_dst, pipeline_config_path=pipeline_config_path, save_checkpoints_steps=args.checkpoints)
It only captures the first iteration...
Hmmm I think it should work, give it a try 🙂
Or do I have to dive into the code of the train function and add the reporting there?
logger = task.get_logger()
train(model_dir=trained_model_dst, pipeline_config_path=pipeline_config_path, save_checkpoints_steps=args.checkpoints)
logger.report_scalar(title='evaluate', series='score', value=5, iteration=task.get_last_iteration())
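What I'm trying to avoid is putting the reporting inside the training code itself, i.e. something like this rough sketch (assuming I would evaluate once per saved checkpoint; my_accuracy and total_steps are made up here):
def my_accuracy(step):
    # stand-in for my real evaluation function
    return min(1.0, step / 10000.0)

total_steps = 10000  # made-up total number of training steps
logger = task.get_logger()
for step in range(0, total_steps + 1, args.checkpoints):
    # one scalar point per checkpoint step, each with its own iteration number
    logger.report_scalar(title='evaluate', series='score', value=my_accuracy(step), iteration=step)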
Yes, that's exactly what I do, but I'm trying to figure out whether I can put the line of code
logger.report_scalar(title='evaluate', series='score', value=5, iteration=task.get_last_iteration())
anywhere in the code?
Does that line of code open up another process in parallel to the training?
To report scalars manually (due to your custom function) you can use the following:
https://clear.ml/docs/latest/docs/references/sdk/logger#report_scalar
You also have a nice usage example here 🙂
https://github.com/allegroai/clearml/blob/master/examples/reporting/scalar_reporting.py
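Condensed, that example boils down to something like this (the values below are dummies):
from clearml import Task

task = Task.init(project_name='examples', task_name='scalar reporting')  # placeholder names
logger = task.get_logger()

# one point per iteration; the iteration argument sets the x-axis position of each point
for iteration in range(100):
    logger.report_scalar(title='evaluate', series='score', value=iteration / 100.0, iteration=iteration)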
CostlyOstrich36 I have my own function that gives an estimate of performance, and I want to display it on the graph for each iteration.
And I am using TensorFlow.
CostlyOstrich36 Having the reported TensorFlow scalars show up on ClearML
Do you mean reporting scalars with TensorFlow, OR having the reported TensorFlow scalars show up on ClearML?