Like, if you see in the image above, my project name is abcd18 and under it there are experiments Experiment1, Experiment2, etc.
and that function creates Tasks and logs them
but this gives the results in the same graph
logger.report_scalar(title="loss", series="train", iteration=0, value=100)
logger.report_scalar(title="loss", series="test", iteration=0, value=200)
I mean all 100 experiments in one project
So, for example, if a validation loss appears, there will be three sub-series under the one main title "loss"
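A minimal sketch of that grouping (plain Python apart from the ClearML-style `report_scalar` signature; the helper name is made up): every call that shares `title="loss"` lands in the same plot, with each `series` drawn as its own curve.

```python
def report_losses(logger, iteration, train, test, validation):
    # All three calls share title="loss", so ClearML groups them under
    # one main "loss" plot, with one sub-series (curve) per series name.
    logger.report_scalar(title="loss", series="train", iteration=iteration, value=train)
    logger.report_scalar(title="loss", series="test", iteration=iteration, value=test)
    logger.report_scalar(title="loss", series="validation", iteration=iteration, value=validation)
```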
I have to create a main task, for example named "main"
Create one experiment (I guess in the scheduler)
task = Task.init('test', 'one big experiment')
Then make sure the scheduler creates the "main" process as a subprocess (basically the default behavior)
Then the subprocess can call Task.init and it will get the scheduler's Task (i.e. it will not create a new task). Just make sure they all call Task.init with the same task name and the same project name.
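As a sketch of that flow (project/task names taken from the `Task.init('test', 'one big experiment')` snippet above; actually running it needs a configured ClearML setup, so the import is kept inside the worker):

```python
from multiprocessing import Process

def worker():
    # Same project/task name as the scheduler process, so Task.init
    # attaches to the existing (parent) task instead of creating a new one.
    from clearml import Task  # requires clearml + a reachable server
    task = Task.init(project_name='test', task_name='one big experiment')
    task.get_logger().report_scalar(title="loss", series="train", iteration=0, value=100)

if __name__ == "__main__":
    p = Process(target=worker)
    p.start()
    p.join()
```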
from multiprocessing import Pool
from clearml import Task

def combined(path, exp_name, project_name):
    # pass the variables, not string literals, and set the project too
    temp = Task.create(project_name=project_name, task_name=exp_name)
    logger = temp.get_logger()
    logger.report_scalar(title="loss", series="train", iteration=0, value=100)

def main():
    task = Task.init(project_name="test", task_name="main")
    pool = Pool()
    # temp_df is defined elsewhere
    [pool.apply_async(combined, args=(row['Path'], row['exp_name'], row['project_name'])) for index, row in temp_df.iterrows()]
    pool.close()
    pool.join()
from apscheduler.schedulers.blocking import BlockingScheduler

scheduler = BlockingScheduler()
scheduler.add_job(main, 'interval', seconds=60, max_instances=3)
scheduler.start()
This code gives me the graph that I displayed above
so, it will create a task when I run it the first time
and it should log it into the same task and same project
each subprocess logs one experiment as a task
If you want each "main" process as a single experiment, just don't call Task.init in the scheduler
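In other words, something like this sketch (the task name is illustrative, and `reuse_last_task_id=False` is one way to force a fresh task per run; the imports stay inside the functions since both clearml and apscheduler need to be installed):

```python
def main():
    # Task.init is called here, inside the scheduled job, NOT in the
    # scheduler process, so every 60-second run becomes its own experiment.
    from clearml import Task  # requires clearml + a reachable server
    task = Task.init(project_name='test', task_name='scheduled run',
                     reuse_last_task_id=False)
    task.get_logger().report_scalar(title="loss", series="train", iteration=0, value=100)

def run_scheduler():
    from apscheduler.schedulers.blocking import BlockingScheduler
    scheduler = BlockingScheduler()
    scheduler.add_job(main, 'interval', seconds=60, max_instances=3)
    scheduler.start()  # blocks until interrupted
```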
Like here in the sidebar I am getting three different plots named loss, train_loss and test_loss
Okay, thanks @<1523701205467926528:profile|AgitatedDove14> for the help.
And do you want all of them to log into the same experiment? Or do you want an experiment per 60 sec (i.e. like the scheduler)?
Yes, but I want two graphs titled train loss and test loss, and they should be under the main category "loss"
Hi @<1523701205467926528:profile|AgitatedDove14> , I wanted to ask you something. Is it possible that we can talk over voice somewhere so that I can explain my problem better?
Are you using TensorBoard, or do you want to log directly to Trains?
main will initialize the parent task, and then my multiprocessing occurs, which calls the combined function with project_name and exp_name as parameters