
AgitatedDove14, this is a great tool for visualizing all your experiments. I wanted to know: when I am logging scalar plots with the titles "train loss" and "test loss", they are getting displayed as "train loss" and "test loss" in the Scalars tab.
I wanted the title to be "loss", and under that I should get these two different graphs, "train loss" and "test loss". Is this possible?
[image]

  
  
Posted 4 years ago

Answers 68


You can do:
from clearml import Task

task = Task.get_task(task_id='uuid_of_experiment')
task.get_logger().report_scalar(...)

Now the only question is who will create the initial Task, so that the others can report to it. Do you have like a "master" process ?
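A minimal sketch of that pattern, assuming clearml is installed and a server is configured; the worker function and its argument names are hypothetical, not part of the original answer:

```python
def worker(task_id, iteration, train_loss, test_loss):
    # Hypothetical worker: attach to the experiment the "master" process
    # already created, identified by its task id, and report scalars to it.
    from clearml import Task  # assumes clearml is installed and configured

    task = Task.get_task(task_id=task_id)
    logger = task.get_logger()
    # Same title, different series: both curves land on one "loss" graph.
    logger.report_scalar(title="loss", series="train", iteration=iteration, value=train_loss)
    logger.report_scalar(title="loss", series="test", iteration=iteration, value=test_loss)
```

The "master" process would create the Task once (e.g. via Task.init) and hand its task.id to each worker.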

  
  
Posted 4 years ago

So, like, if validation loss appears, then there will be three sub-tags under the one main tag, "loss"?

  
  
Posted 4 years ago

I have one more question.

  
  
Posted 4 years ago

And you want all of them to log into the same experiment? Or do you want an experiment per 60-sec run (i.e. like the scheduler)?

  
  
Posted 4 years ago

I have 100 experiments, and I have to log them and update those experiments every 5 minutes.

  
  
Posted 4 years ago

Then if there are 10 experiments, I have to call Task.create() for those 10 experiments?

  
  
Posted 4 years ago

like in the above picture

  
  
Posted 4 years ago

Just so I understand:
the scheduler executes main every 60 sec,
main spins up X sub-processes,
and each subprocess needs to report scalars?

  
  
Posted 4 years ago

Like, if you see in the above image, my project name is abcd18, and under that there are the experiments Experiment1, Experiment2, etc.

  
  
Posted 4 years ago

So I want "loss" to be my main title, and I want the two different graphs of train and test loss under that "loss" title.

  
  
Posted 4 years ago

from multiprocessing import Pool
from apscheduler.schedulers.blocking import BlockingScheduler
from clearml import Task

def combined(path, exp_name, project_name):
    # use the exp_name argument (not the literal string "exp_name")
    temp = Task.create(task_name=exp_name)

    logger = temp.get_logger()
    logger.report_scalar(...)

def main():
    task = Task.init(project_name="test")
    # pool and temp_df are defined elsewhere in the script
    [pool.apply_async(combined, args=(row['Path'], row['exp_name'], row['project_name']))
        for index, row in temp_df.iterrows()]

scheduler = BlockingScheduler()
scheduler.add_job(main, 'interval', seconds=60, max_instances=3)
scheduler.start()

  
  
Posted 4 years ago

logger.report_scalar("loss", "train", iteration=0, value=100)
logger.report_scalar("loss", "test", iteration=0, value=200)

  
  
Posted 4 years ago

Yes, but I want two graphs with the titles "train loss" and "test loss", and they should be under the main category "loss".

  
  
Posted 4 years ago

Like, in the sidebar there should be a title called "loss", and under that there should be two different plots named "train_loss" and "test_loss".

  
  
Posted 4 years ago

It's like the main title will be "loss".

  
  
Posted 4 years ago

It will not create another 100 tasks; they will all use the main Task. Think of it as if they "inherit" it from the main process. If the main process never created a task (i.e. no call to Task.init), then they will create their own tasks (i.e. each one will create its own task and you will end up with 100 tasks).
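A minimal sketch of that inheritance behaviour, assuming clearml is installed and configured; the child function and its series names are hypothetical:

```python
from multiprocessing import Process

def child(i):
    # No Task.init here: a subprocess started after the parent called
    # Task.init reuses ("inherits") the parent's Task instead of
    # creating its own.
    from clearml import Task  # assumes clearml is installed and configured
    logger = Task.current_task().get_logger()
    logger.report_scalar(title="loss", series=f"proc-{i}", iteration=0, value=float(i))

# In the parent: call Task.init(...) once, then start the Process
# children; all of their scalars land in that single parent experiment.
```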

  
  
Posted 4 years ago

logger.report_scalar(title="loss", series="train", iteration=0, value=100)
logger.report_scalar(title="loss", series="test", iteration=0, value=200)

  
  
Posted 4 years ago

Before this line, call Task.init

  
  
Posted 4 years ago

and it should log it into the same task and same project

  
  
Posted 4 years ago

My scheduler will be running every 60 seconds and calling the main function.

  
  
Posted 4 years ago

Then if there are 100 experiments, how will it create 100 tasks?

  
  
Posted 4 years ago

and that function creates Tasks and logs them

  
  
Posted 4 years ago

logger.report_scalar("loss-train", "train", iteration=0, value=100)
logger.report_scalar("loss-test", "test", iteration=0, value=200)
Notice that the title of the graph is its unique id, so if you send scalars with the same "title" they will show on the same graph.
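A toy, in-memory model of that grouping rule. This is not the real clearml Logger, just an illustration of how scalars bucket by title and series:

```python
from collections import defaultdict

# graphs maps a title to its series; each series is a list of (iteration, value).
graphs = defaultdict(lambda: defaultdict(list))

def report_scalar(title, series, iteration, value):
    graphs[title][series].append((iteration, value))

# Same title, different series -> one graph with two curves:
report_scalar("loss", "train", iteration=0, value=100)
report_scalar("loss", "test", iteration=0, value=200)

# Different titles -> two separate graphs:
report_scalar("loss-train", "train", iteration=0, value=100)
report_scalar("loss-test", "test", iteration=0, value=200)

print(sorted(graphs["loss"]))   # ['test', 'train'] -- two curves, one graph
print(len(graphs))              # 3 -- graphs: loss, loss-train, loss-test
```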

  
  
Posted 4 years ago

[image]

  
  
Posted 4 years ago

Oh I got it.

  
  
Posted 4 years ago

Can my request be made into a new feature, so that we can tag the same type of graphs under one main tag?

  
  
Posted 4 years ago

and under that there will be three graphs with the titles train, test and loss

  
  
Posted 4 years ago

Can my request be made into a new feature, so that we can tag the same type of graphs under one main tag?

Sure, open a Git Issue :)

  
  
Posted 4 years ago

[image]

  
  
Posted 4 years ago

main will initialize the parent task, and then my multiprocessing occurs, which calls the combined function with project_name and exp_name as parameters.

  
  
Posted 4 years ago
74K Views