Answered
Hi all! I was just wondering what is the best way to log additional information? Right now I'm only printing it to the console, but that's not the most pleasant way to retrieve the information later on. As far as I can see, the 'logger.report_text(...)' method also only outputs the information to the console. I would prefer to have the information displayed on the 'INFO' tab, but I couldn't find a way to do that. I would really like to hear your opinions on that.

Posted one year ago

Answers 6


Hi @<1523701323046850560:profile|OutrageousSheep60> , thanks for your message as well. So far I have actually been using these exact functions, until I noticed the following: when I run a task with these calls, everything works as expected. However, if I do a hyperparameter tuning and change some of the hyperparameters, the additional information that is derived from them (and is not a hyperparameter itself) is not adjusted.

To make my example concrete again: I have three parameters/infos, 'number of layers', 'number of trainable model parameters' and 'number of neurons per layer'. The first two are hyperparameters; the last one is additional information rather than a hyperparameter, because it is calculated from the other two. If I track all three with Task.connect(...) and vary the first two hyperparameters in a hyperparameter tuning, the last parameter remains unchanged, even though it is recalculated and re-connected in the code. Is this the expected behavior, or am I doing something wrong?
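A minimal sketch of one way to keep such a derived value in sync during tuning, assuming the standard ClearML Python SDK (`pip install clearml`). The idea is to connect only the real hyperparameters (which the optimizer overrides), recompute the derived value *after* `connect()`, and report it separately instead of connecting it as a parameter. All project/task names and the `neurons_per_layer` helper below are made up for illustration:

```python
# Illustrative helper: derive neurons per layer from the two real
# hyperparameters. The exact formula here is a placeholder.
def neurons_per_layer(num_layers: int, num_trainable_params: int) -> int:
    """Toy derivation: spread the trainable parameters evenly per layer."""
    return max(1, num_trainable_params // num_layers)


def run_experiment():
    # Imported lazily so the sketch can be read without a ClearML setup.
    from clearml import Task

    task = Task.init(project_name="demo", task_name="hpo-derived-info")

    # Connect ONLY the actual hyperparameters; an HPO optimizer
    # overrides these values on each trial.
    hparams = {"num_layers": 3, "num_trainable_params": 3000}
    task.connect(hparams)

    # Recompute the derived value AFTER connect(), so it reflects the
    # (possibly overridden) hyperparameters, and store it as a user
    # property rather than as a connected parameter.
    derived = neurons_per_layer(
        hparams["num_layers"], hparams["num_trainable_params"]
    )
    task.set_user_properties(neurons_per_layer=str(derived))
```

Whether user properties are the right place for this in your workflow is a judgment call, but they have the advantage of never being treated as tunable inputs.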

Posted one year ago

Hi @<1526371965655322624:profile|NuttyCamel41> , first of all, when running your experiment with an agent, all output, including console logs, is logged in the server (under the "CONSOLE" tab).
You can also use the parameters to store such info, even if it is not a hyperparameter - just give the section a meaningful name and you can easily locate it in the UI.
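A short sketch of that suggestion, assuming the standard ClearML SDK: `Task.connect()` accepts a `name=` argument that places the dict under its own section in the UI. The section name, values, and path below are made up for illustration:

```python
# Extra info that is not a hyperparameter, kept in its own dict.
derived_info = {
    "neurons_per_layer": 256,          # computed, not tuned
    "model_path": "/tmp/models/run1",  # hypothetical local path
}


def register_info():
    # Imported here so the snippet can be read without a ClearML setup.
    from clearml import Task

    task = Task.init(project_name="demo", task_name="named-section-demo")
    # "Derived Info" shows up as its own section next to the
    # hyperparameter sections in the task's configuration view.
    task.connect(derived_info, name="Derived Info")
```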

Posted one year ago

Hi @<1523701087100473344:profile|SuccessfulKoala55> , thanks for your message! 🙂 I am aware that the console is also logged on the server, but I somehow find it not optimal to look for relevant information in the console log and would like to place the information in a more structured way.

Posted one year ago

try
None
or
None

Posted one year ago

Hello CostlyOstrich36 , thanks for your question. At the moment I am training an MLP for a regression problem, and in one case I want to store the number of neurons per layer. Note that in my case it is not a hyperparameter, because I calculate the number of neurons based on the number of layers and the number of model parameters. Another case is that I want to store some local paths where the models are stored, since I currently don't have any remote storage set up for my models.
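For the second case (local model paths), one hedged option with the standard ClearML SDK is to record the paths as user properties or register the files as artifacts; the paths and names below are hypothetical:

```python
# Hypothetical local model files produced by the training run.
local_model_paths = [
    "/data/models/mlp_a.pt",
    "/data/models/mlp_b.pt",
]


def record_paths():
    # Imported lazily; requires a configured ClearML setup to actually run.
    from clearml import Task

    task = Task.init(project_name="demo", task_name="path-logging")
    # Option 1: record the directory as a user property, so it is
    # visible in the task's info/user-properties view.
    task.set_user_properties(model_dir="/data/models")
    # Option 2: register each file as an artifact; with no remote
    # storage configured, the registered path still documents where
    # the model lives locally.
    for path in local_model_paths:
        task.upload_artifact(
            name=path.rsplit("/", 1)[-1], artifact_object=path
        )
```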

Posted one year ago

Hi NuttyCamel41 , what kind of additional information are you looking to report? What is your use case?

Posted one year ago