Hello CostlyOstrich36 , thanks for your question. At the moment I am training an MLP for a regression problem, and in one case I want to store the number of neurons per layer. Note that in my case this is not a hyperparameter, because I calculate the number of neurons from the number of layers and the number of model parameters. Another case is that I want to store some local paths where the models are saved, since I currently don't have any remote storage set up for my models.
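The calculation I mean is roughly this (a minimal sketch; the exact parameter-count formula, the input/output dimensions, and the rounding are placeholders, not my real code):

```python
import math

def neurons_per_layer(num_layers, target_params, d_in=10, d_out=1):
    """Solve for the hidden width h of an equal-width MLP (with biases)
    so the total parameter count is close to target_params.

    assumption: placeholder formula -
      params = d_in*h + h + (num_layers-1)*(h*h + h) + h*d_out + d_out
    """
    L = num_layers
    a = max(L - 1, 0)              # quadratic coefficient (h*h terms)
    b = d_in + 1 + (L - 1) + d_out  # linear coefficient (all h terms)
    c = d_out - target_params
    if a == 0:
        # single hidden layer: the count is linear in h
        return max(round(-c / b), 1)
    # positive root of a*h^2 + b*h + c = 0
    h = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
    return max(round(h), 1)
```

So the width is fully determined by the other two values, which is why I don't treat it as a hyperparameter of its own.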
Hi @<1526371965655322624:profile|NuttyCamel41> , first of all, when running your experiment with an agent, all output, including console logs, is logged on the server (under the "Console" tab).
You can also use the parameters to store such info, even if it is not a hyperparameter - just give the section a meaningful name and you can easily locate it in the UI.
Hi @<1523701087100473344:profile|SuccessfulKoala55> , thanks for your message! 🙂 I am aware that the console is also logged on the server, but I find it suboptimal to dig through the console log for relevant information and would prefer to store it in a more structured way.
Hi @<1523701323046850560:profile|OutrageousSheep60> , thanks for your message as well. So far I have actually been using these exact functions, until I noticed the following: when I run a task with these calls, everything works as expected. However, when I do hyperparameter tuning and change some of the hyperparameters, the additional information that is not a hyperparameter (but depends on them) is not adjusted. To make my concrete example clearer: I have three parameters/pieces of info: 'number of layers', 'number of trainable model parameters' and 'number of neurons per layer'. The first two are hyperparameters; the last one is additional information rather than a hyperparameter, because it is calculated from the other two. If I track all three with Task.connect(...) and vary the first two in a hyperparameter tuning run, the last parameter remains unchanged, even though it is recalculated and re-connected in the code. Is this the expected behavior, or am I doing something wrong?