Hi Guys! I Have Great News, We Finally Fully Implemented Support For Continuing Previously Trained Models

Hi Guys!
I have great news: we finally fully implemented support for continuing previously trained models 🎉
Here is a quick example (this uses torch, but any framework will work here):
Experiment A (stage 1):
from trains import Task
task = Task.init(project_name='demo', task_name='train stage1', output_uri=' ')

some stuff

torch.save(model, 'model.pt')

Experiment B (stage 2):
from trains import Task
task = Task.init(project_name='demo', task_name='train stage2', output_uri=' ')
previous_task = Task.get_task(project_name='demo', task_name='train stage1')
local_model = previous_task.models['output'][-1].get_local_copy()

do some stuff

torch.save(model, 'model2.pt')

Notice that I used output_uri and pointed it to the Trains file server. This makes sure that I automatically have a copy of all the stored models on the file server. It also means that Experiment B can be executed on any machine (e.g. by a Trains Agent): it will download the model from the file server and open a local copy of model.pt.
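The two-stage flow above can be sketched framework-agnostically with only the standard library — a minimal sketch, not the Trains API: pickle stands in for torch.save/torch.load, and the local file path stands in for the file-server copy that get_local_copy() returns (all names here are hypothetical):

```python
import os
import pickle
import tempfile

def stage1_train(checkpoint_path):
    # "model" is just a dict of weights here; any framework object works.
    model = {"weights": [0.1, 0.2], "epoch": 5}
    with open(checkpoint_path, "wb") as f:
        pickle.dump(model, f)          # stands in for torch.save(model, ...)
    return checkpoint_path

def stage2_continue(checkpoint_path):
    with open(checkpoint_path, "rb") as f:
        model = pickle.load(f)         # stands in for torch.load(...)
    model["epoch"] += 5                # continue training where stage 1 stopped
    return model

path = os.path.join(tempfile.mkdtemp(), "model.pt")
stage1_train(path)
model = stage2_continue(path)
print(model["epoch"])  # 10
```

The point of the pattern is that stage 2 only needs the checkpoint file, not the stage 1 process, which is why it can run on a different machine.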
With the next Trains release, the model files will also be cached locally 🙂

Also notice that Experiment B will automatically have the output model of Experiment A as its own input model, so we can trace the model evolution back :)
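As a rough illustration of what that input/output linkage makes possible — a hypothetical data structure, not the Trains API — each stage records which task produced its input model, so the chain can be walked back:

```python
# Hypothetical task registry: each task points at the task that
# produced the model it started from (None for the first stage).
tasks = {
    "train stage2": {"input_model_from": "train stage1"},
    "train stage1": {"input_model_from": None},
}

def lineage(task_name):
    """Walk the input-model links back to the original training task."""
    chain = []
    while task_name is not None:
        chain.append(task_name)
        task_name = tasks[task_name]["input_model_from"]
    return chain

print(lineage("train stage2"))  # ['train stage2', 'train stage1']
```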

Posted 3 years ago