Answered
I Know I Can Run This Manually Step By Step But Wondering If This Can Be Automated As Scheduled Tasks

I know I can run this manually, step by step, but I'm wondering if this can be automated as scheduled tasks.

  
  
Posted 3 years ago

Answers 8


In the above example, is the task ID from a newly generated task, i.e. one created with Task.init()?
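For reference, a minimal sketch of how a task ID can be read off a task created with Task.init(); the project and task names here are just placeholders:

from trains import Task

# Creating (or re-using) a task registers it with the trains-server
# and returns a Task object
task = Task.init(project_name='examples', task_name='step1')

# The server-assigned ID is what you would pass as base_task_id elsewhere
print(task.id)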

  
  
Posted 3 years ago

Ah, if it's a couple of weeks away... I can wait.

  
  
Posted 3 years ago

This looks good... also, do you have any info/ETA on the next controller/service release you mentioned?

  
  
Posted 3 years ago

DAG which gets scheduled at a given interval and

Yes, exactly; that will be part of the next iteration of the controller/service.

An example achieving what I propose would be greatly helpful

Would this help?
from trains.automation import TrainsJob

job = TrainsJob(base_task_id='step1_task_id_here')
job.launch(queue_name='default')
job.wait()

job2 = TrainsJob(base_task_id='step2_task_id_here')
job2.launch(queue_name='default')
job2.wait()
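And if the goal is to skip downstream steps when an upstream step fails, a rough sketch of how a check could be added after each wait(); note that is_failed() is an assumption here, check the TrainsJob code in the linked job.py for the exact status accessor your trains version exposes:

from trains.automation import TrainsJob

# Run step 1 and block until it finishes
job = TrainsJob(base_task_id='step1_task_id_here')
job.launch(queue_name='default')
job.wait()

# Only launch step 2 if step 1 succeeded.
# NOTE: is_failed() is assumed; verify the accessor name against
# trains/automation/job.py for the version you are running.
if job.is_failed():
    raise RuntimeError('step1 failed, not launching downstream steps')

job2 = TrainsJob(base_task_id='step2_task_id_here')
job2.launch(queue_name='default')
job2.wait()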

  
  
Posted 3 years ago

An example achieving what I propose would be greatly helpful.

  
  
Posted 3 years ago

Looking at the above link, it seems I might be able to create it with some boilerplate, since it has the concept of parent and child... but I'm not sure how status checks and dependencies get sorted out.

  
  
Posted 3 years ago

Not so sure.. ideally I was looking for some function calls which enable me to create a sort of DAG which gets scheduled at a given interval, where the DAG has status checks on upstream tasks... so if an upstream task fails, downstream tasks are not run.

  
  
Posted 3 years ago

Hi PompousParrot44
You can check the cleanup service example.
It sleeps for 24 hours then spins up and does its thing.
You can always launch these service tasks on the services queue; its purpose is to run such services on the trains-server as additional CPU services. They will also be registered as service nodes, so you have visibility into which service is running.
To clone a task and wait for its completion, use TrainsJob: https://github.com/allegroai/trains/blob/65a4aa7aa90fc867993cf0d5e36c214e6c044270/trains/automation/job.py#L14

We will have a better example for running your own controller service in a few weeks, but it should not be hard to write.
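A rough sketch of what such a controller could look like, borrowing the 24-hour sleep-loop pattern from the cleanup service; the project/task names and the base task ID below are placeholders:

import time

from trains import Task
from trains.automation import TrainsJob

# Register the controller itself as a task so it shows up on the trains-server;
# enqueue it on the 'services' queue to have it run as a service node.
Task.init(project_name='DevOps', task_name='pipeline controller')

while True:
    # Clone the base task, enqueue the clone, and block until it finishes
    job = TrainsJob(base_task_id='step1_task_id_here')
    job.launch(queue_name='default')
    job.wait()

    # ... launch any downstream steps here, checking upstream status first ...

    # Sleep until the next scheduled run, like the cleanup service does
    time.sleep(24 * 60 * 60)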

Is this what you were looking for?

  
  
Posted 3 years ago