
I am looking for a "hello world" example covering 3 tasks:
data = preprocessData()                   // local processing
model = trainModel(data)                  // cloud processing in a custom container, e.g. using PyTorch or Keras
results = evaluateModel(model, testdata)  // local processing
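A minimal local sketch of the three steps above (the function bodies are hypothetical stand-ins; in a real setup the training step would run remotely in the custom container rather than in-process):

```python
# Toy three-step pipeline mirroring the question's pseudocode.
# preprocess_data, train_model, and evaluate_model are illustrative stubs.

def preprocess_data():
    # local processing: build toy (x, y) pairs with y = 2 * x
    return [(x, 2 * x) for x in range(10)]

def train_model(data):
    # stand-in for the cloud step (e.g. a PyTorch/Keras job in a container);
    # here it just fits y = w * x by averaging the observed ratios
    ratios = [y / x for x, y in data if x != 0]
    return sum(ratios) / len(ratios)

def evaluate_model(model, testdata):
    # local processing: mean absolute error of the toy model
    return sum(abs(model * x - y) for x, y in testdata) / len(testdata)

data = preprocess_data()
model = train_model(data)
results = evaluate_model(model, data)
```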

Posted 3 years ago

Answers 3


LazyLeopard18, you can point the artifact directly at your Azure object storage and have StorageManager download and cache it for you:
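A minimal sketch of that approach, using ClearML's `StorageManager.get_local_copy` (the container and blob path below are hypothetical placeholders):

```python
# Point directly at the Azure object; StorageManager downloads it once and
# serves the cached local copy on subsequent calls.

def get_cached_dataset(remote_url):
    # imported lazily so the sketch can be read without clearml installed
    from clearml import StorageManager
    return StorageManager.get_local_copy(remote_url=remote_url)

dataset_url = "azure://my-container/datasets/train.zip"  # hypothetical path
# local_path = get_cached_dataset(dataset_url)  # uncomment on a configured machine
```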

Posted 3 years ago

I have ~100GB of data that I do not wish to upload to the trains-server. Instead, I would like it copied to the host machine (an Azure container) only at training time.
The data is in Azure Blob Storage and will be copied by a custom script just before training starts.
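Such a pre-training copy script might look like the sketch below, using the `azure-storage-blob` SDK; the connection string, container name, prefix, and destination directory are all hypothetical placeholders:

```python
# Hypothetical pre-training copy script: pull every blob under a prefix down
# to the training host. Requires the azure-storage-blob package.
import os

def copy_blobs_to_host(conn_str, container, prefix, dest_dir):
    # imported lazily so the sketch can be read without the SDK installed
    from azure.storage.blob import ContainerClient
    client = ContainerClient.from_connection_string(conn_str, container)
    for blob in client.list_blobs(name_starts_with=prefix):
        target = os.path.join(dest_dir, blob.name)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        with open(target, "wb") as f:
            f.write(client.download_blob(blob.name).readall())

# copy_blobs_to_host(os.environ["AZURE_CONN"], "my-container", "dataset/", "/mnt/data")
```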

Posted 3 years ago