Answered
What's the best way to upload an artifact to an existing experiment from a storage server (e.g., S3)?

Hey guys,
In your opinion, what is the best way to upload an artifact to an existing experiment from a storage server (e.g., S3)? In the storage module documentation, I saw a function that uploads an object (e.g., a DataFrame) to the storage server, but it is not exactly what I need. I would prefer a direct S3 -> trains-server transfer, without an intermediary.
Thanks!

Posted 3 years ago

Answers 8


TimelyPenguin76 Wonderful!


TimelyPenguin76 Thanks! To make sure I understand correctly, I can now do something like:
task.upload_artifact('my_name', <S3-ADDRESS>)
and when it runs, it will navigate the S3 storage to find the exact file.
Did I get that right?
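To pin down the shape of that call, here is a minimal sketch. The helper names and the bucket path are hypothetical, and whether `upload_artifact` resolves a remote address by itself depends on your SDK version, so treat this as the shape of the call rather than guaranteed behavior:

```python
def is_s3_uri(uri: str) -> bool:
    """Cheap sanity check that a string looks like an S3 object address."""
    return uri.startswith("s3://") and len(uri) > len("s3://")


def register_s3_artifact(task, name: str, s3_uri: str):
    """Hypothetical helper: hand an existing S3 address to upload_artifact.

    `task` is assumed to be a trains Task. This only illustrates the call
    shape discussed in the thread, not a documented SDK contract.
    """
    if not is_s3_uri(s3_uri):
        raise ValueError(f"not an S3 address: {s3_uri!r}")
    task.upload_artifact(name, artifact_object=s3_uri)
```

Usage would be along the lines of `register_s3_artifact(task, 'my_name', 's3://my-bucket/data.csv')` with a real Task object.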


TimelyPenguin76 I am using the Task.create() function and not Task.init(). The question is: can I change the output_uri member through its property setter, like:
task = Task.create()
task.output_uri = my_output_uri


SpotlessFish46 It should work 🙂


Hi SpotlessFish46,
Is the artifact already in S3?
Is the S3 configured as the default files_server in the trains.conf?
You can always use the StorageManager to upload to wherever you like and register the URL on the artifacts.
You can also programmatically change the artifact destination server to S3, then upload the artifact as usual.
What would be the best match for you?
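The "upload with StorageManager, then register the URL" route could look roughly like the sketch below. The function name and the `uploader` hook are illustrative, and the `StorageManager.upload_file` call should be checked against your trains version before relying on it:

```python
def upload_then_register(task, local_path, remote_url, name, uploader=None):
    """Rough sketch: push a local file to S3, then attach its URL as an artifact.

    `uploader` defaults to StorageManager.upload_file from the trains SDK
    (imported lazily, so the sketch can be read without the SDK installed);
    any callable (local_path, remote_url) -> uploaded_url works, which also
    makes the helper easy to test without a server.
    """
    if uploader is None:
        # Import path and signature may differ between trains versions.
        from trains.storage import StorageManager
        uploader = StorageManager.upload_file
    uploaded_url = uploader(local_path, remote_url)
    task.upload_artifact(name, artifact_object=uploaded_url)
    return uploaded_url
```

Injecting the uploader keeps the S3 transfer and the artifact registration decoupled, which is handy if you later swap S3 for another storage backend.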


For this you don't really need output_uri; you can just do it as is.


AgitatedDove14 Yes, the artifact is already in S3. The best match for me is programmatically changing the artifact destination server to S3 and uploading as usual. How do I change the artifact destination of a specific experiment?


SpotlessFish46 You can change the models and artifacts destination per experiment with output_uri ( https://github.com/allegroai/trains/blob/b644ec810060fb3b0bd45ff3bd0bce87f292971b/trains/task.py#L283 ). Can this work for you?
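Assuming output_uri behaves as the property linked above, pointing an experiment at S3 is a plain attribute assignment. A minimal sketch, with a placeholder bucket path:

```python
def point_task_at_s3(task, bucket_uri="s3://my-bucket/artifacts"):
    """Sketch: route an experiment's model/artifact uploads to S3.

    output_uri is assumed to be a property on a trains Task, so it is set
    by assignment rather than a method call; bucket_uri is a placeholder.
    """
    task.output_uri = bucket_uri
    return task.output_uri
```

After this, subsequent upload_artifact calls on that task should target the configured destination rather than the default files_server.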
