Answered

Hey, I'm probably being thick here but I would like to pull some data from a database and write it to a particular bucket in s3 within a task i'm doing. I'm using task.upload_artifact but can't understand where I write the bucket path.

  
  
Posted 3 years ago

Answers 16


Tested with two sub folders, seems to work.
Could you please test with the latest RC:
pip install clearml==0.17.5rc4

  
  
Posted 3 years ago

Also, will it always add:
project/task.4c746400d4334ec7b389dd6232082313/artifacts/filename

  
  
Posted 3 years ago

I have a double sub_folder:
output_uri='s3://my_bucket/sub_folder/sub_sub_folder'

  
  
Posted 3 years ago

File final location:
s3://company-clearml/artifacts/examples/load_artifacts.f0f4d1cd5eb54795b11508dd1e739145/artifacts/filename.csv.gz/filename.csv.gz

Task.init command:
task = Task.init(project_name="examples", task_name="load_artifacts", output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/")

Upload artifact command:
task.upload_artifact("filename.csv.gz", artifact_object=df, delete_after_upload=True, wait_on_upload=False)

  
  
Posted 3 years ago

It should have been:
output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/artifacts/examples/load_artifacts.f0f4d1cd5eb54795b11508dd1e739145/artifacts/filename.csv.gz/filename.csv.gz"

  
  
Posted 3 years ago

Thanks, I actually did this: Task.init(..., upload_uri='s3://my_bucket') and it didn't work.

  
  
Posted 3 years ago

So you are saying it ignored everything after the bucket's "/"?

  
  
Posted 3 years ago

What was the URL you ended up with?

  
  
Posted 3 years ago

It should have ....

  
  
Posted 3 years ago

I lost you, SmallBluewhale13. Is this the Task.init call you used:
task = Task.init(project_name="examples", task_name="load_artifacts", output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/")

  
  
Posted 3 years ago

Hmm, it worked now...
When Task.init is called with output_uri='s3://my_bucket/sub_folder', the artifact lands at:
s3://my_bucket/sub_folder/examples/upload issue.4c746400d4334ec7b389dd6232082313/artifacts/test/test.json
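The destination above follows a fixed pattern: <output_uri>/<project>/<task_name>.<task_id>/artifacts/<artifact_path>. A minimal sketch of that assembly (the artifact_url helper is an illustration of the observed pattern, not ClearML's internal API):

```python
def artifact_url(output_uri, project, task_name, task_id, artifact_path):
    """Illustrate the auto-generated destination pattern seen above:
    <output_uri>/<project>/<task_name>.<task_id>/artifacts/<artifact_path>
    """
    return "/".join([
        output_uri.rstrip("/"),   # trailing slash on the bucket path is tolerated
        project,
        f"{task_name}.{task_id}",
        "artifacts",
        artifact_path,
    ])

print(artifact_url(
    "s3://my_bucket/sub_folder",
    "examples",
    "upload issue",
    "4c746400d4334ec7b389dd6232082313",
    "test/test.json",
))
# → s3://my_bucket/sub_folder/examples/upload issue.4c746400d4334ec7b389dd6232082313/artifacts/test/test.json
```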

  
  
Posted 3 years ago

This is how I wanted it to look:
output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/filename.csv.gz"

  
  
Posted 3 years ago

Hmm, let me check something

  
  
Posted 3 years ago

Okay, let me check something

  
  
Posted 3 years ago

yes

  
  
Posted 3 years ago

SmallBluewhale13 the final path is automatically generated; you only need to specify the bucket itself. By default it will be your "files_server":
https://github.com/allegroai/clearml/blob/c58e8a4c6a1294f8acec6ed9cba81c3b91aa2abd/docs/clearml.conf#L10
You can either change that configuration (which will make sure all uploaded artifacts, including debug images etc., always go there), or you can specify where you want the artifacts and debug images to be uploaded by setting:
https://allegro.ai/clearml/docs/rst/references/clearml_python_ref/logger_module/logger_logger.html#clearml.logger.Logger.set_default_upload_destination
Logger.set_default_upload_destination('s3://my_bucket')
You can also specify it for the entire Task, which will also trigger automatic upload of the models to the same bucket, with:
Task.init(..., upload_uri='s3://my_bucket')
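For the first option (changing the configuration), the relevant setting lives in clearml.conf; a sketch, assuming the api section layout of the linked default config and a hypothetical bucket name:

```
api {
    # default destination for all uploaded artifacts, debug images, etc.
    files_server: "s3://my_bucket"
}
```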

  
  
Posted 3 years ago