Tested with two sub folders, seems to work.
Could you please test with the latest RC: pip install clearml==0.17.5rc4
Also, will it always add project/task.4c746400d4334ec7b389dd6232082313/artifacts/filename ?
I have a double sub_folder: output_uri='s3://my_bucket/sub_folder/sub_sub_folder'
Hmm worked now...
When Task.init is called with output_uri='s3://my_bucket/sub_folder', the file ends up at:
s3://my_bucket/sub_folder/examples/upload issue.4c746400d4334ec7b389dd6232082313/artifacts/test/test.json
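(For reference, a minimal sketch of that call pattern, assuming clearml 0.17.5rc4+ and S3 credentials already configured in clearml.conf; the bucket, sub folder, and artifact object are placeholders:)

from clearml import Task

# output_uri may point inside a sub folder of the bucket; ClearML then appends
# <project>/<task_name>.<task_id>/artifacts/<artifact_name>/<file> underneath it.
task = Task.init(
    project_name="examples",
    task_name="upload issue",
    output_uri="s3://my_bucket/sub_folder",
)

# With the layout reported above, this should land at something like:
# s3://my_bucket/sub_folder/examples/upload issue.<task_id>/artifacts/test/test.json
task.upload_artifact("test", artifact_object={"value": 1})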
I lost you SmallBluewhale13, is this the Task.init call you used: task = Task.init(project_name="examples", task_name="load_artifacts", output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/")
This is how I wanted it to look: output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/filename.csv.gz"
It should have been: output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/artifacts/examples/load_artifacts.f0f4d1cd5eb54795b11508dd1e739145/artifacts/filename.csv.gz/filename.csv.gz"
So you are saying it ignored everything after the bucket's "/"?
File final location: s3://company-clearml/artifacts/examples/load_artifacts.f0f4d1cd5eb54795b11508dd1e739145/artifacts/filename.csv.gz/filename.csv.gz
Task.init command: task = Task.init(project_name="examples", task_name="load_artifacts", output_uri="s3://company-clearml/artifacts/bethan/sales_journeys/")
Upload artifact command: task.upload_artifact("filename.csv.gz", artifact_object=df, delete_after_upload=True, wait_on_upload=False)
Thanks, I actually did this "Task.init(..., upload_uri='s3://my_bucket')" and it didn't work.
SmallBluewhale13, the final path is automatically generated; you only need to specify the bucket itself. By default it will be your "files_server".
https://github.com/allegroai/clearml/blob/c58e8a4c6a1294f8acec6ed9cba81c3b91aa2abd/docs/clearml.conf#L10
You can either change the configuration (which will make sure all uploaded artifacts will always be there, including debug images etc.)
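(A rough sketch of the configuration route, assuming the sdk.development.default_output_uri key in ~/clearml.conf; the bucket is a placeholder:)

# ~/clearml.conf
sdk {
    development {
        # used as the default output_uri for every Task.init on this machine
        default_output_uri: "s3://my_bucket"
    }
}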
Or you can specify where you want the artifacts and debug images to be uploaded by setting:
https://allegro.ai/clearml/docs/rst/references/clearml_python_ref/logger_module/logger_logger.html#clearml.logger.Logger.set_default_upload_destination
Logger.set_default_upload_destination('s3://my_bucket')
You can also specify it for the entire Task, which will trigger automatic upload of the models to the same bucket, with: Task.init(..., upload_uri='s3://my_bucket')
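(Putting the two options together, a minimal sketch using the output_uri parameter name that appears earlier in this thread; the bucket is a placeholder:)

from clearml import Task

# Per-Task destination: models and artifacts are uploaded under this bucket,
# ClearML generates the rest of the path automatically.
task = Task.init(
    project_name="examples",
    task_name="load_artifacts",
    output_uri="s3://my_bucket",
)

# Default destination for debug images / plots reported through the logger.
task.get_logger().set_default_upload_destination("s3://my_bucket")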