DeterminedCrab71 are the artifacts and results sent to the ClearML EC2 server accessible to anyone, or are they completely private and confidential?
Also, it would be great if you could direct me to the relevant documentation, so that all of the artifacts and plots displayed in clearml-server are stored in a particular S3 bucket and all of the information displayed in the server comes from that S3 bucket!
Hello DepravedSheep68 ,
In order to store your info in the S3 bucket you will need two things:
1. Specify the URI where you want to store your data when you initialize the task (see the output_uri parameter of the Task.init function: https://clear.ml/docs/latest/docs/references/sdk/task#taskinit )
2. Specify your S3 credentials in the clearml.conf file (which you already did)
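A minimal sketch of those two steps (bucket name, prefix, and project/task names below are placeholders, not from this thread):

```python
# Sketch only: assumes the credentials for this bucket are already set in
# ~/clearml.conf under sdk.aws.s3 (key/secret), as described in step 2 above.
from clearml import Task

task = Task.init(
    project_name="examples",                     # hypothetical project name
    task_name="store-outputs-in-s3",             # hypothetical task name
    output_uri="s3://my-bucket/clearml-output",  # models/artifacts are uploaded here
)
```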
DepravedSheep68, can you please give a bit more context on the error? Also, can you show an example of your usage?
You can try the following: configure your ~/clearml.conf with sdk.development.default_output_uri: "s3://<YOUR_BUCKET>", and in code simply use Task.init(..., output_uri=True). Let's see if that setup works.
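A minimal sketch of that setup (the project and task names are placeholders, not from this thread):

```python
# Assumes ~/clearml.conf contains the line:
#   sdk.development.default_output_uri: "s3://<YOUR_BUCKET>"
# output_uri=True tells the task to use that configured default destination.
from clearml import Task

task = Task.init(
    project_name="examples",              # hypothetical project name
    task_name="default-output-uri-demo",  # hypothetical task name
    output_uri=True,                      # resolve to sdk.development.default_output_uri
)
```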
If I am doing port forwarding, then I just need to add the S3 bucket name, right? Credentials and everything else related to AWS will be picked up automatically?
CostlyOstrich36 I mean whatever is sent to clearml-server and displayed there, I want to store it in S3!
I have removed the key values just for security purposes.
Which port should I add? The file_server's, i.e. 8081?
DeterminedCrab71 could you help me with this?
Only you will have access to material uploaded to the SaaS files server; it has the same security as the web and API.
you can of course use your S3 bucket to store these resources
https://clear.ml/docs/latest/docs/integrations/storage#configuring-aws-s3
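If you want a quick sanity check that the SDK can reach the bucket with the credentials from clearml.conf, something like this sketch works (bucket and file names are placeholders):

```python
# Uploads a local file to the bucket using the S3 credentials configured in clearml.conf.
from clearml import StorageManager

remote_url = StorageManager.upload_file(
    local_file="model.pkl",                              # hypothetical local file
    remote_url="s3://my-bucket/sanity-check/model.pkl",  # hypothetical destination
)
print(remote_url)  # a printed URL means the upload (and the credentials) worked
```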
SweetBadger76 using output_uri="s3://......" in Task.init gives me the following error: ValueError: Invalid port ''.
Do I need to provide the S3 bucket name, or can I provide a path to a directory under the S3 bucket?
SweetBadger76 thanks for the heads up on that, I will try it.
DepravedSheep68, do you mean when registering your data?
Port forwarding is an unconventional and inconvenient setup. You should configure a public address instead.
https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server_aws_ec2_ami has some info and links.
for a more secure option I'd follow this guide to create a user/password environment https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server_config/#web-login-authentication
and use AWS ELB to securely expose the 3 services through it.
You can then configure your client with the exposed addresses:
web_server: "http://[exposed web address]:8080"
api_server: "http://[exposed API address]:8008"
files_server: "http://[exposed files address]:8081"
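If editing clearml.conf on every client is inconvenient, the same three endpoints can also be passed through environment variables before the SDK starts; a sketch with placeholder addresses:

```python
# Equivalent to the api.web_server / api.api_server / api.files_server settings
# in clearml.conf; the addresses below are placeholders for the exposed endpoints.
import os

os.environ["CLEARML_WEB_HOST"] = "http://exposed-web-address:8080"
os.environ["CLEARML_API_HOST"] = "http://exposed-api-address:8008"
os.environ["CLEARML_FILES_HOST"] = "http://exposed-files-address:8081"

from clearml import Task

task = Task.init(project_name="examples", task_name="remote-server-demo")  # hypothetical names
```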
Another simple option would be our hosted SaaS at https://app.clear.ml
Providing the path to the S3 bucket is not storing the artifacts and information in the S3 bucket!
I guess so. Run your tests and please keep us updated if you still encounter issues 🙂
DepravedSheep68 you could also try adding the port to your URI:
output_uri: "s3://......:port"
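The explicit host:port form mainly matters for non-AWS, S3-compatible endpoints (for example a self-hosted MinIO); a sketch with a placeholder host, port and bucket:

```python
# Sketch: output_uri pointing at an S3-compatible server on a custom port.
# Host, port and bucket are placeholders; the matching credentials for this
# endpoint still need an entry in clearml.conf under sdk.aws.s3.credentials.
from clearml import Task

task = Task.init(
    project_name="examples",                        # hypothetical project name
    task_name="custom-endpoint-demo",               # hypothetical task name
    output_uri="s3://my-minio-host:9000/my-bucket",
)
```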
Or when running something and uploading to S3?
yes, from the config
The UI will pop up a form for the bucket's key and secret.