Answered
Hey There, I Moved The ClearML S3 Bucket Where I Stored All My ClearML Data From One S3 Bucket To Another And Now I Realized That All The Models/Experiments Logged In The ClearML-Server Still Refer To The Old S3 Bucket. Is There A Way To Update All The References To The New One?

Hey there, I moved the ClearML S3 bucket where I stored all my ClearML data from one S3 bucket to another, and now I realized that all the models/experiments logged in the clearml-server still refer to the old S3 bucket. Is there a way to update all the references from the old S3 bucket to the new one?

  
  
Posted 3 years ago

Answers 10


Well, in case you do want to try it out:

For models, assuming your original URLs start with s3://<old-bucket-name>/, you'll need a JS script, something like:
// rewrite stored model URIs from the old bucket to the new one
db.model.find({uri: {$regex: /^s3/}}).forEach(function(e) {
    e.uri = e.uri.replace("s3://<old-bucket-name>/", "s3://<new-bucket-name>/");
    db.model.save(e);
});
And then:
```
# open a shell inside the MongoDB container
sudo docker exec -it clearml-mongo /bin/bash

# paste the script into this file, then save and exit nano
nano script.js

# this should run the script against the backend db
mongo backend script.js
```
Please take care: since I'm not in front of a computer right now, I can't test it, so please back up your data before trying it, and let me know if you encounter any errors...

  
  
Posted 3 years ago

Yes, I would like to update all references to the old bucket, unfortunately… I think I'll simply delete the old S3 bucket, wait for its name to become available again, recreate it on the other AWS account, and move the data there. This way I don't have to mess with the ClearML data - I'm afraid of doing something wrong and losing data.

  
  
Posted 3 years ago

Thanks a lot for the solution SuccessfulKoala55! I'll try that if the "delete old bucket, wait for its name to be available, recreate it with the other AWS account, transfer the data back" approach fails.

  
  
Posted 3 years ago

Hi SuccessfulKoala55, will I be able to update all references to the old S3 bucket using this command?

  
  
Posted 3 years ago

Are you interested in updating both model URLs and debug image URLs?

  
  
Posted 3 years ago

Hi JitteryCoyote63, this can be done using an ES update_by_query command. To do that, you'll have to send the command (using curl, for example) to the ES service (externally, if the 9200 port is open for external access), or by executing bash inside the ES container using `sudo docker exec -it clearml-elasticsearch /bin/bash`.
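For reference, a rough sketch of what that update_by_query call could look like when run with curl from inside the ES container (or against the exposed 9200 port). I can't verify the exact index layout from here - the `events-training_debug_image-*` index pattern and the `url` field name are assumptions, so list your indices first and back up before running anything:

```bash
# List the indices first to confirm the debug image event index names (the pattern below is an assumption)
curl -s 'http://localhost:9200/_cat/indices?v'

# Rewrite the old bucket prefix in the url field of debug image events.
# Index pattern and field name are assumptions -- adjust to match what you see above.
curl -s -X POST 'http://localhost:9200/events-training_debug_image-*/_update_by_query?conflicts=proceed' \
  -H 'Content-Type: application/json' \
  -d '{
    "script": {
      "lang": "painless",
      "params": {
        "old": "s3://<old-bucket-name>/",
        "new": "s3://<new-bucket-name>/"
      },
      "source": "if (ctx._source.url != null && ctx._source.url.startsWith(params.old)) { ctx._source.url = ctx._source.url.replace(params.old, params.new); } else { ctx.op = \"noop\"; }"
    }
  }'
```

The response reports how many documents were updated; events whose URL doesn't start with the old prefix are skipped (noop).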

  
  
Posted 3 years ago

JitteryCoyote63, my previous comment was actually related to the events (debug images etc.) logged by your experiment.

  
  
Posted 3 years ago

For models, you can connect to the MongoDB service (same command, but using the clearml-mongo container), and issue a manual mongo command
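If you prefer to skip the interactive shell, an untested one-liner from the host along the same lines as the script above (same bucket placeholders) could look like this:

```bash
# Run the model URI rewrite non-interactively against the backend db in the clearml-mongo container
sudo docker exec clearml-mongo mongo backend --eval '
  db.model.find({uri: {$regex: /^s3/}}).forEach(function(e) {
    e.uri = e.uri.replace("s3://<old-bucket-name>/", "s3://<new-bucket-name>/");
    db.model.save(e);
  });
'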

  
  
Posted 3 years ago

But please do back up your data before trying it 🙂
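For example, assuming the default installation where the server's compose file lives at /opt/clearml/docker-compose.yml and all server data (MongoDB, Elasticsearch, fileserver) is mounted under /opt/clearml/data - adjust the paths if your setup differs:

```bash
# Stop the server, archive the data directory, then bring the server back up
sudo docker-compose -f /opt/clearml/docker-compose.yml down
sudo tar czf ~/clearml-data-backup-$(date +%F).tar.gz -C /opt/clearml data
sudo docker-compose -f /opt/clearml/docker-compose.yml up -d
```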

  
  
Posted 3 years ago

If you want the appropriate instructions for debug images, just let me know 🙂

  
  
Posted 3 years ago
610 Views
10 Answers
3 years ago
one year ago