Answered
How can I get the "pretrained checkpoint" input model stored on the server when using the TensorFlow Object Detection API?

Hi community!
I use the TensorFlow Object Detection API and am trying to get the "pretrained checkpoint" input model stored on the server. If I set output_uri=True in Task.init, the output models are stored on the server as expected, but the input model still logs my local path as its model_url.
Also, if I log the labels to my task, they are also associated with the input model, which they are not supposed to be in this case. How can I connect them only to the output models?
The only code added to my training script is:
task = Task.init(project_name='MyProject', task_name='tf_efficientdet_d0_test', output_uri=True)
and
task.connect_label_enumeration(dict_from_labelmap)
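
For context, a minimal sketch of passing an explicit destination to output_uri instead of True (the bucket path below is a placeholder, not a value from this experiment; any storage URI ClearML supports, e.g. the fileserver or S3, should work):

from clearml import Task

# output_uri can also be an explicit storage URI instead of True
# ('s3://my-bucket/clearml-models' is a placeholder destination)
task = Task.init(
    project_name='MyProject',
    task_name='tf_efficientdet_d0_test',
    output_uri='s3://my-bucket/clearml-models',
)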

  
  
Posted 2 years ago

Answers 2


Yes, I have tried setting the output_uri to different locations, but the input model_url always keeps the local file path, for example:
file:///C:\Users\JK\Documents\ClearML\models\efficientdet_d0\v0\ckpt-0
while the output model gets a path on the server.
Maybe it is supposed to be this way, but if I clone this experiment, I have no way of reusing this input model when I use a remote agent.
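
For illustration only, one possible workaround (a sketch, assuming the checkpoint has been uploaded somewhere the agent can reach; the weights_url and name below are placeholders) would be to register the checkpoint explicitly as an input model:

from clearml import Task, InputModel

task = Task.init(project_name='MyProject', task_name='tf_efficientdet_d0_test', output_uri=True)

# Register the pretrained checkpoint under a URL the remote agent can access
# (weights_url and name are placeholders)
pretrained = InputModel.import_model(
    weights_url='https://my-server/models/efficientdet_d0/ckpt-0',
    name='efficientdet_d0 pretrained checkpoint',
)

# Connect it to the task so a cloned copy keeps the reference
task.connect(pretrained)

# The agent-side run can then fetch a local copy of the weights
local_ckpt = pretrained.get_weights()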

  
  
Posted 2 years ago

Hi 🙂

Regarding the input issue - try setting sdk.development.default_output_uri in your ~/clearml.conf to wherever you want it uploaded. I'm guessing that when you're running, the original input model is created through the script and then downloaded?
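
For example, the relevant ~/clearml.conf section could look something like this (the destination URI is just a placeholder; any storage target ClearML supports should work):

sdk {
    development {
        # upload output models/artifacts here by default
        default_output_uri: "s3://my-bucket/clearml-models"
    }
}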

Regarding tagging - I think you need to connect tags individually to the output models if you want to connect them only to the outputs.
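
A possible sketch of that, using ClearML's OutputModel class so the enumeration (and any tags) is attached to the output model rather than task-wide; the enumeration and tag values below are placeholders:

from clearml import Task, OutputModel

task = Task.init(project_name='MyProject', task_name='tf_efficientdet_d0_test', output_uri=True)

# Placeholder enumeration standing in for dict_from_labelmap
dict_from_labelmap = {'person': 1, 'car': 2}

# Attach labels (and tags) to the output model only, instead of
# task.connect_label_enumeration(), which applies at the task level
output_model = OutputModel(
    task=task,
    label_enumeration=dict_from_labelmap,
    tags=['efficientdet_d0'],
)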

  
  
Posted 2 years ago