Answered
I tried to get data from a dataset, but the agent always looks at localhost:8081. I changed the host in clearml.conf but got the same error. How can I change the host of the ClearML fileserver?

I tried to get data from a dataset, but the agent always looks at localhost:8081. I changed the host in clearml.conf but got the same error. How can I change the host of the ClearML fileserver?
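For context, the relevant section of clearml.conf looks roughly like this (a minimal sketch assuming the default self-hosted ports; <server-ip> is a placeholder for the real host):

api {
    # where the SDK and agent look for the server components
    web_server: http://<server-ip>:8080
    api_server: http://<server-ip>:8008
    files_server: http://<server-ip>:8081
}

Changing files_server here only affects where new uploads are sent; URLs already registered for existing datasets keep whatever host they were uploaded with.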

  
  
Posted 10 months ago

Answers 8


In the web UI, when you go to the dataset, where do you see it is saved? You can click on 'Full details' in any version of the dataset and see that in the Artifacts section.

  
  
Posted 10 months ago

I am running the ClearML server on GCP, but I didn't expose the ports; instead I SSH into the machine and do port forwarding to localhost. The problem is that localhost on my machine is not the same as localhost inside the docker container on the worker. If I check the dataset, the files are registered under localhost, but it is actually not localhost. I haven't found a solution yet for how to properly set the hostname for the fileserver. Any ideas?

  
  
Posted 10 months ago

YummyGrasshopper29, how did you save the dataset? Where was the data uploaded to?

  
  
Posted 10 months ago

Thanks, I will check!

  
  
Posted 10 months ago

I solved the problem by adding the container argument

--network host 
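One way to pass it, sketched under the assumption that the agent runs tasks in docker mode, is via extra_docker_arguments in the agent section of clearml.conf (verify the key against your clearml-agent version's reference configuration):

agent {
    # extra arguments appended to every docker container the agent starts
    extra_docker_arguments: ["--network", "host"]
}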
  
  
Posted 10 months ago

YummyGrasshopper29, I suggest you take a look here - None

  
  
Posted 10 months ago

I have a GCP instance with the official ClearML image.

from clearml import StorageManager, Dataset

dataset = Dataset.create(
    dataset_project="Project", dataset_name="Dataset_name"
)

files = [
    'file.csv',
    'file1.csv',
]

for file in files:
    # download a local copy of each file and add it to the dataset
    csv_file = StorageManager.get_local_copy(remote_url=file)
    dataset.add_files(path=csv_file)


# Upload dataset to ClearML server (customizable)
dataset.upload()
# commit dataset changes
dataset.finalize()
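
For completeness, the worker reads it back roughly like this (a minimal sketch; get_local_copy() resolves the registered URLs, which is where the localhost:8081 lookups come from):

from clearml import Dataset

# fetch the finalized dataset and download a local copy of its files
ds = Dataset.get(dataset_project="Project", dataset_name="Dataset_name")
local_path = ds.get_local_copy()
print(local_path)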
  
  
Posted 10 months ago

Hi YummyGrasshopper29, ClearML registers uploaded artifacts (including datasets) with the URLs used to upload them, which is why the data is registered under localhost in your case. I think the solution in your case is to use URL substitution.
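
A minimal sketch of what that substitution could look like in clearml.conf, assuming the sdk.storage.path_substitution setting (key names may vary between versions, so check the reference configuration):

sdk {
    storage {
        path_substitution = [
            {
                # rewrite dataset URLs registered under localhost
                # to the host the worker can actually reach
                registered_prefix: "http://localhost:8081"
                local_prefix: "http://<server-ip>:8081"
            }
        ]
    }
}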

  
  
Posted 10 months ago