Hello! I Have A Problem:

Hello! I have a problem: ValueError: Could not download dataset when getting a local copy. The dataset was uploaded to a /mnt share. The odd thing is that when I tried this out, I made a second test dataset with a small subsample of the files, using exactly the same upload script, and I am able to get a local copy of that dataset; the larger dataset is ~64 GB compressed. Anyways, any help or suggested ways of debugging would be much appreciated.

Posted one year ago

Answers 4

Hi DeliciousKoala34, is there also an exceptionally large number of files in that dataset? How do you create the dataset? What happens if you use something like S3, if you have it available?

Posted one year ago

Are datasets of size 90GB considered HyperDatasets?

Posted one year ago

It happened again. get_local_copy() worked as expected, but then when I tried .get_mutable_local_copy(local_data_path, overwrite=True, raise_on_error=False), the contents of every 'data' folder on the share were deleted and the same error was displayed.
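As a possible workaround while debugging (a sketch, not the SDK's own behavior): since get_local_copy() works, one could fetch the read-only cached copy first and then copy it into the writable location manually, so the share itself is never written to. The paths here are placeholders; the source path is whatever get_local_copy() returns:

```python
import shutil
from pathlib import Path

def copy_local_dataset(cache_path: str, target: str, overwrite: bool = False) -> str:
    """Copy a dataset folder (e.g. the read-only cache returned by
    get_local_copy()) into `target` without ever touching the source."""
    dst = Path(target)
    if dst.exists():
        if not overwrite:
            raise FileExistsError(f"{target} already exists")
        shutil.rmtree(dst)  # only the destination is ever removed
    shutil.copytree(cache_path, dst)
    return str(dst)

# Usage sketch (dataset id and target path are placeholders):
# cache = Dataset.get(dataset_id="...").get_local_copy()
# copy_local_dataset(cache, local_data_path, overwrite=True)
```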

Posted one year ago

No, the small test dataset is only 32 MB. I created the dataset using Dataset.create(...), dataset.add_files(...), and then dataset.finalize(). I unfortunately don't have S3. I poked around in the saved data on the share, and it seems that for some reason the folders 'data' through 'data_11' have their contents deleted. What's even weirder is that they were deleted right at the time when I first tried to get a mutable copy today; the other folders are untouched since Monday, when I created the dataset. I will remake the dataset, but any ideas why this happened?
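Before remaking the dataset, one way to narrow this down is to scan the dataset's storage folder on the share and report which chunk folders ('data', 'data_1', ..., 'data_11' in this case) are empty, and compare the result before and after a get_mutable_local_copy() attempt. A minimal sketch, assuming the dataset root path on the share is known (the path below is a placeholder):

```python
from pathlib import Path

def empty_chunk_folders(dataset_root: str) -> list:
    """Return the names of 'data*' chunk folders under dataset_root
    that exist but contain no files (checked recursively)."""
    root = Path(dataset_root)
    empty = []
    for folder in sorted(root.glob("data*")):
        if folder.is_dir() and not any(p.is_file() for p in folder.rglob("*")):
            empty.append(folder.name)
    return empty

# Example (placeholder path on the /mnt share):
# print(empty_chunk_folders("/mnt/share/clearml/datasets/<dataset-id>"))
```

Running this right before and right after the failing call would show whether the chunks were already missing or were removed by the retrieval attempt.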

Posted one year ago