Hi,
I am creating a custom image dataset. Does anyone know how to upload files to a remote URI (S3) without compressing the data?
Judging by the source file dataset.py, I think disabling compression (e.g. compression=False) is not possible:
def upload(
    self,
    show_progress=True,
    verbose=False,
    output_url=None,
    compression=None,
    chunk_size=None,
    max_workers=None,
    retries=3,
):
    # type: (bool, bool, Optional[str], Optional[str], int, Optional[int], int) -> ()
    """
    Start file uploading, the function returns when all files are uploaded.

    :param show_progress: If True, show upload progress bar
    :param verbose: If True, print verbose progress report
    :param output_url: Target storage for the compressed dataset (default: file server)
        Examples: `s3://bucket/data`, `gs://bucket/data`, `azure://bucket/data`, `/mnt/share/data`
    :param compression: Compression algorithm for the Zipped dataset file (default: ZIP_DEFLATED)
    :param chunk_size: Artifact chunk size (MB) for the compressed dataset,
        if not provided (None) use the default chunk size (512mb).
        If -1 is provided, use a single zip artifact for the entire dataset change-set (old behaviour)
    :param max_workers: Numbers of threads to be spawned when zipping and uploading the files.
        If None (default) it will be set to:
        - 1: if the upload destination is a cloud provider ('s3', 'gs', 'azure')
        - number of logical cores: otherwise
    :param int retries: Number of retries before failing to upload each zip. If 0, the upload is not retried.
    :raise: If the upload failed (i.e. at least one zip failed to upload), raise a `ValueError`
    """