Answered

Hey guys, I hope you are all having a nice day. I had to use the Task method .setup_aws_upload(bucket=..., region=...) to work around an "incorrect region specified for bucket" error, which strangely is only triggered when uploading a Dataset; artifact reporting to AWS works fine with the same bucket. But when doing so, I stumble on this strange "Invalid length for parameter Key, value: 0, valid min length: 1" error from botocore. If someone has some pointers on this one, it would be great!

fraud-trainer-api | File "/code/./app/training.py", line 56, in train_model
fraud-trainer-api |   task.setup_aws_upload(
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/clearml/backend_interface/setupuploadmixin.py", line 102, in setup_aws_upload
fraud-trainer-api |   StorageHelper.add_aws_configuration(self._bucket_config, log=self.log)
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/clearml/storage/helper.py", line 489, in add_aws_configuration
fraud-trainer-api |   _Boto3Driver._test_bucket_config(bucket_config, log)  # noqa
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/clearml/storage/helper.py", line 1786, in _test_bucket_config
fraud-trainer-api |   bucket.put_object(Key=filename, Body=six.b(json.dumps(data)))
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/boto3/resources/factory.py", line 580, in do_action
fraud-trainer-api |   response = action(self, *args, **kwargs)
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/boto3/resources/action.py", line 88, in __call__
fraud-trainer-api |   response = getattr(parent.meta.client, operation_name)(*args, **params)
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/botocore/client.py", line 530, in _api_call
fraud-trainer-api |   return self._make_api_call(operation_name, kwargs)
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/botocore/client.py", line 919, in _make_api_call
fraud-trainer-api |   request_dict = self._convert_to_request_dict(
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/botocore/client.py", line 990, in _convert_to_request_dict
fraud-trainer-api |   request_dict = self._serializer.serialize_to_request(
fraud-trainer-api | File "/usr/local/lib/python3.10/site-packages/botocore/validate.py", line 381, in serialize_to_request
fraud-trainer-api |   raise ParamValidationError(report=report.generate_report())
fraud-trainer-api | botocore.exceptions.ParamValidationError: Parameter validation failed:
fraud-trainer-api | Invalid length for parameter Key, value: 0, valid min length: 1

Posted one year ago

Answers 6


And does this always reproduce?

Posted one year ago

@SuccessfulKoala55 I had already bumped boto3 to its latest version, and all the files I added to the dataset were pickle binary files.

Posted one year ago

FierceHamster54, in all reports I've seen concerning a similar error, the problem was trying to copy a folder from/to a bucket without specifying the recursive option (some were with the AWS CLI, but I suspect this is similar). Since setup_aws_upload() attempts to upload a file, I suspect there's some confusion and it somehow tries to upload a folder, but does that as if it were a file. The actual implementation provides boto3 with a Key and a Body, which should be treated as a file (although the key has no suffix) - perhaps upgrading boto will help?
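To illustrate the folder-as-file hypothesis above (plain Python, not ClearML's actual code): if the object key is ever derived from a path that ends in a trailing slash, the key comes out empty, which is exactly what boto3's parameter validation rejects. The key-derivation step here is a hypothetical sketch, not what the library necessarily does.

```python
import os

# Hypothetical sketch: deriving an S3 object key from a local path.
file_key = os.path.basename("some-prefix/model.pkl")
print(repr(file_key))  # 'model.pkl' -- a valid key

# A trailing-slash "folder" path yields an empty basename, which would
# trip "Invalid length for parameter Key, value: 0, valid min length: 1".
folder_key = os.path.basename("some-prefix/subfolder/")
print(repr(folder_key))  # ''
```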

Posted one year ago

Okay, turns out the output_uri in the constructor was overriding the .setup_aws_upload() and not the other way around

Posted one year ago

Turns out the bucket param was expecting the bucket name without the s3:// protocol specification. But now that this issue is fixed, I still have the same incorrect region specified error:

task = Task.init(
    project_name='XXXX',
    task_name=f'Training-{training_uuid}',
    task_type=Task.TaskTypes.training,
    output_uri=f's3://{constants.CLEARML_BUCKET}',
)
task.setup_aws_upload(
    bucket=constants.CLEARML_BUCKET,
    region=constants.AWS_REGION,
)
Despite having overridden the region with the .setup_aws_upload() method
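Since setup_aws_upload() wants the bare bucket name while output_uri takes the full s3:// URI, one way to keep the two consistent is to derive the bucket name from a single configured URI. This is a hypothetical helper (bucket_name_from_uri is not a ClearML API), shown as a sketch:

```python
def bucket_name_from_uri(uri: str) -> str:
    """Strip the s3:// scheme and any key prefix from a bucket URI.

    Hypothetical helper: setup_aws_upload(bucket=...) expects the bare
    bucket name, while Task.init(output_uri=...) takes the full URI.
    """
    return uri.removeprefix("s3://").split("/", 1)[0]

print(bucket_name_from_uri("s3://my-clearml-bucket/some/prefix"))  # my-clearml-bucket
print(bucket_name_from_uri("my-clearml-bucket"))                   # my-clearml-bucket
```

(str.removeprefix requires Python 3.9+, which matches the python3.10 paths in the traceback above.)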

Posted one year ago

ClearML package version used: 1.9.1
ClearML Server: SaaS - Pro Tier

Posted one year ago