Hi, I am trying to update the aws_autoscaler to the latest version on the master branch. I simply changed the commit ID in the experiment and ran it, which gave me the following error:
Traceback (most recent call last):
  File "aws_autoscaler.py", line 297, in <module>
    main()
  File "aws_autoscaler.py", line 84, in main
    configurations.update(json.loads(task.get_configuration_object(name="General") or "{}"))
  File "/data/shared/miniconda3/lib/python3.8/json/__init__.py", line 357, in loads
    return _default_decoder.decode(s)
  File "/data/shared/miniconda3/lib/python3.8/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/data/shared/miniconda3/lib/python3.8/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

It looks like it cannot parse the configuration object General. On a previous version (commit 34c41cfc8c3419e06cd4ac954e4b23034667c4d9), the same configuration worked fine. Is there something to adapt?

Posted 2 years ago

Answers: 3

Can you share the configuration? It seems like a basic JSON parsing error, not something likely to be affected by changes to the autoscaler code.

Posted 2 years ago

Indeed, I actually had an old configuration that was not JSON - I converted it to JSON, and now it works 🙂
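For reference, a minimal sketch of what the fix amounts to. The error in the traceback is exactly what json.loads raises when the stored configuration text is not valid JSON; the config contents below are hypothetical, just to illustrate the failure and the converted form:

```python
import json

# A non-JSON (e.g. HOCON/ini-style) config, as the old configuration was.
# json.loads fails at the very first character, matching the traceback.
bad_config = "general { instance_type = m5.xlarge }"  # hypothetical contents
try:
    json.loads(bad_config)
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)

# The same settings expressed as JSON parse fine:
good_config = '{"general": {"instance_type": "m5.xlarge"}}'
print(json.loads(good_config)["general"]["instance_type"])  # m5.xlarge
```

So if get_configuration_object returns text like the first form, converting the stored configuration to the JSON form resolves the error.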

Posted 2 years ago