Unanswered
Hey, could you help me? I tried updating clearml-server in K8s, with the old and the new ClearML in different namespaces, but after the migration I get the error "Error 101: Inconsistent data encountered in document: document=Output, field=model". How can I fix it?
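
Error 101 here is the apiserver reporting that a stored document does not match the schema it expects, and the message points at the model field of a task's Output sub-document, so a quick first check is to look at what that field actually contains in MongoDB after the migration. The sketch below does that with pymongo; the database name ("backend"), collection name ("task") and field path ("output.model") are assumptions inferred from the error message, and MongoDB is assumed to be port-forwarded to localhost:27017.

from pymongo import MongoClient

# Rough sketch only. Assumptions (not confirmed by the error message): tasks live in
# the "backend" database, collection "task", and the error refers to "output.model";
# the Mongo service is assumed to be reachable on localhost:27017.
client = MongoClient("mongodb://localhost:27017")
tasks = client["backend"]["task"]

# Print the type and value of output.model for a sample of tasks, to spot entries
# whose shape differs from the rest (e.g. a dict vs. a plain model-id string).
for doc in tasks.find({"output.model": {"$exists": True}},
                      {"_id": 1, "name": 1, "output.model": 1}).limit(50):
    value = doc.get("output", {}).get("model")
    print(doc["_id"], type(value).__name__, repr(value), doc.get("name"))

If some documents show a different type or shape than the others (typically ones written by the old server version), those are likely the documents the new apiserver is refusing to load.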


[2021-06-11 15:24:36,885] [9] [ERROR] [clearml.service_repo] Returned 500 for queues.get_next_task in 60007ms, msg=General data error: err=('1 document(s) failed to index.', [{'index': {'_index': 'queue_metrics_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'PkGr-3kBBPcUBw4n5Acx', 'status': 503, 'error': {'type':..., extra_info=[queue_metrics_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[queue_metrics_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [index {[queue_metrics_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][_doc][PkGr-3kBBPcUBw4n5Acx], source[_na_]}]]

[2021-06-11 15:24:39,424] [9] [ERROR] [clearml.__init__] Failed processing worker status report
Traceback (most recent call last):
  File "/opt/clearml/apiserver/bll/workers/__init__.py", line 149, in status_report
    machine_stats=report.machine_stats,
  File "/opt/clearml/apiserver/bll/workers/__init__.py", line 416, in _log_stats_to_es
    es_res = elasticsearch.helpers.bulk(self.es_client, actions)
  File "/usr/local/lib/python3.6/site-packages/elasticsearch/helpers/actions.py", line 396, in bulk
    for ok, item in streaming_bulk(client, actions, *args, **kwargs):
  File "/usr/local/lib/python3.6/site-packages/elasticsearch/helpers/actions.py", line 326, in streaming_bulk
    **kwargs
  File "/usr/local/lib/python3.6/site-packages/elasticsearch/helpers/actions.py", line 246, in _process_bulk_chunk
    for item in gen:
  File "/usr/local/lib/python3.6/site-packages/elasticsearch/helpers/actions.py", line 185, in _process_bulk_chunk_success
    raise BulkIndexError("%i document(s) failed to index." % len(errors), errors)
elasticsearch.helpers.errors.BulkIndexError: ('8 document(s) failed to index.', [
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'P0Gr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'cpu', 'metric': 'cpu_temperature', 'variant': '0', 'value': 43.0}}},
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'QEGr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'cpu', 'metric': 'cpu_usage', 'variant': '0', 'value': 3.334}}},
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'QUGr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'disk', 'metric': 'disk_free_home', 'variant': 'total', 'value': 58.1}}},
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'QkGr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'disk', 'metric': 'disk_write', 'variant': 'total', 'value': 0.009}}},
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'Q0Gr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'memory', 'metric': 'memory_free', 'variant': 'total', 'value': 113848.816}}},
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'REGr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'memory', 'metric': 'memory_used', 'variant': 'total', 'value': 13401.186}}},
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'RUGr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'network', 'metric': 'network_rx', 'variant': 'total', 'value': 0.001}}},
{'index': {'_index': 'worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06', '_type': '_doc', '_id': 'RkGr-3kBBPcUBw4n7gce', 'status': 503, 'error': {'type': 'unavailable_shards_exception', 'reason': '[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0] primary shard is not active Timeout: [1m], request: [BulkShardRequest [[worker_stats_d1bd92a3b039400cbafc60a7a5b1e52b_2021-06][0]] containing [8] requests]'}, 'data': {'timestamp': 1623417920000, 'worker': 'test:bd28:cpu:2', 'company': 'clearml', 'task': None, 'category': 'network', 'metric': 'network_tx', 'variant': 'total', 'value': 0.001}}}])

[2021-06-11 15:24:39,426] [9] [ERROR] [clearml.service_repo] Returned 500 for workers.status_report in 60008ms, msg=General data error (Failed processing worker status report): err=8 document(s) failed to index.
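
The 500s in this log are a separate symptom: every bulk write fails with "primary shard is not active", meaning the Elasticsearch cluster behind the new clearml-server is not healthy (usually status red because shards of the migrated indices cannot be allocated). Below is a minimal sketch for checking that with elasticsearch-py, assuming the Elasticsearch service has been port-forwarded to localhost:9200 (the service name and namespace depend on your deployment).

from elasticsearch import Elasticsearch

# Sketch only: assumes the ClearML Elasticsearch service is reachable on localhost:9200,
# e.g. via kubectl port-forward of the Elasticsearch service in the new namespace.
es = Elasticsearch("http://localhost:9200")

# "red" means at least one primary shard is unassigned, which matches the
# "primary shard is not active" errors above.
print(es.cluster.health())

# Per-shard view of the affected indices; unassigned.reason is usually the quickest
# hint as to why a shard cannot be allocated after the migration.
shards = es.cat.shards(index="worker_stats_*,queue_metrics_*",
                       h="index,shard,prirep,state,unassigned.reason",
                       format="json")
for shard in shards:
    if shard["state"] != "STARTED":
        print(shard["index"], shard["shard"], shard["prirep"],
              shard["state"], shard.get("unassigned.reason"))

Until the primary shards for these indices are active again, the apiserver will keep returning 500 for workers.status_report and queues.get_next_task.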

  
  
Posted 3 years ago
162 Views
0 Answers