Answered
Hi Guys, I Keep Receiving A Timeout Error:

Hi guys,
I keep receiving a timeout error:
Retrying (Retry(total=234, connect=240, read=234, redirect=240, status=240)) after connection broken by 'ReadTimeoutError("HTTPConnectionPool(host='107.21.117.186', port=8008): Read timed out. (read timeout=300.0)")': /v2.13/events.add_batch
I have read a few threads discussing this issue, but I didn't find a solution. Does anyone have a fix for this?
Thanks in advance! 🙂

  
  
Posted 2 years ago

Answers 4


Hi VexedPeacock35,
Can you share more details about what is happening?
What are you trying to do (or, more precisely, when does this error appear)?
What are your package versions (clearml, and the server version if you are self-hosted)?

  
  
Posted 2 years ago

SweetBadger76
It happens when I run a training script and log the results to a ClearML task. I am using:
Task.init(project_name=project_name, task_name=task_name)
The error is not immediate; it occurs from time to time during training, and the graphs in ClearML are incomplete (to say the least).
ClearML version: 1.6.2
Server version: 1.2.0-153

  
  
Posted 2 years ago

Hi VexedPeacock35 , I suspect that Elasticsearch is working too hard and periodically hitting timeouts while recording events. How much memory and CPU is it using? Can you increase the memory allocated to it and see whether that helps?
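As a rough illustration: on a docker-compose based self-hosted server, the Elasticsearch JVM heap is usually set via the ES_JAVA_OPTS environment variable. The service name and values below are assumptions; check your own docker-compose.yml before changing anything:

```yaml
# Hypothetical excerpt from the server's docker-compose.yml —
# service name and heap sizes are assumptions, adjust to your deployment.
services:
  elasticsearch:
    environment:
      # Raise the JVM heap (many default deployments use 2g);
      # keep -Xms and -Xmx equal, and leave host RAM free for the OS cache.
      - ES_JAVA_OPTS=-Xms4g -Xmx4g
```

To see current per-container usage before and after the change, `docker stats --no-stream` prints each container's memory and CPU consumption.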

  
  
Posted 2 years ago

LazyFish41
Thanks for the help. Your message made me realize that the folder where I save the records might already be too large, and that was in fact the problem.
Once I emptied the folder, the error did not appear again.
Thanks for all the help 🙂
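For anyone hitting the same thing, a quick sketch for checking how much disk the server's data folders consume. The default path below is an assumption (a common mount point for self-hosted ClearML servers); substitute your own data directory:

```shell
# DATA_DIR is an assumed default — point it at your actual data mount.
DATA_DIR=${DATA_DIR:-/opt/clearml/data}

# Size of each data subfolder, largest first.
du -sh "$DATA_DIR"/* 2>/dev/null | sort -rh | head

# Free space on the containing filesystem (falls back to / if the
# assumed path does not exist on this machine).
df -h "$DATA_DIR" 2>/dev/null || df -h /
```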

  
  
Posted 2 years ago