
Hi, I’m trying to integrate Logger in my PipelineDecorator but I’m getting this error -

TypeError: cannot pickle '_thread.RLock' object
  
  
Posted 8 months ago

Answers 15


Hi @<1678212417663799296:profile|JitteryOwl13> , can you please add a snippet that reproduces this?

  
  
Posted 8 months ago

ClearML results page: 

step 1
ClearML Monitor: GPU monitoring failed getting GPU reading, switching off GPU monitoring
Traceback (most recent call last):
  File "/var/folders/1c/vz6m8k653j1_xpmpfgynshc00000gn/T/tmp4myzvumt.py", line 61, in <module>
    task.upload_artifact(
  File "/Users/almoghitelman/.pyenv/versions/sunbit-ai-312/lib/python3.12/site-packages/clearml/task.py", line 2341, in upload_artifact
    raise exception_to_raise
  File "/Users/almoghitelman/.pyenv/versions/sunbit-ai-312/lib/python3.12/site-packages/clearml/task.py", line 2322, in upload_artifact
    if self._artifacts_manager.upload_artifact(
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/almoghitelman/.pyenv/versions/sunbit-ai-312/lib/python3.12/site-packages/clearml/binding/artifacts.py", line 749, in upload_artifact
    pickle.dump(artifact_object, f)
TypeError: cannot pickle '_thread.RLock' object
Launching the next 0 steps
Setting pipeline controller Task as failed (due to failed steps) !
pipeline completed
  
  
Posted 8 months ago

The first step inits the Logger and returns its object:

Logger.current_logger()
  
  
Posted 8 months ago

Hi @<1678212417663799296:profile|JitteryOwl13> ! Are you trying to return the logger from a step?

  
  
Posted 8 months ago

Hi @<1523701435869433856:profile|SmugDolphin23> , yes, since I want to use it in the next steps

  
  
Posted 8 months ago

Each step is a separate task, with its own separate logger, so you will not be able to reuse the same logger. Instead, you should get the logger in the step where you want to use it by calling current_logger
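To make that concrete, here is a minimal sketch of per-step loggers with decorator-based steps (the step bodies, names, and reported values are illustrative, not from the original pipeline):

from clearml import Logger, PipelineDecorator

@PipelineDecorator.component(return_values=['data'])
def load_dataset():
    # each step runs as its own Task, so grab that Task's logger here
    logger = Logger.current_logger()
    logger.report_text("loading dataset inside the step")
    return [1, 2, 3]

@PipelineDecorator.component()
def train_model(data):
    # do NOT reuse a logger returned from another step; get a fresh one
    logger = Logger.current_logger()
    logger.report_scalar(title="data", series="len", value=len(data), iteration=0)

@PipelineDecorator.pipeline(name='logger-demo', project='examples', version='0.1')
def main():
    data = load_dataset()
    train_model(data)

if __name__ == '__main__':
    PipelineDecorator.run_locally()  # debug mode; drop this when running remotely
    main()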

  
  
Posted 8 months ago

Thanks for the response. I tried to call Logger.current_logger() in each step, but I got -

RuntimeError: can't create new thread at interpreter shutdown
  
  
Posted 8 months ago

How can we send an object from step 1 to step 2 without creating it again?

  
  
Posted 8 months ago

Moving objects between steps is usually done via the artifacts mechanism. How are you building the pipeline, with decorators?
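For illustration, a rough sketch of the artifacts mechanism between two tasks (the artifact name and the task-id placeholder are hypothetical):

from clearml import Task

# in step 1: upload a pickleable object as an artifact of the step's task
task = Task.current_task()
task.upload_artifact(name='paths', artifact_object={'data': '/tmp/data', 'model': '/tmp/model'})

# in step 2: look up step 1's task and pull the artifact back
step1_task = Task.get_task(task_id='<step-1-task-id>')
paths = step1_task.artifacts['paths'].get()

With PipelineDecorator, returning an object from one step function and passing it into the next does this for you behind the scenes, which is why the returned object must be pickleable.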

  
  
Posted 8 months ago

yes

  
  
Posted 8 months ago

from clearml import PipelineDecorator

@PipelineDecorator.pipeline(name='Pipeline_Trail', project='Pipeline_Trail', version='0.1')
def main():
    _args = params.parse_args()
    # each call below runs as its own pipeline step (task);
    # setup_logger is passed between steps as a pickled artifact
    setup_logger = init_experiment(_args)
    data = load_dataset(_args, setup_logger)
    train_model(_args, data, setup_logger)
  
  
Posted 8 months ago

I’m getting the error even though setup_logger is not calling the ClearML logger, it’s just an internal class we use. Can we pass objects between steps (our own objects)? @<1523701070390366208:profile|CostlyOstrich36> @<1523701435869433856:profile|SmugDolphin23>


ValueError: Could not retrieve a local copy of artifact setup_logger
  
  
Posted 8 months ago

Yes, passing custom objects between steps should be possible. The only condition is that the objects are pickleable. What are you returning exactly from init_experiment?
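As a hypothetical example of the difference, a plain data holder pickles cleanly, while anything holding an open file handle does not:

import pickle
from dataclasses import dataclass

@dataclass
class ExperimentPaths:
    # plain fields only -> safe to return from a step
    data_path: str
    model_path: str

paths = pickle.loads(pickle.dumps(ExperimentPaths('/tmp/data', '/tmp/model')))

class BadLogger:
    def __init__(self, path):
        self.fh = open(path, 'a')  # open handle -> pickle raises TypeError

# pickle.dumps(BadLogger('/tmp/run.log'))  # TypeError: cannot pickle '_io.TextIOWrapper' object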

  
  
Posted 8 months ago

A util class that writes logs to a txt file and holds the paths for data, model, etc. @<1523701435869433856:profile|SmugDolphin23>

  
  
Posted 8 months ago

Your object is likely holding a file descriptor or something similar. The pipeline steps all run in separate processes (they can even run on different machines when running remotely), so you need to make sure the objects you return are pickleable and can be passed between these processes. You can verify that the logger you are passing around is indeed pickleable by calling pickle.dumps on it and then loading it in another run.
The best practice is to have a separate logger for each step.
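A quick way to run that check, plus one hypothetical way to make such a util logger survive the round-trip (dropping the file handle on pickle and reopening it on unpickle):

import pickle

class TxtLogger:
    # hypothetical stand-in for the util class described above
    def __init__(self, log_path):
        self.log_path = log_path
        self._fh = open(log_path, 'a')

    def log(self, msg):
        self._fh.write(msg + '\n')

    def __getstate__(self):
        # drop the unpickleable file handle before pickling
        state = self.__dict__.copy()
        state.pop('_fh', None)
        return state

    def __setstate__(self, state):
        # restore attributes and reopen the file in the new process
        self.__dict__.update(state)
        self._fh = open(self.log_path, 'a')

clone = pickle.loads(pickle.dumps(TxtLogger('/tmp/run.log')))  # pickleability check
clone.log('still works after the round-trip')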

  
  
Posted 8 months ago