Answered

Hello! I have been using ClearML to log my PyTorch experiments, and now I would like to try to also use the ClearML Agent to execute this job remotely. From the docs, I understand that the main flow for this is to execute the task locally, then clone it and queue it (a rough sketch of that flow is included below, after my questions). Since I am running my experiments in a docker container, it looks like this fails: the task created from that run only contains the main script, but none of the relative imports. From a previous issue, I think this is because the container does not contain the git repo, only the code.

  • Is there a way for me to create a task using this container?
  • If I unpack my code to run outside the container, how do I deal with credentials and relative files? For example:
  • git and AWS credentials
  • datasets that were stored outside the git repo
  • PyPI packages that were installed from private git repos
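
For reference, a minimal sketch of the clone-and-enqueue flow described above (the task ID and queue name are placeholders, not taken from this thread):

```python
from clearml import Task

# Hypothetical IDs/names - fetch the task created by the local run,
# clone it, and enqueue the clone for an agent to pick up.
template = Task.get_task(task_id="<task_id_of_local_run>")
cloned = Task.clone(source_task=template, name="remote run")
Task.enqueue(cloned, queue_name="default")
```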
  
  
Posted one year ago

Answers 4


Okay, cool. I'm currently trying to migrate our stack to run from the git repository and to use ClearML Datasets. I am still having an issue with relative imports in Python: we were previously modifying PYTHONPATH in the container, but now I need to modify it manually on the host. I saw there is some documentation about that here, but I'm not sure I understand it correctly, since it does not look like it is getting picked up by the task.

  
  
Posted one year ago

Hi @<1644147961996775424:profile|HurtStarfish47> , to handle this, you'll need to run your code from inside your cloned git repository folder, and the ClearML SDK will auto-detect it and log it on the task. When the ClearML Agent runs your code inside a container, it will clone the repository and install any required dependencies.
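As a rough illustration (the project, task, and queue names below are placeholders), the entry-point script run from inside the cloned repository might look like:

```python
from clearml import Task

# Run this from inside the cloned git repository so the SDK can detect
# the repo, commit, and entry point and record them on the task.
task = Task.init(project_name="examples", task_name="remote training")

# Optionally stop the local process and hand the task over to an agent queue
# (the queue name here is just a placeholder).
task.execute_remotely(queue_name="default", exit_process=True)
```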
AWS credentials can be configured in the agent's clearml.conf file (and it will pass them on to the task).
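For example, a minimal sketch of the relevant section in the agent machine's clearml.conf (key names follow the standard sdk.aws.s3 section; the values are placeholders):

```
sdk {
    aws {
        s3 {
            key: "<AWS_ACCESS_KEY_ID>"
            secret: "<AWS_SECRET_ACCESS_KEY>"
            region: "us-east-1"
        }
    }
}
```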
Any data obtained from outside the code can either be fetched dynamically by your code (i.e. downloaded from somewhere), or you can simply make sure it already exists inside the docker container - of course, you can simply use ClearML Datasets to store the dataset, and then get a local copy of it from your code.
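A minimal sketch of fetching a ClearML Dataset at runtime (the dataset name and project are placeholders):

```python
from clearml import Dataset

# Fetch the registered dataset and get a cached, read-only local copy of it.
dataset = Dataset.get(dataset_name="my-dataset", dataset_project="my-project")
local_path = dataset.get_local_copy()
```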
As for PyPI packages from private git repos, you can either add a direct reference to the repo in a requirements.txt file in your repo (a link complete with credentials), or have them preinstalled on the main system Python inside the docker container.
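For example, a requirements.txt entry pointing at a private repo could look like this (standard pip VCS syntax; the username, token, repo URL, and tag are placeholders):

```
# requirements.txt - private package installed directly from its git repo
my-private-package @ git+https://<username>:<token>@github.com/my-org/my-private-repo.git@v1.2.0
```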

  
  
Posted one year ago

This is just related to logging. In general, when setting up proper remote execution, relative imports from patched Python paths are never stable and are not recommended - I really think using a well-structured git repository (including submodules, if required) is the way to go.

  
  
Posted one year ago

Great, thanks a lot for the help!

  
  
Posted one year ago