
Hello all. I'm experimenting with ClearML and I've run into a strange issue.

I used Task.init on a project; it logs to app.clear.ml, but it doesn't seem to detect all the dependencies. For example, requirements.txt contains the line hydra-colorlog==1.2.0, which is also reflected when I run pip freeze:

...
furl==2.1.3
greenlet==2.0.2
hydra-colorlog==1.2.0
hydra-core==1.3.0
hydra-optuna-sweeper==1.2.0
identify==2.5.19
idna==3.4
...

However, these are the installed packages listed on the app.clear.ml experiment page under Installed Packages:

# Python 3.9.16 (main, Mar  8 2023, 14:00:05)  [GCC 11.2.0]

clearml == 1.9.3
hydra_core == 1.3.0
omegaconf == 2.3.0
packaging == 23.0
pyrootutils == 1.0.4
pytest == 7.2.2
pytorch_lightning == 1.8.3
rich == 13.3.2
setuptools == 65.6.3
torch == 1.13.1
torchmetrics == 0.11.0
torchvision == 0.14.1
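
For reference, the task itself is created with a plain Task.init call at the top of the training script, roughly like this (the project and task names here are placeholders, not my actual ones):

from clearml import Task

# logs the run to app.clear.ml and should auto-detect the imported packages
task = Task.init(project_name="my-project", task_name="train")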

If I attempt to clone the task and enqueue it, the task fails because it doesn't install the package:

Adding venv into cache: /home/natephysics/.clearml/venvs-builds/3.9
Running task id [9685af60892546b9bca94a660790b932]:
[.]$ /home/natephysics/.clearml/venvs-builds/3.9/bin/python -u src/train.py
Summary - installed python packages:
pip:
- aiohttp==3.8.4
- aiosignal==1.3.1
- antlr4-python3-runtime==4.9.3
- async-timeout==4.0.2
- attrs==22.2.0
- certifi==2022.12.7
- charset-normalizer==3.1.0
- clearml==1.9.3
- Cython==0.29.33
- exceptiongroup==1.1.0
- fire==0.5.0
- frozenlist==1.3.3
- fsspec==2023.3.0
- furl==2.1.3
- hydra-core==1.3.0
- idna==3.4
- iniconfig==2.0.0
- jsonschema==4.17.3
- lightning-utilities==0.3.0
- markdown-it-py==2.2.0
- mdurl==0.1.2
- multidict==6.0.4
- numpy==1.24.2
- omegaconf==2.3.0
- orderedmultidict==1.0.1
- packaging==23.0
- pathlib2==2.3.7.post1
- Pillow==9.4.0
- pluggy==1.0.0
- protobuf==3.20.3
- psutil==5.9.4
- Pygments==2.14.0
- PyJWT==2.4.0
- pyparsing==3.0.9
- pyrootutils==1.0.4
- pyrsistent==0.19.3
- pytest==7.2.2
- python-dateutil==2.8.2
- python-dotenv==1.0.0
- pytorch-lightning==1.8.3
- PyYAML==6.0
- requests==2.28.2
- rich==13.3.2
- six==1.16.0
- tensorboardX==2.6
- termcolor==2.2.0
- tomli==2.0.1
- torch==1.13.1+cu117
- torchmetrics==0.11.0
- torchvision==0.14.1+cu117
- tqdm==4.65.0
- typing-extensions==4.5.0
- urllib3==1.26.14
- yarl==1.8.2

Environment setup completed successfully

Starting Task Execution:

In 'hydra/config': Could not find 'hydra/job_logging/colorlog'

Available options in 'hydra/job_logging':
	default
	disabled
	none
	stdout

With that said, there is one point of note: I use conda to manage my Python versions and pip for package management beyond that. The correct Python version is being detected and used, and the package that isn't being installed was installed with pip. Oddly, other pip packages are recognized fine.

  
  
Posted one year ago

Answers 9


How so? Installing a local package should work, what am I missing?

  
  
Posted one year ago

That makes sense. I was confused about what the source was.

  
  
Posted one year ago

Hi @<1545216070686609408:profile|EnthusiasticCow4>
ClearML's auto-detection is based on the packages your code actually imports, not on the requirements.txt or the full contents of your Python environment. That's why some of them are missing.
That said, you can always add them manually:

from clearml import Task

Task.add_requirements("hydra-colorlog")  # optionally pin the version: Task.add_requirements("hydra-colorlog", "1.2.0")
task = Task.init(...)

(note that add_requirements must be called before Task.init)
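
If I remember correctly, add_requirements also accepts a path to a requirements file, in case you'd rather force the whole file instead of individual packages (a sketch, assuming requirements.txt sits in the script's working directory):

from clearml import Task

# force the packages listed in the file instead of relying on import auto-detection
Task.add_requirements("requirements.txt")
task = Task.init(...)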

  
  
Posted one year ago

Okay, so my problem is actually that using a "local" package is not supported; i.e., I need to pip install the code I'm running, and that package must correctly specify its dependencies.

  
  
Posted one year ago

Is there a limit to the search depth for this?

I've got a training script that imports local package files, and those files import other local package files. For example:

train.py

from local_package.callbacks import Callbacks

local_package/callbacks.py

from local_package.analysis import Analysis

local_package/analysis.py

import pandas as pd

The original task only lists the following as installed packages:

clearml == 1.9.1rc0
pytorch_lightning == 1.8.6
torchvision == 0.14.1

I'm also using conda, but in my case conda handles package management as well, and the local package is installed via Poetry.

name: training_env
channels:
  - conda-forge
  - clearml
  - defaults
dependencies:
  - python=3.10
  - black
  - pylint
  - tqdm
  - pyyaml
  - click
  - matplotlib
  - pip
  - poetry
  - pytest
  - scipy
  - scikit-image
  - scikit-learn
  - clearml
  - pip:
      - sox
      - torch
      - torchvision
      - torchaudio
      - pytorch-lightning
      - pandas

I can of course add dependencies to the task manually as shown above, but this feels like a bug? Maybe?
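
Concretely, the workaround in my train.py would look something like this (just a sketch; pandas is the second-level dependency that goes missing for me, and the project/task names are placeholders):

from clearml import Task

# explicitly add the package that only appears through a second-level import
Task.add_requirements("pandas")
task = Task.init(project_name="my-project", task_name="train")  # placeholder names

from local_package.callbacks import Callbacks  # first-level import from the local package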

  
  
Posted one year ago

that using a “local” package is not supported

I see, I think the issue is actually pulling the git repo of the second local package (assuming you add the requirement manually with Task.add_requirements), is that correct?

  
  
Posted one year ago

If I have code that's just in a git repo but is not installed in any way, it runs fine if I invoke the entrypoint in a shell. But ClearML will not find dependencies in secondary imports (as described above) if the agent just clones the repo but does not install the Python package in some way.

  
  
Posted one year ago

@<1523712386849050624:profile|NastyFox63>

Is there a limit to the search depth for this?

Yes, the Task.init auto package listing covers only the first depth (i.e., packages that are directly imported).
The reason is that derivative packages should be resolved by pip when the agent executes the Task remotely.
When the agent installs the Task, the entire Python environment is stored, so it is always fully reproducible.
Make sense?
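
In other words, if local_package were pip-installable and declared its own dependencies, pip would pull in pandas on the agent automatically. A minimal sketch of what that could look like (a hypothetical setup.py, not something from your repo):

from setuptools import setup, find_packages

setup(
    name="local_package",         # hypothetical package name
    version="0.1.0",
    packages=find_packages(),
    install_requires=["pandas"],  # derivative dependency, resolved by pip on the agent
)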

  
  
Posted one year ago

Actually, it's missing imports from the second level too.

  
  
Posted one year ago