Answered

Hey, I'm trying to run some of the example pipelines, but I have the problem that no requirements are installed inside the steps. In the controller task itself all requirements from the repo's requirements.txt are installed, but in the steps only Cython is. Are the steps not supposed to install the repo requirements?
```
[package_manager.force_repo_requirements_txt=true] Skipping requirements, using repository "requirements.txt"
Python virtual environment cache is disabled. To accelerate spin-up time set `agent.venvs_cache.path=~/.clearml/venvs-cache`
created virtual environment CPython3.9.5.final.0-64 in 96ms
  creator CPython3Posix(dest=/home/clearml/.clearml/venvs-builds.3/3.9, clear=False, no_vcs_ignore=False, global=False)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=/home/clearml/.local/share/virtualenv)
    added seed packages: pip==22.3.1, setuptools==65.5.1, wheel==0.37.1
  activators BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator,PythonActivator
Requirement already satisfied: pip in /home/clearml/.clearml/venvs-builds.3/3.9/lib/python3.9/site-packages (22.3.1)
Collecting Cython
  Using cached Cython-0.29.32-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (2.0 MB)
Installing collected packages: Cython
Successfully installed Cython-0.29.32
Adding venv into cache: /home/clearml/.clearml/venvs-builds.3/3.9
Running task id [4767362c833b449c95f8739e2d7127c6]:
[.]$ /home/clearml/.clearml/venvs-builds.3/3.9/bin/python -u /home/clearml/.clearml/venvs-builds.3/3.9/code/step_one.py
Summary - installed python packages:
pip:
- Cython==0.29.32

Environment setup completed successfully
Starting Task Execution:
Traceback (most recent call last):
  File "/home/clearml/.clearml/venvs-builds.3/3.9/code/step_one.py", line 1, in <module>
    from clearml import Task, TaskTypes
ModuleNotFoundError: No module named 'clearml'
Process failed, exit code 1
```
  
  
Posted 2 years ago

Answers 8


I think those might help 🙂

  
  
Posted 2 years ago

I think it was the `force_repo_requirements_txt=true` setting. After I changed it in the tmp file where it was loaded from, I had no more issues.
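
For reference, that flag normally lives in the agent's `clearml.conf` under `agent.package_manager`. A rough sketch of the relevant section (the flag name comes from the log above; the surrounding layout is the usual agent config structure and should be treated as an assumption):

```
# clearml.conf on the agent machine -- sketch only
agent {
    package_manager {
        # when true, the agent installs only the repository requirements.txt
        # and skips the per-step package list, which matches the behaviour
        # described in this thread
        force_repo_requirements_txt: false
    }

    # also mentioned in the log above: enabling the venv cache speeds up spin-up
    venvs_cache {
        path: ~/.clearml/venvs-cache
    }
}
```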

  
  
Posted 2 years ago

How are you building the pipeline?

  
  
Posted 2 years ago

Hi JumpyRabbit71, I think each step has its own requirements
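
A minimal sketch of what that looks like with `add_function_step`, assuming the `packages` argument (the one commented out in the example posted in this thread) takes either a list of pip requirement strings or a path to a requirements file. Note that `clearml` itself has to be listed, since the step's virtual environment only contains what is specified for that step:

```python
# Hedged sketch (not from the thread): per-step requirements via `packages`.
from clearml import PipelineController

def step_one(test_parameter):
    return int(test_parameter) * 2

pipe = PipelineController(
    name="Test Pipeline Controller",
    project="test_pipelines",
    version="1.0.0",
)
pipe.set_default_execution_queue("cpu")

pipe.add_function_step(
    name="step_one",
    function=step_one,
    function_kwargs=dict(test_parameter="42"),
    function_return=["new_number"],
    cache_executed_step=False,
    # each step gets exactly what is listed here -- include clearml itself
    packages=["clearml", "Cython==0.29.32"],
    # packages="./requirements.txt",  # alternative: a requirements file in the repo
)

pipe.start()
```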

  
  
Posted 2 years ago

I already tried them, e.g. running https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_from_functions.py gives me the exact same error again

  
  
Posted 2 years ago

```python
from clearml import PipelineController

pipe = PipelineController(
    name="Test Pipeline Controller",
    project="test_pipelines",
    version="1.0.0"
)

pipe.set_default_execution_queue('cpu')

pipe.add_parameter(
    name='test_parameter',
    description='just a random number for testing',
    default='42'
)

def step_one(test_parameter):
    return int(test_parameter) * 2

def step_two(new_number):
    return int(new_number) * 2

pipe.add_function_step(
    name='step_one',
    function=step_one,
    function_kwargs=dict(test_parameter='${pipeline.test_parameter}'),
    function_return=['new_number'],
    cache_executed_step=False,
    # packages=["./requirements.txt"],
)

pipe.add_function_step(
    name='step_two',
    # parents=['step_one'],  # the pipeline will automatically detect the dependencies based on the kwargs inputs
    function=step_two,
    function_kwargs=dict(new_number='${step_one.new_number}'),
    function_return=['last_number'],
    cache_executed_step=False,
    # packages="requirements/requirements.txt",
)

pipe.start()
```

This is one example. I also tried other variations with tasks and decorators, but I always had the same problem. How should I specify the requirements for each step? I played with the `packages` argument and provided both paths to requirements files and explicit package names, but the step never installed anything.
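
For the decorator variant mentioned above, the per-step packages would look roughly like the sketch below. This is not taken from the thread: the `packages` and `return_values` parameters of `PipelineDecorator.component` and the use of `run_locally()` for local debugging are assumptions based on the decorator-style pipeline examples:

```python
# Sketch of the decorator-based variant: each component declares its own packages.
from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["new_number"], packages=["clearml"])
def step_one(test_parameter):
    return int(test_parameter) * 2

@PipelineDecorator.component(return_values=["last_number"], packages=["clearml"])
def step_two(new_number):
    return int(new_number) * 2

@PipelineDecorator.pipeline(name="Test Pipeline Controller", project="test_pipelines", version="1.0.0")
def run_pipeline(test_parameter="42"):
    new_number = step_one(test_parameter)
    return step_two(new_number)

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # run steps as local subprocesses; drop this when using an agent queue
    run_pipeline()
```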

  
  
Posted 2 years ago

I thought it might be related to https://github.com/allegroai/clearml-agent/issues/124 but I already updated to the most recent version and it didn't help

  
  
Posted 2 years ago