Answered
Hi Team, I am trying to run a pipeline remotely using ClearML Pipeline and I'm encountering some issues. Could anyone please assist me in resolving them?

Hi Team,

I am trying to run a pipeline remotely using ClearML pipeline and I’m encountering some issues. Could anyone please assist me in resolving them?

Issue 1: After executing the code, the pipeline starts on the "queue_remote_start" queue and the pipeline's tasks start on the "queue_remote" queue. However, the dataset creation failed because the Python modules in the current directory could not be found.

Issue 2: I also attempted to use the same queue for both pipe.start and pipe.set_default_execution_queue. However, the pipeline's tasks remained in the pending/queued state and never proceeded to the next step.

To run the pipeline remotely, I have created two different queues and assigned a worker to each using the following commands:

clearml-agent daemon --detached --create-queue --queue queue_remote
clearml-agent daemon --detached --create-queue --queue queue_remote_start

I then executed the following command to run the pipeline remotely:

python3 pipeline.py

The code for the Pipeline from Functions is as follows:

from clearml import Task
from clearml.automation import PipelineController

# Create the PipelineController object
pipe = PipelineController(
    name="pipeline",
    project=project_name,
    version="0.0.2",
    add_pipeline_tags=True,
)

pipe.set_default_execution_queue('queue_remote')

pipe.add_function_step(
    name='step_one',
    function=step_one,
    function_kwargs={
        "train_file": constants.TRAINING_DATASET_PATH,
        "validation_file": constants.VALIDATAION_DATASET_PATH,
        "s3_output_uri": constants.CLEARML_DATASET_OUTPUT_URI,
        "dataset_project": project_name,
        "dataset_name": constants.CLEARML_TASK_NAME,
        "use_dummy_dataset": use_dummy_model_dataset,
    },
    project_name=project_name,
    task_name=create_dataset_task_name,
    task_type=Task.TaskTypes.data_processing,
)

pipe.start(queue="queue_remote_start")

Could anyone please provide a solution on how to successfully run the pipeline remotely? Any help would be greatly appreciated.

  
  
Posted one year ago

Answers 39


@<1523701435869433856:profile|SmugDolphin23> Can you please help me out here

  
  
Posted one year ago

image

  
  
Posted one year ago

@<1523701435869433856:profile|SmugDolphin23> you are right. When I added another worker to the queue, the pipeline was released from the pending status. However, when I click the pipeline in the screenshot, I cannot see the pipeline schema; it shows a "no pipeline to show" message like the one below. Do you have any idea? I should see a box for each step when I click the pipeline, right?
image
image

  
  
Posted one year ago

@<1523701435869433856:profile|SmugDolphin23> I have attached two screenshots: one of the pipeline initialization and one of a task of the pipeline.

The project's directory structure is as follows. pipeline.py contains the code that runs the pipeline and its tasks.

├── Makefile
├── README.md
├── ev_xxxxxx_detection
│   ├── __init__.py
│   ├── __pycache__
│   │   └── __init__.cpython-311.pyc
│   ├── clearml
│   │   ├── __pycache__
│   │   ├── clearml_wrapper.py
│   │   ├── constants.py
│   │   ├── data_loader.py
│   │   ├── ev_trainer.py
│   │   ├── pipeline.py
│   │   └── util.py
├── poetry.lock
├── pyproject.toml

image

  
  
Posted one year ago

Regarding pending pipelines: please make sure a free agent is bound to the queue you wish to run the pipeline in. You can check queue information in the INFO section of the controller (as in the first screenshot). Then, by clicking on the queue, you should see the worker status; there should be at least one worker with a blank "CURRENTLY EXECUTING" entry.
image
image
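The single-queue deadlock can be illustrated with a toy simulation (plain Python, no ClearML required; the names and numbers are illustrative only): the controller occupies the queue's only worker for its whole lifetime while waiting for its steps, which need a free worker in the same queue, so the steps stay pending forever.

```python
def run_queue(num_workers):
    """Toy model of one queue shared by a pipeline controller and its steps."""
    free_workers = num_workers
    free_workers -= 1  # the controller is dequeued and holds a worker until the pipeline ends
    steps_done = []
    for step in ["step_one", "step_two"]:
        if free_workers == 0:
            return steps_done, "pending"  # the step can never be dequeued: deadlock
        free_workers -= 1   # a free worker picks up the step
        steps_done.append(step)
        free_workers += 1   # the step finishes and the worker is released
    return steps_done, "completed"

print(run_queue(1))  # -> ([], 'pending'): one shared queue, one worker
print(run_queue(2))  # -> (['step_one', 'step_two'], 'completed'): a second worker unblocks the steps
```

This is why using separate queues (or adding a second worker to the shared queue, as above) resolves Issue 2.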

  
  
Posted one year ago

@<1523701435869433856:profile|SmugDolphin23> Sure, thank you for the suggestion. I'll add the imports as you mentioned, execute the pipeline, and check the functionality.

Locally, I run it with python3 pipeline.py and use pipe.start_locally(run_pipeline_steps_locally=True) in the pipeline to initialize it, and it works fine.

  
  
Posted one year ago

Can you please screenshot the INFO tab on the pipeline controller task?

  
  
Posted one year ago

@<1523701435869433856:profile|SmugDolphin23> I ran the code in order: step1, step2, then step3, and then ran the pipeline_from_task.py script. I followed the ClearML documentation, so all of the code is taken from the GitHub repo.

  
  
Posted one year ago

Thank you @<1523701435869433856:profile|SmugDolphin23>, it is working now after adding the repo details to each task. It seems we need to specify the repo details in each task so that the worker can pull the code and execute the tasks.
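For reference, a minimal sketch of this fix, assuming a recent clearml version where add_function_step accepts repo details; the repo URL, branch, and helper name build_pipeline are placeholders, not the actual project's values:

```python
def build_pipeline(project_name, step_one, constants):
    # clearml is imported inside the function so the sketch can be read
    # without a configured ClearML server
    from clearml import Task
    from clearml.automation import PipelineController

    pipe = PipelineController(
        name="pipeline",
        project=project_name,
        version="0.0.2",
        add_pipeline_tags=True,
    )
    pipe.set_default_execution_queue("queue_remote")
    pipe.add_function_step(
        name="step_one",
        function=step_one,
        function_kwargs={"train_file": constants.TRAINING_DATASET_PATH},
        task_type=Task.TaskTypes.data_processing,
        # repo details let the agent clone the code before running the step;
        # URL and branch below are placeholders
        repo="https://github.com/your-org/your-repo.git",
        repo_branch="main",
    )
    return pipe
```

With the repo set on each step, the agent clones the repository into the task's working directory, so imports from the project (e.g. the constants module) resolve on the worker.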

  
  
Posted one year ago