Hey, I am new to ClearML and need some help 🙂
I am trying to run a very simple pipeline inside a Docker container. I followed the documentation and created a new queue ("docker-cpu-queue") and an agent. That part seems to work fine: I can run a task inside Docker (checked with docker ps and docker run), and the tasks complete successfully. However, when I run the pipeline, it gets the Running status and then nothing happens. Here is the log from the docker container:
Starting Task Execution:
ClearML results page:
ClearML pipeline page:
2024-05-17 08:54:32,622 - clearml.util - WARNING - 2 task found when searching for `{'project_name': 'pipeline_example', 'task_name': 'Pipeline step 1', 'include_archived': True, 'task_filter': {'status': ['created', 'queued', 'in_progress', 'published', 'stopped', 'completed', 'closed']}}`
2024-05-17 08:54:32,622 - clearml.util - WARNING - Selected task `Pipeline step 1` (id=e758eb0f7f464c30a9bc61d32995f703)
2024-05-17 08:54:32,721 - clearml.util - WARNING - 2 task found when searching for `{'project_name': 'pipeline_example', 'task_name': 'Pipeline step 2', 'include_archived': True, 'task_filter': {'status': ['created', 'queued', 'in_progress', 'published', 'stopped', 'completed', 'closed']}}`
2024-05-17 08:54:32,721 - clearml.util - WARNING - Selected task `Pipeline step 2` (id=b433826ebe3a48eb96a235bd7b7ba6b9)
Launching the next 1 steps
Launching step [task_one]
Launching step: task_one
Parameters:
None
Configurations:
{}
Overrides:
{}
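For reference, I started the agent in Docker mode roughly like this (reproduced from memory, so the exact flags may differ slightly; the queue name is the one above):
# start the agent in docker mode, serving the queue created above
clearml-agent daemon --queue docker-cpu-queue --docker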
Here are my steps:
step1:
from clearml import Task
task = Task.init(project_name="pipeline_example", task_name="Pipeline step 1")
task.execute_remotely(queue_name="docker-cpu-queue")
print('Hello from task one!')
step2:
from clearml import Task
task = Task.init(project_name="pipeline_example", task_name="Pipeline step 2")
task.execute_remotely(queue_name="docker-cpu-queue")
print('Hello from task 2')
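For context, both step tasks already exist in the project because I ran these scripts on their own beforehand (apparently more than once, which would explain the "2 task found" warnings in the log), simply with something like:
# hypothetical file names for the two step scripts shown above
python step1.py
python step2.py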
pipeline_controller:
from clearml import Task
from clearml.automation import PipelineController
def pre_execute_callback_example(a_pipeline, a_node, current_param_override):
    # type (PipelineController, PipelineController.Node, dict) -> bool
    print(
        "Cloning Task id={} with parameters: {}".format(
            a_node.base_task_id, current_param_override
        )
    )
    # if we want to skip this node (and subtree of this node) we return False
    # return True to continue DAG execution
    return True

def post_execute_callback_example(a_pipeline, a_node):
    # type (PipelineController, PipelineController.Node) -> None
    print("Completed Task id={}".format(a_node.executed))
    # if we need the actual executed Task: Task.get_task(task_id=a_node.executed)
    return

# Connecting ClearML with the current pipeline,
# from here on everything is logged automatically
pipe = PipelineController(
    name="Pipeline demo", project="pipeline_example", version="0.1", add_pipeline_tags=False
)
pipe.set_default_execution_queue("docker-cpu-queue")

pipe.add_step(
    name="task_one",
    base_task_project="pipeline_example",
    base_task_name="Pipeline step 1",
)
pipe.add_step(
    name="task_two",
    parents=["task_one"],
    base_task_project="pipeline_example",
    base_task_name="Pipeline step 2",
    pre_execute_callback=pre_execute_callback_example,
    post_execute_callback=post_execute_callback_example,
)

# pipe.start_locally()
# pipe.start_locally(run_pipeline_steps_locally=False)
pipe.start(queue="docker-cpu-queue")
print("done")
Any ideas?