Hi, does anyone know how to clone every step of a pipeline to a project? I'm doing this to save all execution results and compare them with previous versions...
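(One possible approach with the SDK, as a rough sketch only — assuming a recent clearml package where Task.get_task(), Task.clone() and Task.get_project_id() are available; the step task IDs and the target project name below are hypothetical placeholders, not something from the thread:)

from clearml import Task

step_task_ids = ["<step_task_id_1>", "<step_task_id_2>"]   # IDs of the pipeline step tasks
target_project = "pipeline-archive/run-42"                 # hypothetical target project name

for task_id in step_task_ids:
    source = Task.get_task(task_id=task_id)
    # Task.clone() creates a draft copy of the task; `project` expects a project ID,
    # which Task.get_project_id() resolves from the project name (assumption: the
    # target project already exists).
    cloned = Task.clone(
        source_task=source,
        name=source.name,
        project=Task.get_project_id(target_project),
    )
    print(f"cloned {source.name} -> {cloned.id}")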
I set the memory limit in the docker-compose file:
elasticsearch:
  networks:
    - backend
  container_name: clearml-elastic
  environment:
    bootstrap.memory_lock: "true"
    cluster.name: clearml
    cluster.routing.allocation.node_initial_primaries_recoveries: "500"
    cluster.routing.allocation.disk.watermark.low: 500mb
    cluster.routing.allocation.disk.watermark.high: 500mb
    cluster.routing.allocation.disk.watermark.flood_stage: 500mb
    discovery.type: ...
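(For reference, the memory settings themselves would typically sit in this same service definition; a minimal sketch, assuming a standard Elasticsearch compose setup — the 2g heap is just an example value:)
elasticsearch:
  environment:
    ES_JAVA_OPTS: -Xms2g -Xmx2g   # Elasticsearch JVM heap size
  ulimits:
    memlock:                      # pairs with bootstrap.memory_lock: "true"
      soft: -1
      hard: -1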
# Running this command will give you one more agent listening on the default queue:
clearml-agent daemon --queue default --detached
Open one more queue: one queue for the pipeline controller and another for the actual tasks.
One machine can attach to multiple queues.
I use the default queue for the pipeline and the services queue for the tasks, but two agents on the default queue also work.
# To stop the daemon listening on a queue:
clearml-agent daemon --queue default --stop
With two daemons running you will get worker PC:01 for one queue and PC:02 for the other queue.
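(Putting the two-queue setup together, the second daemon is started the same way as the first, just pointed at the other queue; queue names here follow the example assignment above:)
# one agent per queue on the same machine
clearml-agent daemon --queue default --detached     # picks up the pipeline controller
clearml-agent daemon --queue services --detached    # picks up the actual step tasks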