Model says PACKAGE, that means it’s fine right?
The pipeline code itself is pretty standard:
```
pipe = PipelineController(
    default_execution_queue="minerva-default",
    add_pipeline_tags=True,
    target_project=pipelines_project,
)
for step in self.config["steps"]:
    name = self._experiment_name(step)
    pipe.add_step(
        name=name,
        base_task_project=pipelines_project,
        base_task_name=name,
        parents=self._get_parents(step),
        task_overrides...
```
Hey TimelyPenguin76 - I am just using the helm chart and haven't done any setup on top of that; the agentservices is running as-is from the helm chart
Maybe related to running this in a notebook. Calling task.close() finished it as expected
Doing this with one step - https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_controller.py
Also the pipeline ran as per this example - https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_controller.py
This is the command that is running:
```
['docker', 'run', '-t', '-e', 'NVIDIA_VISIBLE_DEVICES=none', '-e', 'CLEARML_WORKER_ID=clearml-services:service:c606029d77784c69a30edfdf4ba291a5', '-e', 'CLEARML_DOCKER_IMAGE=', '-v', '/tmp/.clearml_agent.72r6h9pl.cfg:/root/clearml.conf', '-v', '/root/.clearml/apt-cache:/var/cache/apt/archives', '-v', '/root/.clearml/pip-cache:/root/.cache/pip', '-v', '/root/.clearml/pip-download-cache:/root/.clearml/pip-download-cache', '-v', '/root/.clearml/cache:/clea...
```
I just run the k8s daemon with a simple helm chart and use it with terraform with the helm provider. Nothing much to share as it’s just a basic chart 🙂
Is there a good way to get the project of a task?
Essentially: 1. run a task normally, 2. clone it, 3. edit it to have only those two lines.
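For reference, the clone-and-run flow above can be sketched with clearml's Task API (the project name, task name, and queue name here are placeholder assumptions, not from the thread):

```python
from clearml import Task

# Hypothetical sketch: fetch the task that was run normally,
# clone it, then enqueue the clone for an agent to pick up.
base = Task.get_task(project_name="my_project", task_name="base_task")
cloned = Task.clone(source_task=base, name="base_task (clone)")
Task.enqueue(cloned, queue_name="default")
```

Edits to the cloned task (e.g. stripping it down to just those two lines) would be done in the UI or via task_overrides before enqueueing.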
Question - since this is a task, why is Task.current_task() None?
As in, I am cloning a task and running it, and in that task, without doing any Task.init, I am trying to get the task that is running
It's a task running in the context of a project, but I don't have a way to get the project name
(I need this because I refer to datasets in the same project but without specifying the project name)
How can a task running like this know its own project name?
I get other things from the project like the dataset
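If it helps, a minimal sketch of what this could look like with clearml's API (assuming the agent has initialized the task, so Task.current_task() is no longer None; the "dataset" key is an assumption):

```python
from clearml import Task, Dataset

# Hypothetical sketch: from inside a running (cloned) task,
# resolve the task's own project name, then use it to look up
# a dataset in the same project without hard-coding the name.
task = Task.current_task()
if task is not None:
    project_name = task.get_project_name()
    dataset = Dataset.get(dataset_project=project_name, dataset_name="dataset")
```

This requires a live ClearML server, so it is only a sketch of the lookup pattern, not a verified snippet.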
It might be better suited than execute_remotely for your specific workflow
Exactly
Is there a published package version for these?
Any updates on trigger and schedule docs 🙂
create_task_from_function
I was looking at options to implement this just today, as part of the same remote debugging I was talking about in this thread
On a related note - is it possible to get things like ${stage_data.artifacts.dataset.url} from within a task, rather than passing params in add_step?
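One possible pattern (a sketch, assuming the producing step's task is named "stage_data" and lives in the same project; both names are assumptions from the ${...} reference above):

```python
from clearml import Task

# Hypothetical sketch: instead of resolving
# ${stage_data.artifacts.dataset.url} via add_step parameters,
# look up the producing step's task from inside the current task
# and read the artifact URL directly.
current = Task.current_task()
stage = Task.get_task(
    project_name=current.get_project_name(),
    task_name="stage_data",
)
dataset_url = stage.artifacts["dataset"].url
```

Again this needs a running ClearML server to execute, so treat it as an outline of the Task.get_task + artifacts lookup, not a tested snippet.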