Hi TrickySheep9
could you verify the fix 😉 !pip install git+
This is a SageMaker notebook instance
Yes I think this is the issue
That’s great, will try it out soon (it’s 2.30am here, about to crash 🙂 )
AgitatedDove14 - tried exit(0) from the notebook and it worked
BTW:
TrickySheep9 what's the jupyter version / python version / OS ?
Just a bit of background: execute_remotely will kill the current process (after the Task is synced) and enqueue the Task that was created for remote execution. What seems to fail is actually killing the current process. You can just pass exit_process=False
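For example, something like this (just a minimal sketch, the project/task names and the queue name are placeholders):
```
from clearml import Task

task = Task.init(project_name='examples', task_name='remote run')
# enqueue the Task for remote execution, but keep the current (notebook) process alive
task.execute_remotely(queue_name='default', exit_process=False)
```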
I just called exit(0)
in a notebook and it closed it (the kernel), no exception
PipelineController with 1 task. That 1 task passed but the pipeline says running
Any idea why the Pipeline Controller is Running despite the task passing?
What do you mean by "the task passing"?
This is a SageMaker notebook instance - Python 3.6.13
Doing this with one step - https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_controller.py
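Roughly what I have, adapted from that example (the queue name is a placeholder, project/step names are taken from the example):
```
from clearml import Task, PipelineController

# controller task for the pipeline itself
Task.init(project_name='examples', task_name='pipeline demo',
          task_type=Task.TaskTypes.controller)

pipe = PipelineController(default_execution_queue='default', add_pipeline_tags=False)

# single step, cloned from an existing base task
pipe.add_step(name='stage_data',
              base_task_project='examples',
              base_task_name='pipeline step 1 dataset artifact')

pipe.start()
pipe.wait()
pipe.stop()
```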
You cannot call exit(0) and kill the kernel from the SageMaker notebook
I verified the "exit(0)" error, let me check something
Hi TrickySheep9
Hmm I think you are correct, execute_remotely will not work inside a jupyter notebook because it will not be able to close it.
I was just revising workflows that might be similar, wdyt?
https://clearml.slack.com/archives/CTK20V944/p1620506210463400?thread_ts=1614234125.066600&cid=CTK20V944
The console output in the UI says done
but the pipeline is still “running”
Maybe related to doing it in a notebook. Calling task.close() finished it as expected
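i.e. roughly this, assuming the controller Task was initialized in the same notebook:
```
from clearml import Task

task = Task.current_task()  # the controller's Task created in this notebook
task.close()                # explicitly close the Task so it no longer shows as running
```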
could it be the polling on the Task (can't remember what's the interval), but it will update its state once every X minutes/seconds
Any idea why the Pipeline Controller is Running despite the task passing?
Having a pipeline controller and running it actually seems to work as long as I have them in separate notebooks
Ohh then YES!
the Task will be closed by the process, and since the process is inside Jupyter and the notebook kernel is still running, the Task is still marked as running
Also the pipeline ran as per this example - https://github.com/allegroai/clearml/blob/master/examples/pipeline/pipeline_controller.py